Monthly Archives: March 2009

Cloud Computing: New Stuff or Legacy Revisited?

There’s a lot of information hitting the bitwaves touting the value of cloud computing, and within the IT industry, those who profess positive opinions of an emerging market during a hype cycle seem to get more attention than those with negative or pragmatic opinions on the issue. Thus, the hype cycle reinforces itself and the hype grows larger. With regard to Cloud Computing, I’ve commented before on the two leading opinion groupings–the Gray Hairs and the Newbies. The Gray Hairs look at Cloud Computing as time-sharing redux, and the Newbies look at it as the only way computing will be done within the next 20 years. Ho hum!

Let’s look at what is the same about Cloud Computing:

1. Outsourced hosting – IaaS, HaaS, etc. are just fancy new ways to talk about a model that has been employed for many years now. Effectively, the Cloud provides a way to leverage shared computing resources for an economic benefit.
2. Hosted applications – SaaS is just a new way to talk about the application service provider (ASP) model, which has been in operation since the 1960’s.
3. Disaster Recovery – Even those who have built their own data centers often contract with companies that maintain the necessary hardware and operating systems to enable continuance of operations in the event of a disaster.

Now, let’s look at what’s new about Cloud Computing:

1. Scale – due to the massive reduction in the costs of memory, storage and CPU, we can now offer unprecedented linear scaling.
2. New distributed computing architectures – projects like MapReduce and Hadoop leverage our ability to scale linearly and turn it into an ability to scale exponentially. Linear scale still suffers from the lack of a 1:1 mapping between storage and memory; thus, the time it takes to read a gigabyte of information limits many of the scaling benefits. Architectures like Hadoop and MapReduce parallelize access to the data across multiple processors and multiple data stores, dramatically reducing this limitation.
3. Virtualization – the ability to leverage a single machine architecture to host multiple operating systems changes DR costs significantly. In many cases, open systems DR costs can be cut to a third of what they were prior to mainstream, inexpensive virtualization and the development of the hypervisor.
4. Metering – billing in IT has typically been driven by transactions or fixed fees, but the Cloud treats resource usage the way the telecommunications industry treats data and voice: it bills based on usage, which allows for more affordable pricing models.
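The parallelized data access in item 2 can be sketched with a toy MapReduce-style word count. This is plain Python, not Hadoop's actual API; the function names and the use of a thread pool to stand in for distributed workers are illustrative assumptions:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def map_phase(chunk):
    # Map step: each worker counts words in its own slice of the data,
    # so no single reader is bottlenecked on the entire dataset.
    return Counter(chunk.split())

def reduce_phase(partials):
    # Reduce step: merge the per-chunk counts into one result.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

def word_count(chunks):
    # In a real cluster each chunk would live on its own data store next to
    # its own processor; here a thread pool stands in for those workers.
    with ThreadPoolExecutor() as pool:
        partials = pool.map(map_phase, chunks)
    return reduce_phase(partials)
```

The point of the split is exactly the one made above: no single node ever has to stream the whole gigabyte, because each mapper reads only its local slice.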
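Item 4, the telecom-style billing model, reduces to a simple per-unit calculation. The resource names and rates below are invented for illustration and do not reflect any provider's actual pricing:

```python
def metered_cost(cpu_hours, gb_stored, cpu_rate=0.10, storage_rate=0.15):
    # Usage-based billing: the customer pays per unit actually consumed,
    # rather than a fixed fee sized for peak capacity.
    return cpu_hours * cpu_rate + gb_stored * storage_rate
```

An idle month costs nothing, which is what makes the model more affordable than fixed-fee hosting for bursty workloads.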

So, Cloud Computing takes advantage of some of the advances made in computing and combines them with ideas and business models that have been working for the past forty years.

Contrarian or Realist?

On the last podcast with Dana Gardner, which will be released shortly, Dana commented on my oft-contrarian views on information technology. At the time, I took it as a compliment, but it definitely gave me something to consider about myself — am I a contrarian for contrarian’s sake, or am I just an experienced realist?

With all due respect to the analyst community, many are fine researchers, but even those who were once practitioners have been out of the game for some time, whereas I have taken great measures to ensure that I never get too far away from the actual implementation work. I believe the combined view from both inside and outside the IT universe is well-rounded and full of realism.

Perhaps my analyst counterparts know this, but fear being the messenger, because we all know the messenger gets shot. However, I have always prided myself on “calling it as I see it” and I don’t see that changing in the near future.

So, what exactly am I contrarian about? Here’s the short list:

1) I believe open source has obliterated the infrastructure software market and this will have negative implications within the next 5 to 10 years
2) I believe businesses will move away from, rather than toward, greater software quality, due to the higher costs and the appearance, created by a self-serving VC-backed contingent, that it should be cheaper
3) I believe that containers and application servers are responsible for the creation of an entire portfolio of applications that rely too heavily on infrastructure and cannot be properly instrumented and monitored. Moreover, I don’t believe you can know everything that is required about the operations of the application running inside the container by watching the container. It’s like watching a building from the outside and expecting to know what all the people inside are doing.
4) I love learning new programming languages, but I believe the plethora of them makes it difficult for businesses to manage their portfolio of applications as an asset
5) Building on #4, I don’t believe businesses do enough to manage their in-house applications as a proper asset
6) I believe architecture is essential for developing high-quality systems and I also believe that this point is so intangible that it will never be provided appropriate funding from within businesses

That’s just a short list. There are probably more, but the idea was to put enough out there to gain feedback. Is this contrarian or realism?