Monthly Archives: February 2011

Scale is the Common Abstraction of Cloud Computing

In my experience, for there to be widespread adoption and agreement over a concept, the semantics surrounding that concept must be unique and discrete. If the definition is too broad, there will be too much opportunity for dissenting opinion. NIST’s definition of cloud computing doesn’t really answer the mail when it comes to delivering a definition for cloud computing. In my opinion, it’s nothing more than a bucket of attributes that have previously been associated with cloud computing, and attributes do not constitute meaning. To define cloud computing concretely, in a way that a majority can agree upon, we must identify the common abstraction that binds the definition in a unique way.

One thing the NIST definition does well is aggregate the most common elements that people associate with cloud computing today. The characteristics, delivery models and deployment models offer up data that can be analyzed to find that common abstraction that will help foster the definition the industry needs to differentiate cloud computing from other elements of computing that are similar. We need to answer the looming question, “what makes cloud computing different?”

I will not be attempting to put forth my recommendation for a definition in this blog entry. Instead, I am merely attempting to get to a common abstraction that can make sense of why these characteristics, delivery models and deployment models belong together.

For example, without my discovery of a common abstraction, I questioned why Software-as-a-Service (SaaS) is a legitimate aspect of cloud computing. I can clearly see how Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) relate to the essential characteristics, as they have a direct impact on providing elasticity and scalability. However, the relationship of SaaS to these other two seems contrived and vestigial; a leftover from the “cloudy” (pun intended) formation of cloud computing. Indeed, without the common abstraction, SaaS is nothing more than the Application Service Provider (ASP) model renamed. Moreover, SaaS clouds (yes, pun intended again) the understanding, since SaaS pertains more to “how a service is delivered” than to “what is in the service offering”. Furthermore, the concept of SaaS doesn’t immediately translate well when our focus is on private over public cloud computing. That is, an application served up in a private cloud is typically not considered SaaS.

Of note, I would propose that if we’re moving toward IT service management and delivering IT-as-a-Service (ITaaS), then any application served up by IT, be it public or private, should be considered SaaS. But that is a debate for another day; the key focus here is to find a common abstraction that allows the definition of cloud computing to hold together regardless of the deployment model.

So, to cut to the chase, for me, the common abstraction that allows the NIST-proposed definition to stand as a legitimate grouping of attributes is scale. What is different about cloud computing compared to other computing models? Cloud computing focuses on making scalability work at very large scales as part of delivering computing within a service model context. IaaS provides scalability of compute, storage, memory and network resources; PaaS provides scalability of application infrastructure; and SaaS provides scalability of users and licensing.
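To make that claim a bit more concrete, here is a minimal sketch (all names are hypothetical and illustrative, not any vendor’s API) of the idea that each delivery model scales a different unit of capacity, while the one behavior they share is the act of scaling itself.

```python
"""Hypothetical sketch: the common abstraction across IaaS, PaaS and SaaS is scale;
only the unit being scaled differs."""

from dataclasses import dataclass


@dataclass
class ScalableService:
    """A service whose shared behavior across delivery models is scaling."""
    name: str
    unit: str        # what this delivery model actually scales
    capacity: int    # units currently provisioned

    def scale_to(self, target: int) -> None:
        print(f"{self.name}: scaling {self.unit} from {self.capacity} to {target}")
        self.capacity = target


# The same abstraction (scale) applied to all three delivery models.
iaas = ScalableService("IaaS", unit="virtual machines", capacity=10)
paas = ScalableService("PaaS", unit="application workers", capacity=4)
saas = ScalableService("SaaS", unit="licensed user seats", capacity=500)

for service, demand in [(iaas, 50), (paas, 16), (saas, 5000)]:
    service.scale_to(demand)
```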

For years we have been able to scale linearly, but without infinite funds, space and power, the ability of a single entity to scale will eventually reach a point of saturation. At that point, continuing to scale would require assistance from external partners. However, without a framework and architecture to facilitate extending its own resources into those of its partners, the entity would still be at a loss to achieve its goal and continue to scale. Cloud computing provides us with that architecture and framework to reach beyond our own limits for scalability.
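In practice this is often described as bursting past a saturation point. The sketch below (the capacity figure and the place_workload helper are purely illustrative assumptions, not a real provider’s interface) shows the basic arithmetic: local capacity is consumed first, and anything beyond it can only be served by reaching into a partner’s or public cloud’s resources.

```python
"""Hypothetical sketch of scaling past a single entity's saturation point."""


def place_workload(units_needed: int, local_capacity: int = 100) -> dict:
    """Split a workload between on-premises capacity and an external burst pool."""
    local = min(units_needed, local_capacity)   # fill local capacity first
    burst = units_needed - local                # overflow beyond the saturation point
    return {"local_units": local, "burst_units": burst}


# Below the saturation point, everything runs locally...
print(place_workload(80))    # {'local_units': 80, 'burst_units': 0}

# ...beyond it, continuing to scale requires an external partner's resources.
print(place_workload(250))   # {'local_units': 100, 'burst_units': 150}
```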

Perhaps others can find other common abstractions that meet their needs in this regard, but for me, defining cloud computing requires us to answer what the key differentiator is between cloud computing and other computing models, e.g., client/server, distributed object computing, mainframe, etc. For me, that key differentiator is the ability to eliminate the saturation points of scaling computing resources.

The Game is Always Changing: A Non-technical Perspective on Cloud Computing

It’s not about what’s different; it’s about getting on board with the changes in the game or getting left behind.

People say, “What’s so different about Cloud Computing?”, “This is nothing more than managed hosting or mainframe computing redux”, “What’s old is new again.”  What was it that Gordon Gekko says to Jake on the subway in “Wall St.: Money Never Sleeps”?  Oh yeah, “a fisherman can always see another fisherman coming.”  Technically, Cloud may be perceived as only an incremental change over what already exists, but those who argue that Cloud is just hype have already missed the bigger picture—the game is always changing and this is the next major change.

In this case, the game has shifted from “infrastructure and applications matter” to “data matters.”  The players driving the Cloud revolution have set out to commoditize the rest of the computing universe so that all you should care about is the data and the services that operate on that data.  Computing hardware was already heavily commoditized, but until a few years ago software was still the realm of the wizards.  With the emergence of the cloud computing delivery models—IaaS, PaaS and SaaS—consumers now have value-based vehicles to select from to support their various computing needs, from do-it-yourself (DIY) to do-it-for-you (DIFY).  This is a realization that occurred to me just this past week: cloud computing brings IT into the realms of Pep Boys and Home Depot.

Those of us who come from engineering backgrounds often miss the subtle drivers that move the universe and may even question how these changes come about.  Let’s face it, you can’t talk about this scenario without referencing VHS vs. Betamax.  There is a group of engineers who have been employed by the market makers to bring about this revolution.  I regard this move as the chess players moving the pawns on the board.  These market makers see a world they could have a piece of if they could bring about the vision in their heads, so they employ smart people to push the bounds of what can be done and raise the bar to new extremes.

Do you think the Amazon Web Services cloud is perfect?  It’s a work in progress.  I guarantee that if you check service levels within a large-scale AWS deployment against current levels within well-staffed, well-managed data centers, you will see more failures in AWS than you have seen in the past five years in the data center; and that’s okay, because we’re in the midst of the game changing.  I had a CEO at one of my startups who used to say, “by the time we have 50 customers, our first 5 customers are going to hate us and leave us.”  The reality is that that group, no matter how specially you treat them because they were first, is going to live through significant pain working with you through a game change.

What’s really interesting about this change is the support from the public sector.  What has traditionally been a laggard in information technology is, in my opinion, turning out to be one of the strongest driving forces behind this game change; this really turns the concept of market makers on its head.  After all, it’s easy to comprehend where the game is going when Bill Gates, Larry Ellison, Jeff Bezos, etc. are the market makers, but the fact that the U.S. Federal CIO is also part of that group means that the scale of the market driving the change is considerably larger than anything we’re used to seeing.

While it may not be readily apparent, because market makers are primarily focused on the private sector, market direction is most often not influenced by acquisition policy, congressional politicking, military and intelligence confidentiality requirements, or Presidential races.  The landscape of this market, once filtered through the lenses of these and other factors, will have grown significantly faster and will offer a great deal more variation and value.  More importantly, and we are seeing this happen already, it will be more open and transparent than past market changes.

So, technically speaking, it doesn’t matter whether Cloud really offers a revolutionary change over what we already had, because in the minds of the masses the change is real and it’s happening now.  The question is, will you figure out your role in the changed game, or believe the game change will fail, leaving you unchanged?

Analysis of U.S. CIO Federal Cloud Computing Strategy

Yesterday, February 8, 2011, the Office of the U.S. Chief Information Officer released its Federal Cloud Computing Strategy.  In no uncertain terms, this strategy is focused primarily on one key factor—deliver more value for the same dollars spent.  Semantically, to me, this is a very different statement than building the strategy around lowering IT costs.  That is, from what I can discern, it is not the intention of this document to promote the use of Cloud Computing to reduce the overall spending on IT (at least not at this time), although one would hope that would be a positive side effect; instead, the intent is to spend the same allocated funds on Cloud Computing and receive greater value for that same spend.

This small, and perhaps arcane, difference between cost reduction and greater value is of critical importance when measuring this administration’s success in promoting a Cloud First strategy.  Depending upon which pundit or analyst you follow or hear speak, you may hear a different list of benefits from Cloud Computing.  Certainly, the list of benefits should be different for the public and private sectors and, within the public sector, each agency has its own mission, adding even more variance to the list of benefits.  However, this document transcends those variances in a way that allows all agencies to participate while limiting overall objections.  Hence, this document, if nothing else, accomplishes a monumental feat of delivering some practical vision while limiting the opportunity for its ideas and concepts to be easily dismissed.

Additionally, this document serves as a model that even the private sector can follow for adoption of Cloud Computing.  It is an example of establishing a solid foundation that includes a business justification, addresses security concerns, provides a decision framework for identifying appropriate services to move to the Cloud, illustrates successful use case examples, and establishes a model for governance.  The latter is a key and often overlooked requirement for developing successful Cloud Computing strategies.  The governance foundation section clearly identifies the roles of various departments and agencies with regard to leading direction for Cloud Computing adoption in the U.S. Federal government.  They are defined as follows:

  • National Institute of Standards and Technology (NIST) will lead and collaborate with Federal, State, and local government agency CIOs, private sector experts, and international bodies to identify and prioritize cloud computing standards and guidance
  • General Services Administration (GSA) will develop government-wide procurement vehicles and develop government-wide and cloud-based application solutions where needed
  • Department of Homeland Security (DHS) will monitor operational security issues related to the cloud
  • Agencies will be responsible for evaluating their sourcing strategies to fully consider cloud computing solutions
  • Federal CIO Council will drive government-wide adoption of cloud, identify next-generation cloud technologies, and share best practices and reusable example analyses and templates
  • The Office of Management and Budget (OMB) will coordinate activities across governance bodies, set overall cloud-related priorities, and provide guidance to agencies

All in all, this document is a highly reusable piece of literature that is difficult to find fault with.  I will admit I was expecting it to be a fluff piece that offered lofty goals and praised Cloud Computing simply because it’s “the hot buzzword of the month,” but one must give credit where credit is due; and in this case much credit is due to Vivek and team.  Good on ya!