It's been a long time coming, and the results probably won't satisfy all, but the answer to the perennial question "What is the cloud?" may finally be in reach.
The National Institute of Standards and Technology (NIST) recently published its 16th and final Definition of Cloud Computing in which it concludes:
cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
There. Glad we finally got that settled.
Of course, any time you try to distill a complex infrastructure like the cloud down to a few well-chosen words, you leave a lot of wiggle room for those looking to capitalize on it. As TwinStrata CEO Nicos Vekiarides points out, the sheer scope and variety of cloud configurations leads to a number of key decisions when it comes to actual deployment. Not only are there public, private and hybrid designs to choose from, but many organizations are devising community clouds aimed at specific user sets or industry sectors. Different cloud types may also require different premises equipment. Vekiarides, natch, prefers the enterprise storage gateway and the hybrid storage array, like TwinStrata's CloudArray.
It also seems that there are numerous components within the cloud architecture itself designed to service individual aspects of the data environment. Hitachi Data Systems has identified three cloud tiers in its latest model. At one level you have the infrastructure cloud, made up mainly of underlying hardware and the virtualization stack. Then there is the content cloud, which resides mainly in storage and is tasked with keeping data and applications separate. Finally, there is the information cloud for handling large data sets. Under Hitachi's plan, you can build each one of these clouds individually as both your data needs and infrastructure evolve.
The type of cloud you deploy will depend greatly on the in-house infrastructure already at your disposal, says David Richardson of Emerson Network Power's Avocent division. Consequently, a thorough planning analysis should be your first step, focusing primarily on network connectivity, security and infrastructure capability. The last thing you want to do is introduce a set of services that, instead of boosting productivity, overloads critical resources.
It would be an overstatement to say that the quest for a working definition of the cloud has been a distraction from the real-world issues surrounding the technology. A broadly accepted basis of understanding is the first step toward a workable data environment.
Ultimately, however, it won't matter much if the agreed-upon definition includes words like "federated" or "dynamic" or any other prevalent buzzword. The primary task of the CIO is to devise an optimal data environment at low cost that can accommodate present and future needs.
If any given technology or system architecture can fill that bill, it shouldn't matter whether it's called the cloud or the sky or the murky black ocean.