One of the problems with developing entirely new data architectures like the cloud is that no one has a clear idea of the end game. Just about everyone these days wants to be on the cloud, but we are still struggling to define what, exactly, “the cloud” is and how to implement it.
Indeed, the schism between the public and private camps is as strong as ever, with the former dismissing private clouds as nothing more than automated virtualization and the latter deriding over-reliance on public resources as a recipe for disaster. And if you prefer hybrids? Well, you must be completely hopeless.
Lately, however, some voices are raising the possibility of a compromise. Rather than a simple black-and-white view of the cloud, perhaps there could be numerous shades of gray.
Take HP’s Christian Verstraete, for example. His take is that, sure, most private clouds feature virtualization and automation, but they also provide elements of self-provisioning, on-demand resource delivery and broad scalability (if not right away, then certainly once the private cloud infrastructure is sufficiently developed). In this way, a private cloud provides an OSSM (on-demand, scalable, self-provisioned and measured) environment, which qualifies as a basic definition of the cloud. To those who argue that true clouds must include third-party infrastructure, that is like insisting the sky can only ever be true blue.
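To make the OSSM checklist a little more concrete, here is a minimal sketch, in plain Python, of what an environment that is on-demand, scalable, self-provisioned and measured might look like at the API level. Everything in it, the ToyPrivateCloud class and its provision, scale and usage_report methods, is hypothetical and not drawn from any vendor’s product; it is an illustration of the criteria, not an implementation of them.

```python
# Illustrative sketch only: a toy, in-memory model of the OSSM criteria
# (on-demand, scalable, self-provisioned, measured). All names here are
# hypothetical and not tied to any real cloud platform or API.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class Instance:
    name: str
    cpus: int
    created_at: datetime = field(default_factory=datetime.now)


class ToyPrivateCloud:
    """A minimal stand-in for an OSSM-style private cloud environment."""

    def __init__(self, capacity_cpus: int):
        self.capacity_cpus = capacity_cpus    # scalable: the pool can grow
        self.instances: dict[str, Instance] = {}
        self.usage_log: list[str] = []        # measured: every action is metered

    def provision(self, name: str, cpus: int) -> Instance:
        """On-demand, self-provisioned: a user requests resources directly."""
        used = sum(i.cpus for i in self.instances.values())
        if used + cpus > self.capacity_cpus:
            raise RuntimeError("capacity exhausted; scale the pool first")
        inst = Instance(name, cpus)
        self.instances[name] = inst
        self.usage_log.append(
            f"{datetime.now().isoformat()} provision {name} ({cpus} CPUs)"
        )
        return inst

    def scale(self, extra_cpus: int) -> None:
        """Scalable: add capacity to the shared pool."""
        self.capacity_cpus += extra_cpus
        self.usage_log.append(f"{datetime.now().isoformat()} scale +{extra_cpus} CPUs")

    def usage_report(self) -> list[str]:
        """Measured: usage can be reported back for chargeback or show-back."""
        return list(self.usage_log)


if __name__ == "__main__":
    cloud = ToyPrivateCloud(capacity_cpus=8)
    cloud.provision("test-db", cpus=4)    # self-service, on demand
    cloud.scale(extra_cpus=8)             # grow the pool when it runs short
    cloud.provision("web-frontend", cpus=6)
    print("\n".join(cloud.usage_report()))
```

The point of the toy is simply that “private” and “OSSM” are not mutually exclusive: nothing in the sketch requires third-party infrastructure.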
But if the cloud cannot be clearly defined, how will you know if you have one? According to Datalink’s Kent Christensen, many organizations start out building a private cloud with the best intentions but get sidetracked along the way. Among the key pitfalls is the belief that the cloud will produce a magical world of unlimited data and resources with little or no oversight, and that the needs of all business units and users will be met with a few clicks of the mouse. The reality is quite different. Based on past experience, the conversion process is usually complicated, slow, and subject to the same resistance to change that has plagued just about every other technology innovation to have hit the enterprise.
Another point that those engaged in the private vs. public argument tend to miss is that the two architectures serve very different roles in the enterprise. It’s akin to buying your underwear but renting your tuxedo, says Joshua McKenty, CEO of Piston Cloud Computing. The private cloud (your underwear) is reserved for your most private areas, while the public cloud is the spiffy attire you display to the world. The private cloud can be as fancy and exotic as you like, but it will almost always play a hidden role that is nonetheless crucial to the smooth functioning of the enterprise organism.
But it is also true that even among private clouds, no one “right” answer exists. According to Forrester, private clouds can take many forms depending on their purpose. These include enhanced virtualization for general workloads, abstracted software layers for test/dev applications, public cloud-style internal infrastructure for rapid provisioning, and full-blown transformation clouds that sweep away most legacy infrastructure. Which of these models an organization pursues will depend very much on its specific goals and on the willingness of stakeholders to alter their current working environments.
This argument over what is and is not the cloud has been going on so long that many IT professionals consider it the natural order of things. But one has to wonder if getting caught up in building “a cloud” is relevant at all to the true needs of the enterprise.
At the end of the day, solutions matter, not names. Regardless of whether someone has placed the word “cloud” in the title, every proposal should meet the same criteria: Does it work? Does it solve problems? Can it be implemented quickly and easily? Does it cut costs or increase revenue/productivity?
If it can meet these and other requirements, it shouldn’t matter whether it conforms to someone else’s idea of the cloud.