The enterprise has been working out its cloud transition strategies for well over two years now, but it seems that many decisions regarding deployment and usage models are still being made blindly.
While it is true that the lack of real-world production experience makes it difficult to judge how the cloud will perform, the enterprise nevertheless seems ready to trust it with all forms of data before the basic characteristics of the technology are clearly understood.
Cost is a prime example. The common perception is that the public cloud is significantly less expensive than private clouds and provides greater scale and flexibility to boot. But a recent analysis by 451 Research suggests that the differences may not be all that dramatic. According to the group’s findings, an OpenStack private cloud distribution runs about eight cents per virtual machine per hour, slightly better than a commercial platform like VMware or Microsoft. Both come in far below the $1.70 per application hour that is common on the public cloud, or even the 80 cents per app hour available on Amazon’s Reserved Instances.
The caveat here is that the private cloud calculations do not include staffing and other factors that will bring the costs closer to parity. But at the very least they suggest that the public cloud is not a slam dunk when it comes to low-cost data infrastructure.
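To put those per-hour figures in perspective, the rough back-of-the-envelope calculation below compares monthly costs using the 451 Research numbers cited above. The 50 percent staffing overhead applied to the private cloud rate is purely an illustrative assumption, since the report does not quantify those additional factors.

```python
# Rough monthly cost comparison using the per-hour figures cited above.
# The staffing_overhead multiplier is an illustrative assumption, not a
# number from the 451 Research analysis.

HOURS_PER_MONTH = 730  # average hours in a month

rates = {
    "OpenStack private cloud (per VM-hour)": 0.08,
    "Public cloud on-demand (per app-hour)": 1.70,
    "Amazon Reserved Instances (per app-hour)": 0.80,
}

staffing_overhead = 1.5  # assumed 50% uplift on private cloud for staff and facilities

for label, rate in rates.items():
    monthly = rate * HOURS_PER_MONTH
    if "private" in label:
        monthly *= staffing_overhead
    print(f"{label}: ~${monthly:,.2f} per month per unit")
```

Even a generous overhead assumption leaves the private-cloud figure well below the on-demand public rate, which is why the comparison hinges on what actually gets counted.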
The sad fact is that it may be next to impossible to determine the true TCO of any cloud solution because of the constantly changing nature of the emerging digital infrastructure. As tech consultant David Linthicum pointed out on InfoWorld recently, multi-cloud architectures are becoming the norm, and they are in a constant state of flux in terms of workloads, pricing structures and key features. And once the cloud starts to accept the critical data loads that are currently housed within the data center, expect costs to rise by as much as 30 percent for the higher levels of service that will be required.
Many organizations are attempting to get a handle on their cloud costs through chargebacks and other tracking mechanisms that, by and large, are holdovers from the mainframe days. This works fairly well in the public cloud, says CIO.com’s Paul Gillin, but problems arise when private and hybrid clouds come online. With dynamic, automated VM provisioning and resource assignment in play, figuring out who is using what over any given period can become a management nightmare. Software modeling tools can help, but they usually rely on cost estimates and other analytical tactics that make it difficult to foster efficient resource consumption among the workforce.
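At its core, chargeback in a dynamic private or hybrid environment is an attribution problem: metered usage has to be rolled up to whoever requested the resources. A minimal sketch of that bookkeeping is shown below; the record fields, departments and rates are hypothetical, and a real deployment would pull this data from the provisioning or billing system rather than a hard-coded list.

```python
from collections import defaultdict

# Hypothetical usage records, as might be exported from a VM
# provisioning or metering system. Fields and rates are illustrative.
usage_records = [
    {"department": "marketing",   "vm_hours": 1200, "rate": 0.08},
    {"department": "engineering", "vm_hours": 5400, "rate": 0.08},
    {"department": "engineering", "vm_hours": 300,  "rate": 0.80},  # burst to public cloud
]

# Roll metered usage up to the department that requested it.
chargebacks = defaultdict(float)
for record in usage_records:
    chargebacks[record["department"]] += record["vm_hours"] * record["rate"]

for department, cost in sorted(chargebacks.items()):
    print(f"{department}: ${cost:,.2f}")
```

The hard part, as Gillin notes, is not the arithmetic but keeping the usage records accurate when VMs are provisioned and torn down automatically.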
And all of this is before we add risk measurement into the calculation. As Host Review’s Florence Taylor notes, risk factors rise and fall with the type and scope of data under consideration, so the value of functions like security, privacy, compliance and availability will ebb and flow with the workload. As well, risk can be accentuated or lessened by connectivity solutions, portability, integration with legacy infrastructure and a host of other issues. And then there is the quality of the service provider under consideration, the level of vendor/provider lock-in involved and the transparency that various clouds provide into their inner workings. This isn’t to say you can’t have a low-risk environment in the cloud, but you probably won’t get it with the sub-penny per GB offerings currently making the headlines.
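One way to fold those considerations into a cost model is to weight the raw hourly price by workload-specific risk factors. The sketch below is purely conceptual: the factor names echo the ones Taylor lists, but the weights are invented for illustration and would in practice come from an organization’s own security, compliance and availability requirements.

```python
# Conceptual risk-adjusted cost comparison. The weights are invented
# for illustration; they are not drawn from any published methodology.

def risk_adjusted_rate(hourly_rate, risk_weights):
    """Scale a raw hourly rate by (1 + weight) for each risk factor."""
    adjusted = hourly_rate
    for weight in risk_weights.values():
        adjusted *= 1 + weight
    return adjusted

dev_test_workload  = {"security": 0.05, "compliance": 0.00, "availability": 0.05}
regulated_workload = {"security": 0.30, "compliance": 0.40, "availability": 0.20}

base_rate = 0.80  # e.g. a reserved-instance style per-hour price
for name, weights in [("dev/test", dev_test_workload),
                      ("regulated data", regulated_workload)]:
    print(f"{name}: effective rate ${risk_adjusted_rate(base_rate, weights):.2f}/hour "
          f"on a ${base_rate:.2f} base")
```

The exact mechanics matter less than the principle: the cheaper the headline rate, the more scrutiny the risk multipliers deserve.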
To be sure, calculating costs in the traditional data center was never an exact science either. But with profit margins in most industries having become so incredibly tight in the digital economy, even slight overruns in the enterprise can be enough to turn the monthly ledger from black to red.
While most cloud solutions tend to outperform and undercut their legacy data center equivalents at the outset, it is important to keep tabs on the price/performance ratio over the long term because, as everyone knows, things can change quickly in the cloud.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.