As the enterprise becomes more steeped in the cloud, greater attention is being paid to the real costs of moving workloads onto third-party infrastructure.
The prevailing attitude is that the cloud is cheaper than on-premises in just about every circumstance, and by a wide margin. But is this really true? And does that mean we don’t need to run the same cost-benefit analysis in the cloud to make sure we are getting an optimal return on our investment?
After a brief respite, it seems that the price-cutting has resumed among the top public cloud providers. AWS and Google both announced price cuts shortly after the new year, and now Microsoft is following suit for its Azure service. All three are playing fast and loose with the cost basis, however: the cuts usually revolve around service bundling, machine categories, automated tiering and a host of other factors that can cause actual prices to fluctuate wildly.
Despite these moves, however, it appears that cloud pricing is starting to stabilize, at least compared to the mad race to the bottom in 2014 and early 2015. According to Tariff Consultancy Ltd. (TCL), costs are down about two-thirds over the past two years, so an entry-level Windows instance can now be had for about 12 cents per hour. Part of this is due to the reduction in the number of pricing tiers that had cropped up earlier and largely served to confuse users over the deal they were getting. But it also has to do with the rise of private and hybrid clouds, which tend to foster enterprise-based buying decisions rather than individual deployments, and those decisions are driven more by integration and operational factors than by price.
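To put that figure in rough context, a quick back-of-the-envelope sketch (only the 12-cents-per-hour rate and the two-thirds reduction are the numbers cited above; the 730-hour billing month is a common convention, assumed here) works out like this:

```python
# Back-of-the-envelope check on the pricing figures quoted above.
# Only the ~$0.12/hour rate and the two-thirds reduction come from the article;
# the 730-hour month is a common billing convention, assumed for illustration.

HOURS_PER_MONTH = 730

current_rate = 0.12                       # USD/hour, entry-level Windows instance today
prior_rate = current_rate / (1 - 2 / 3)   # implied rate before the two-thirds drop (~$0.36)

print(f"Now:  ${current_rate * HOURS_PER_MONTH:,.2f}/month at ${current_rate:.2f}/hr")
print(f"Then: ${prior_rate * HOURS_PER_MONTH:,.2f}/month at roughly ${prior_rate:.2f}/hr two years ago")
```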
This leads us into the murkier realm of divining real value from the cloud, not just comparing prices. If one service is cheaper than another but fails to meet application and user needs, is that money well spent? Not by a long shot, says Forrester’s Sophia I. Vargas. In fact, the more familiar the enterprise becomes with the cloud, the more evident these uncomfortable truths become.
First, variable pricing that matches resources with workloads is usually the best cost option, although it is more difficult to maintain and track. Second, internal support costs do not disappear just because the cloud is in play. Third, cloud providers have an incentive to break up large monolithic apps across multiple virtual machines, which means they get higher utilization while you get poorer performance. Then there are the myriad hidden costs related to licensing changes, error rates and a host of other factors. And in the end, it’s not really the cost of the cloud that matters anyway; it’s the speed and agility, which are much harder to quantify.
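That first point is easier to see with a toy comparison. The sketch below uses entirely hypothetical rates and a made-up 40 percent utilization figure (none of it drawn from any provider's price list) to show why usage-matched pricing tends to win on paper, and why it only stays cheap as long as someone keeps tracking utilization:

```python
# Toy comparison of usage-matched (variable) pricing versus an always-on commitment.
# All rates and the 40% utilization figure are hypothetical illustrations.

HOURS_PER_MONTH = 730

on_demand_rate = 0.12    # USD/hr, billed only while the instance is running
flat_rate = 0.08         # USD/hr equivalent for an always-on, committed instance

busy_fraction = 0.40                           # workload actually runs 40% of the time
hours_used = HOURS_PER_MONTH * busy_fraction

variable_cost = on_demand_rate * hours_used    # pay only for hours used (~$35)
flat_cost = flat_rate * HOURS_PER_MONTH        # pay for every hour (~$58)

print(f"Variable: ${variable_cost:.2f}/month")
print(f"Flat:     ${flat_cost:.2f}/month")
# The variable option wins here, but only if someone keeps measuring busy_fraction --
# which is exactly the maintenance and tracking burden noted above.
```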
The good news is that the enterprise is being empowered with new tools to track and analyze cloud usage. Platforms like Cloudability are bringing advanced visibility and number-crunching to cloud deployments to give organizations a better idea of where and how their data is being handled. Cloudability recently acquired the self-service BI platform DataHero, which offers not only the ability to manage cloud usage but also to predict resource consumption and utilization as conditions change. The last thing any enterprise needs is to become largely or even wholly dependent on the cloud and then suddenly realize that the cost structure is not what it first appeared to be.
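As a rough illustration of the kind of forward-looking view described here, the following sketch projects spend from a short run of recent daily costs. It is a naive linear fit on made-up numbers, not Cloudability's or DataHero's actual methodology:

```python
# Naive linear projection of cloud spend from recent daily totals.
# The daily figures are invented for illustration; real tools use far richer models.

from statistics import mean

daily_spend = [41.0, 43.5, 42.8, 45.1, 47.3, 46.9, 49.2]   # hypothetical last 7 days, USD

days = list(range(len(daily_spend)))
x_bar, y_bar = mean(days), mean(daily_spend)

# Least-squares slope: how quickly daily spend is growing.
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(days, daily_spend)) / \
        sum((x - x_bar) ** 2 for x in days)

# Extend the trend line over the next 30 days.
projected = sum(y_bar + slope * (d - x_bar) for d in range(len(days), len(days) + 30))

print(f"Trend: +${slope:.2f}/day; projected next-30-day spend: ${projected:,.2f}")
```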
So let the cloud giants play their pricing games. Ultimately, the cost per VM is of little consequence to an enterprise that is venturing into the cloud for the right reasons and with the right set of expectations.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.