It’s been an article of faith that the cloud is both cheaper and more convenient than traditional IT when it comes to addressing chronic resource shortages in the enterprise. After all, we wouldn’t even be discussing the technology if it didn’t offer such a favorable cost/benefit ratio for data users.
But as the tech industry becomes more adept at building and maintaining cloud architectures, the realization that this is not always the case is starting to sink in. While there are many instances in which the cloud does provide both lower costs and greater flexibility, it is by no means a cure-all for everything that ails the enterprise. Now that most organizations have gotten their feet wet in the cloud, expect the coming year to focus on figuring out what works and what doesn’t.
Matt Prigge, systems architect at SymQuest Group, notes on InfoWorld (registration required) that many enterprises are starting to take a hard look at the cost of their cloud deployments and finding that the savings are often less than they had hoped. That’s not necessarily a bad thing: even though the cloud may not offer the best price per byte, it allows the enterprise to match resource consumption to actual requirements far more closely. For small workloads this is effective because you no longer have to over-provision massive hardware infrastructure. Once data needs reach a certain point, however, on-premises infrastructure starts to look more reasonable.
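The trade-off Prigge describes can be sketched with some back-of-the-envelope arithmetic. The figures below are purely illustrative assumptions (not vendor quotes): a hypothetical cloud rate billed per GB-month against on-premises hardware that must be bought in large increments and amortized over its service life.

```python
# All prices are hypothetical assumptions for illustration only.
CLOUD_PRICE_PER_GB_MONTH = 0.10   # assumed cloud rate, $/GB-month
ONPREM_CAPEX_PER_GB = 1.50        # assumed hardware cost, $/GB
ONPREM_LIFESPAN_MONTHS = 36       # assumed amortization period
ONPREM_OPEX_PER_GB_MONTH = 0.02   # assumed power/cooling/admin, $/GB-month

def monthly_cost_cloud(used_gb: float) -> float:
    """Cloud: pay only for the capacity actually consumed each month."""
    return used_gb * CLOUD_PRICE_PER_GB_MONTH

def monthly_cost_onprem(provisioned_gb: float) -> float:
    """On-premises: pay for provisioned capacity whether it is used or not."""
    amortized_capex = provisioned_gb * ONPREM_CAPEX_PER_GB / ONPREM_LIFESPAN_MONTHS
    return amortized_capex + provisioned_gb * ONPREM_OPEX_PER_GB_MONTH

# Small workload: 200 GB used, but hardware only comes in 2 TB increments,
# so the cloud wins by avoiding over-provisioning.
print(monthly_cost_cloud(200))       # pay for 200 GB
print(monthly_cost_onprem(2000))     # pay for the full 2 TB either way

# Large, steady workload: at 50 TB fully utilized, on-premises pulls ahead.
print(monthly_cost_cloud(50_000))
print(monthly_cost_onprem(50_000))
```

Under these assumed numbers the cloud is cheaper until utilization of owned hardware gets high enough for amortized capital costs to undercut the per-GB cloud rate, which is exactly the crossover point the article describes.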
In many cases, the cost of the cloud extends beyond the cloud architecture itself. ZDNet’s Jamie Yap argues that SaaS is a perfect example. Sure, a simple SaaS program for basic enterprise applications is a very cost-effective approach. But if you want to integrate those applications into existing IT environments, you’ll need to adopt more cloud-like capabilities at home, which can require a significant investment depending on the current state of your infrastructure.
Of course, determining whether the cloud is truly saving you money requires fairly sophisticated measurement and analysis. And while most cloud providers offer their own management toolkits, there is a growing cottage industry among software developers intent on providing an independent view of what is happening inside public cloud services. Companies like Cloudyn and Scalr are betting that they can provide more effective means of lowering cloud costs and optimizing data flows than the cloud providers themselves. And in time-honored tradition, many of these developers are starting to mash up their offerings to provide full suites of cloud management, monitoring and deployment tools.
Ultimately, however, the IT industry will likely settle on an equilibrium between internal and external resources that provides the best data performance at the lowest cost, says Puppet Labs’ Teyo Tyree. This will probably consist of a different mix for each enterprise, based on user needs, legacy systems and other factors. But in the end, we are likely to see a broad homogenization of infrastructure built on a virtual architecture that allows organizations to mix and match various physical and logical platforms in the effort to drive down costs without sacrificing productivity.
In that way, then, the cloud is merely the latest phase in the ongoing struggle to derive greater value from the infrastructure investment. As the cloud becomes more of an integral part of data infrastructure, enterprise managers would do well to shift their focus away from simply “getting on the cloud” and more toward devising the most efficient and effective data environment.