Adapting to new technology is a time-honored enterprise tradition, and at times it seems that IT is just as addicted to trends and fads as the average teenager. But at some point, the latest gizmo needs to fit into the larger picture of expanding data capabilities and lowering costs in order to improve the overall data experience for the business.
For many organizations that have embraced the cloud, that point is approaching quickly. So if you don’t have a working strategy to guide your cloud’s expansion and overall utility by now, you’ll need to develop one soon.
The biggest danger in not having an effective cloud strategy is the rise of disjointed, disconnected resources that simply compound the problem of silo-based architectures within the data center. As tech blogger Andre Leibovici notes on myvirtualcloud.net, public clouds in particular still cannot deliver the stability, availability and security that enterprises require, and if you are not careful, you could end up spending more for these services than it would cost to deploy them in-house. So at the very least, the enterprise needs to clearly assess which applications and workloads are best suited to the cloud and which ones need to stay close to home.
Indeed, says Gartner’s Thomas J. Bittman, the best thing to do now is to identify your digital requirements – both for today and in the future – and then select the right mix of public/private/hybrid and SaaS/PaaS/IaaS architectures to meet those needs. No single cloud solution is the cheapest for every application, so if reducing costs is your goal, you’ll need to know what type of cloud works in any given circumstance. Flexibility and greater user autonomy over infrastructure and resources are also key goals, but there is a fine line between empowerment and anarchy. In short, the cloud can do many amazing things, but it still must fit within the overall data framework that provides the best cost/performance ratio.
None of this can happen without adequate benchmarking, says Gigaom’s Paul Miller. The cloud is a highly dynamic environment with wide variations in price and performance between competing public providers and local private solutions, and there is a clear lack of consistency in the specifications that solutions providers use to draw comparisons with one another. So the best thing to do is test real-world applications across multiple platforms in order to gain a comprehensive view of the performance characteristics experienced by specific workloads. And remember, just because one cloud outperforms another in one scenario doesn’t mean it will outperform it in all cases.
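To make that kind of benchmarking concrete, here is a minimal sketch in Python of the underlying idea: run the same workload repeatedly against each candidate platform and compare median latencies rather than single runs. The provider names and workload functions here are placeholders invented for illustration; in practice each would invoke your real application on the actual provider.

```python
import statistics
import time

def benchmark(workload, runs=5):
    """Time a workload several times and return the median latency in seconds.

    Using the median rather than a single run smooths out the variability
    that is typical of shared cloud infrastructure.
    """
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

# Hypothetical stand-ins for the same workload deployed on two platforms;
# real tests would call the application as deployed on each provider.
def workload_on_provider_a():
    sum(i * i for i in range(10_000))

def workload_on_provider_b():
    sum(i * i for i in range(20_000))

results = {
    "provider_a": benchmark(workload_on_provider_a),
    "provider_b": benchmark(workload_on_provider_b),
}

# Rank platforms by measured latency for this one workload only --
# a different workload may rank them differently.
for name, latency in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name}: {latency * 1000:.3f} ms (median of 5 runs)")
```

The ranking this produces applies only to the workload tested, which is exactly the article’s point: repeat the exercise per workload before drawing conclusions about any provider.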
Likewise, adopting the correct cloud posture will depend largely on the state of your existing infrastructure, says Tangoe VP Russ Loignon. As he explains to Tech Radar, everything from the nature of server, storage and networking hardware and software, to the degree to which you have embraced other technology developments like mobility and collaboration, should factor into the decision-making process. Certain functions, namely security but also management and governance, must permeate the entire data stack, especially if it extends beyond the data center walls. That means minute details about provider infrastructure, such as the use of shared ports and multitenant resources, can have a big impact on key metrics like performance, reliability and security.
The good news is that many organizations are starting to take their cloud deployments seriously, bringing the Wild West aspect of the industry to a close. Going forward, the novelty and excitement that has fueled the cloud will diminish, but its value as a data platform will increase.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.