Most of the talk surrounding the cloud these days centers on building the infrastructure, or transforming legacy systems, to support the new paradigm. It’s almost as if the cloud is like a shiny new car that does everything the old car does but with a new smell and no stains on the upholstery.
But a more accurate analogy is that moving to the cloud is like buying a car to replace a horse-drawn buggy. Not only is it faster and more comfortable, but it requires an entirely new skill set and a new fuel source to make it go.
In data environments, progress is measured in terms of application performance. And if there is one thing that cloud adopters have learned early on, it’s that simply porting legacy apps to the new environment is an exercise in futility.
The reason for this is clear, says Elias Khnaser, CTO of Sigma Solutions, in a blog for Virtualization Review. Traditional x86 applications assume that the underlying infrastructure is stable and reliable in the configuration the apps were designed for, so there is very little need to interact with that infrastructure on a regular basis. A cloud app can make no such assumption. It is thrown into a swirling mix of configurations and architectures, so it needs sophisticated coding that allows it to analyze its environment and assemble the proper resource sets for optimal performance. Is it possible to migrate legacy apps to the cloud? Of course, but without the proper visibility and control, performance will suffer and costs will rise.
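To make the contrast concrete, here is a minimal sketch of the two habits a cloud app needs that a legacy app typically lacks: resolving its resources from the environment at startup rather than assuming a fixed topology, and retrying transient failures rather than assuming the infrastructure is always up. The variable names (`DB_HOST`, `WORKER_COUNT`, etc.) are hypothetical, purely for illustration.

```python
import os
import time

def read_config():
    """Resolve resources from the environment at startup instead of
    assuming a fixed, pre-provisioned configuration. These variable
    names are illustrative, not a standard."""
    return {
        "db_host": os.environ.get("DB_HOST", "localhost"),
        "db_port": int(os.environ.get("DB_PORT", "5432")),
        "workers": int(os.environ.get("WORKER_COUNT", "2")),
    }

def with_retries(operation, attempts=3, delay=0.1):
    """Retry a transient failure with exponential backoff, rather than
    treating any infrastructure hiccup as fatal the way a legacy app
    designed for stable hardware might."""
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(delay * (2 ** attempt))  # back off and try again

config = read_config()
```

The point is not these particular helpers but the posture: configuration and availability are discovered and negotiated at run time, not baked in at design time.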
This is why app developers need to master four key concepts when writing for the cloud, says Rackspace’s Garreth Heath in a post for InfoWorld. First, you need to learn how to control hardware through code, most likely through the APIs made available by the host. Second, applications must be split into discrete services that can be utilized in a modular fashion by service-oriented architectures. Third, the app will need to interact with new container-style virtual platforms like Docker and Linux Containers (LXC). Finally, it will need to discover and utilize existing services without carrying lists of IP addresses or domain information.
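That last point, service discovery, can be sketched briefly: resolve a service endpoint by name at call time (for example, via a DNS record maintained by the platform) instead of shipping a static list of IP addresses with the app. The service name `billing.internal` below is a hypothetical example, and the fallback behavior is one possible design, not a prescription.

```python
import socket

def discover_service(name, port, default_host="localhost"):
    """Look up a service by name when it is needed, rather than
    carrying hardcoded addresses. Falls back to a default host if the
    name does not resolve (a design choice for this sketch)."""
    try:
        infos = socket.getaddrinfo(name, port, proto=socket.IPPROTO_TCP)
        # Take the first resolved address; a real client might
        # load-balance across all of them or honor TTLs.
        return infos[0][4][0], port
    except socket.gaierror:
        return default_host, port

host, port = discover_service("billing.internal", 8080)
```

Because resolution happens on each lookup, the platform can move or scale the service behind the name without the app ever carrying a list of addresses.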
This doesn’t have to be as daunting as it seems, says Dell’s Brandon Edenfield in a post for CIO.com. When incorporated into an overall modernization program, cloud-ready apps can be developed in parallel with server virtualization and cloud deployment efforts. The overriding goal in all of these developments is to shed the legacy, silo-based architectures that have hampered data performance for so long in favor of a more fluid, dynamic environment. Re-architecting apps to leverage virtual infrastructure makes it easier to tap into highly scalable cloud environments. At the same time, integrating legacy data sources into Big Data frameworks that are then deployed on the cloud removes the limitations of fixed storage and enables faster access and greater analytic ability.
Two key capabilities in this transformation are application performance monitoring and network optimization, according to Datapipe’s Richard Dolan. An app performance framework should include data collection, performance analysis and direct visibility into the cloud to ensure resources are being used efficiently while still providing optimal support. On the networking side, WAN traffic, available bandwidth and a host of other factors can inhibit performance even if the app layer and the underlying cloud infrastructure are in sync. A comprehensive approach to these issues is the best way to ensure that the enterprise is getting the maximum return for its cloud investment.
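The data-collection side of such a performance framework can be illustrated with a minimal sketch: have the app record per-operation latency samples so there is something to analyze and report. This is an assumption-laden toy, not any particular monitoring product; a real framework would ship these samples to a backend rather than summarize them in-process.

```python
import time
from statistics import median

class LatencyRecorder:
    """Minimal data-collection sketch: record per-call latency so the
    app can report how its operations are performing. Illustrative
    only; real APM tools export samples to a monitoring backend."""

    def __init__(self):
        self.samples = {}  # operation name -> list of durations (seconds)

    def timed(self, name, fn, *args, **kwargs):
        """Run fn, recording how long it took under the given name."""
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            self.samples.setdefault(name, []).append(
                time.perf_counter() - start)

    def summary(self, name):
        """Summarize collected samples for one operation."""
        data = self.samples.get(name, [])
        return {"count": len(data),
                "median_s": median(data) if data else None}

recorder = LatencyRecorder()
recorder.timed("sleep", time.sleep, 0.01)
```

Even this crude record gives the visibility the article describes: once latency is measured per operation, inefficient resource use shows up as numbers rather than anecdotes.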
Obviously, infrastructure will continue to play a key role in data operations going forward. Success or failure will not be determined by the ability to manage that infrastructure, but by the ability to devise applications that can best leverage available resources in pursuit of optimal performance.
This is a more hands-off approach to IT management, but with a little trial and error it should produce an enterprise environment that is more in tune with the emerging needs of a data-driven economy.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.