The vision of the cloud as a magical realm of limitless scalability and customized, on-demand data architectures still runs strong in the enterprise. This view is not altogether wrong, even if the reality is that many clouds, with varying levels of functionality, will be built to meet the demands of an increasingly diverse data community.
But no matter how the individual enterprise chooses to implement the cloud or what applications it deploys, the fact remains that, as with any other infrastructure expansion, the migration process will be lengthy and complicated.
The good news, though, is that the cloud industry is highly motivated to absorb as much of the existing enterprise data environment as possible. Already steeped in automated processes, it is working to take on the lion’s share of the migration burden using the latest software platforms.
Microsoft, for example, is returning to its roots – the operating system – as a means to first ease the transition from traditional infrastructure to the private cloud and then to foster greater reliance on hybrid and public resources as the comfort level with distributed resources increases. The new Cloud OS combines elements from Windows Server, Windows Azure, System Center, Hyper-V and other tools to provide a common platform for infrastructure, applications and data as they make their way across the data center, the Azure cloud and even third-party service providers. The platform leans heavily toward application and data portability, through multi-hypervisor management capabilities, service automation and rapid storage and network provisioning.
Cloud migration has also produced a cottage industry of sorts, made up of people who specialize in the often intricate dance of implementing new cloud solutions without interrupting normal business operations, or at least keeping the interruptions as unobtrusive as possible. One of these, Cloud Technology Partners, has even gone so far as to make its process available as a general-purpose SaaS solution. Dubbed PaaSLane, the system scans the source code of Java and .NET applications and then compares them to target cloud environments to identify patch inconsistencies, OS conflicts and other issues. While these problems still must be remedied by hand, PaaSLane at least allows the enterprise to determine which applications and data sets are ripe for which cloud deployments, and it can even point out potential source-code changes and other upgrades to make apps more suitable for the cloud.
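As a rough illustration of this kind of source-code scanning, the sketch below flags a few patterns that commonly complicate cloud moves: hardcoded local paths, direct filesystem writes and shell-outs to the host OS. The rules and names here are hypothetical and are not PaaSLane's actual checks.

```python
import re

# Illustrative cloud-portability rules (hypothetical, not PaaSLane's):
# hardcoded absolute paths, local file writes, and OS-specific shell calls.
RULES = {
    "hardcoded_path": re.compile(r'["\'](?:[A-Za-z]:\\|/(?:etc|var|home)/)'),
    "local_file_write": re.compile(r'\bnew\s+FileOutputStream\b'),
    "os_specific_call": re.compile(r'\bRuntime\.getRuntime\(\)\.exec\b'),
}

def scan_source(text):
    """Return (line_number, rule_name) findings for one source file."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for rule, pattern in RULES.items():
            if pattern.search(line):
                findings.append((lineno, rule))
    return findings

# A two-line Java fragment with a Windows path, a local write and a shell-out.
java_snippet = r'''FileOutputStream out = new FileOutputStream("C:\\logs\\app.log");
Runtime.getRuntime().exec("net start MyService");'''

print(scan_source(java_snippet))
```

A real readiness tool would of course parse the code properly rather than pattern-match lines, but the shape of the output, a list of locations tied to named portability rules, is the same.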
Cloud providers themselves are also teaming up with software developers to ease the migration process. California’s Nebula, for instance, recently joined forces with GigaSpaces Technologies to provide an OpenStack-based private cloud solution that can be deployed within a day and begin immediate app migration without code or architectural changes. The package uses GigaSpaces’ Cloudify platform to create what the company calls “application recipes” to automate the setup, monitoring, repair and scaling of mission-critical and Big Data applications like Hadoop, Cassandra and Elasticsearch, as well as database platforms like MySQL and PostgreSQL. A DevOps automation tool is also available to enable customized processes for the Nebula One cloud.
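The idea behind an "application recipe" can be sketched as a declarative description of a service's lifecycle that an orchestrator interprets. The toy below captures that pattern only; the real Cloudify platform uses its own recipe DSL, and every name here is illustrative.

```python
# A toy "application recipe": declares how to install, start and
# health-check a service; a tiny orchestrator runs the lifecycle and
# performs a simple self-repair (restart) when the health check fails.
# This illustrates the recipe concept only, not Cloudify's actual DSL.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Recipe:
    name: str
    install: Callable[[], None]
    start: Callable[[], None]
    healthy: Callable[[], bool]

def deploy(recipes: List[Recipe]) -> List[str]:
    """Run each recipe's lifecycle, restarting once on a failed check."""
    log = []
    for r in recipes:
        r.install()
        r.start()
        if not r.healthy():
            log.append(f"{r.name}: unhealthy, restarting")
            r.start()  # one simple repair attempt
        log.append(f"{r.name}: running" if r.healthy() else f"{r.name}: failed")
    return log

# A stand-in "mysql" service whose state lives in a dict.
state = {"mysql_up": False}
mysql = Recipe(
    name="mysql",
    install=lambda: None,                       # e.g. fetch packages
    start=lambda: state.update(mysql_up=True),  # e.g. start the daemon
    healthy=lambda: state["mysql_up"],          # e.g. probe the port
)
print(deploy([mysql]))
```

The value of the declarative form is that the same orchestrator loop can deploy, monitor and repair any service that supplies the three hooks, which is what lets a platform automate migration without per-application code changes.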
But no matter how automated the cloud migration process becomes, expect disruptions and outages to continue, says Cetan Corp.’s Mike Aquino. As he explained to CRN recently, even the best-laid plan is subject to human error and unforeseen circumstances, which is why it is necessary to have a backup plan and adequate system redundancy ready to step in when the inevitable occurs. As in any major project, preparation and due diligence go a long way toward smoothing out the wrinkles, but in the end no process is foolproof.
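In code, the "redundancy that steps in" amounts to something as simple as a failover wrapper: try the primary system, and on any failure route the request to a standby. The endpoints below are hypothetical stand-ins.

```python
# Minimal failover sketch: call the primary system, fall back to a
# standby on any failure. Both "endpoints" are hypothetical stand-ins.
from typing import Callable

def call_with_failover(primary: Callable[[], str],
                       backup: Callable[[], str]) -> str:
    """Try the primary; on any exception, serve from the backup."""
    try:
        return primary()
    except Exception:
        return backup()

def flaky_primary() -> str:
    raise ConnectionError("primary cloud region unreachable")

def standby() -> str:
    return "served from standby"

print(call_with_failover(flaky_primary, standby))
```

Production failover adds health checks, timeouts and state replication on top, but the principle is the same: the backup path exists and is exercised before the outage, not improvised during it.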
None of this, however, seems likely to dampen the eagerness with which the enterprise will embrace the cloud. Even the worst migration is but a temporary inconvenience in the transition to the new data paradigm. And after all, you can’t make an omelet without breaking a few eggs.