Speed, flexibility, efficiency and lower operating costs — these are the primary strengths that the cloud brings to the data infrastructure table. Most providers emphasize these attributes in their argument that the cloud provides a better, cheaper alternative to the traditional data center.
However, this glosses over the fact that the cloud offers a more tantalizing possibility: remaking longstanding data infrastructure from the ground up, adding features and capabilities that can't even be imagined under static, siloed architectures.
A case in point is the disconnect between existing environments, even virtual ones, and the applications they support. Application optimization has been a key goal for many IT departments, but in practice it usually means force-fitting applications into legacy infrastructure. As the cloud evolves, however, it is becoming possible to envision a world in which operators, or the applications themselves, configure customized environments for individual functions: everything from processing, storage and networking to databases, middleware and ancillary services.
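To make the idea concrete, one purely hypothetical way to express this (none of the names below come from an existing product) is for an application to ship a declarative description of the environment each of its functions needs, which an orchestrator could then translate into real infrastructure:

```python
from dataclasses import dataclass, field

# A purely hypothetical way an application might describe the environment
# each of its functions needs; an orchestrator would translate these
# declarations into actual compute, storage, network and middleware.
@dataclass
class FunctionEnvironment:
    name: str
    cpus: int = 1
    memory_gb: int = 2
    storage_gb: int = 10
    network: str = "default"
    services: list = field(default_factory=list)  # databases, middleware, etc.

ingest = FunctionEnvironment(
    name="ingest",
    cpus=4,
    memory_gb=8,
    network="high-throughput",
    services=["message-queue"],
)

reporting = FunctionEnvironment(
    name="reporting",
    cpus=2,
    storage_gb=200,
    services=["relational-database"],
)

# Each function gets an environment tailored to it, rather than being
# force-fit into whatever infrastructure already exists.
for env in (ingest, reporting):
    print(env)
```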
This type of broad, dynamic orchestration has been talked about for some time, but only recently has it matured to the point where we can start talking about production enterprise environments. One of the first questions to come up is what sort of cloud is right for the application at hand. As Logicworks pointed out recently, web and Big Data application performance can be unpredictable on public clouds, depending on how virtual machine configuration and other attributes affect network resources. Private and hybrid clouds offer more control, but they cannot match the public cloud's elasticity when it comes to adding or scaling resources for data workloads.
But orchestration extends well beyond the choice of cloud architecture. As OpDemand notes, today's systems can automatically track application-resource dependencies across server, quasi-server and non-server infrastructure. Through configuration fields and other environment variables, applications can not only be deployed to the environment that suits them best, but can also coordinate more easily with one another through mutually beneficial architectures that are provisioned and dismantled as needs arise.
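As a rough illustration of how environment-driven configuration works in practice (the variable names below are illustrative, not drawn from OpDemand or any particular platform), an application might read its deployment parameters from the environment rather than hard-coding them, leaving the orchestrator free to wire each instance into whatever resources it has provisioned:

```python
import os

# Hypothetical environment variables; an orchestrator (or a developer's
# local shell) would set these per deployment rather than baking them
# into the application itself.
DATABASE_URL = os.environ.get("DATABASE_URL", "postgresql://localhost/devdb")
CACHE_URL = os.environ.get("CACHE_URL", "redis://localhost:6379/0")
WORKER_COUNT = int(os.environ.get("WORKER_COUNT", "2"))

def describe_deployment():
    """Report the resources this instance was wired into at deploy time."""
    return {
        "database": DATABASE_URL,
        "cache": CACHE_URL,
        "workers": WORKER_COUNT,
    }

if __name__ == "__main__":
    print(describe_deployment())
```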
Indeed, improved orchestration represents the next major goal among traditional infrastructure developers as they seek to maintain their relevance in the cloud. IBM, for example, has put the technology front and center in its effort to transition from infrastructure platform developer to integrated hardware/software/services provider. By supporting the OpenStack standard in its SmartCloud Orchestrator and other systems, the company is setting itself up to play a lead role in designing and delivering coordinated cloud environments even as the overall market continues to evolve and expand in unpredictable ways.
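For a rough sense of what OpenStack-style orchestration looks like, the sketch below expresses a minimal Heat-style template as a Python dictionary. The template is declarative: it names the resources to exist rather than the steps to build them, and the image and flavor values here are placeholders rather than names from any real deployment:

```python
import json

# A minimal sketch of an OpenStack Heat-style orchestration template.
# The image and flavor names are placeholders; a real template would
# reference images and flavors available in the target cloud.
stack_template = {
    "heat_template_version": "2013-05-23",
    "description": "Single web server attached to a declared network",
    "resources": {
        "web_server": {
            "type": "OS::Nova::Server",
            "properties": {
                "image": "ubuntu-12.04",   # placeholder image name
                "flavor": "m1.small",      # placeholder flavor name
                "networks": [{"network": "private"}],
            },
        },
    },
}

if __name__ == "__main__":
    # The orchestration engine consumes the declarative template; here we
    # simply render it so the structure is visible.
    print(json.dumps(stack_template, indent=2))
```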
Another example is VMware, which has long known that the server virtualization market will top out someday and that it will need a strong foothold in the cloud if it is to avoid obsolescence. Through support for software-defined networking (SDN) and advanced orchestration techniques in its vCloud platform, the company is pushing a vision of the software-defined data center (SDDC) in which full operating environments are built, employed and decommissioned entirely in software. Part of this plan involves building out its own public cloud, the vCloud Hybrid Service, which clients can use for burst capacity and other functions. However, it remains unclear how this will sit with VMware clients who happen to be cloud providers themselves.
The cloud is the biggest game-changer to hit the IT market in, well, ever. But its true value lies not in providing a better way to build the same old data environments, but in completely re-imagining the long-standing relationships among users, data, applications and infrastructure.
In this universe, users, not infrastructure, get to dictate what is and is not possible.