Amid all the year-end stories and commentary highlighting the way the cloud is changing enterprise data environments, one key factor has been overlooked: The cloud itself is evolving, further complicating the long-held relationships between users, the enterprise and data itself.
On an infrastructure level, new types of clouds are continually coming online. Most providers, for example, pride themselves on their scale-out capabilities, offering clients ever greater numbers of virtual machines to handle increasingly complex data loads. Recently, however, firms like ProfitBricks have turned their attention to scale-up models that provide smaller numbers of larger VMs designed for higher-level number crunching and Big Data analytics. This vertical scaling approach is said to be more efficient, and it provides greater support for in-house data infrastructure by enabling more precise service-level choices and “virtual whiteboarding” for greater component customization.
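To make the scale-out/scale-up contrast concrete, here is a minimal sketch of the two provisioning strategies. The VM shapes (2-core/8 GB "small" instances, up to 48-core/192 GB "large" ones) are hypothetical illustrations, not ProfitBricks' actual catalog, and the functions are assumptions for this example rather than any provider's API.

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class VmPlan:
    vm_count: int
    cores_per_vm: int
    ram_gb_per_vm: int

def scale_out_plan(total_cores: int, total_ram_gb: int,
                   small_cores: int = 2, small_ram_gb: int = 8) -> VmPlan:
    """Horizontal scaling: satisfy the workload with many small, identical VMs."""
    count = max(math.ceil(total_cores / small_cores),
                math.ceil(total_ram_gb / small_ram_gb))
    return VmPlan(count, small_cores, small_ram_gb)

def scale_up_plan(total_cores: int, total_ram_gb: int,
                  max_cores: int = 48, max_ram_gb: int = 192) -> VmPlan:
    """Vertical scaling: use as few, and as large, VMs as possible --
    the model suited to number crunching and Big Data analytics."""
    count = max(math.ceil(total_cores / max_cores),
                math.ceil(total_ram_gb / max_ram_gb))
    return VmPlan(count,
                  math.ceil(total_cores / count),
                  math.ceil(total_ram_gb / count))

# A 64-core / 256 GB analytics job: 32 small VMs versus 2 large ones.
print(scale_out_plan(64, 256))  # VmPlan(vm_count=32, cores_per_vm=2, ram_gb_per_vm=8)
print(scale_up_plan(64, 256))   # VmPlan(vm_count=2, cores_per_vm=32, ram_gb_per_vm=128)
```

The trade-off the sketch surfaces is the one the article describes: the scale-up plan avoids the coordination overhead of dozens of small instances, which is why it appeals for memory-hungry analytics rather than horizontally partitionable web traffic.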
The cloud is also playing a major role in the ongoing consolidation of data center infrastructure, according to Pacific-Tier Communications’ John Savageau. At a time when large portions of existing IT infrastructure are reaching the end of their lifecycles and users are clamoring for greater collaboration and interoperability, the cloud is making it easier to shed inefficient systems even while it enhances the ability to consolidate data and applications into a single overarching framework. The end result is that as infrastructure becomes more cloud-like both in and out of the data center, the enterprise will see greater productivity and higher data utility.
The cloud is also beginning to shed its reputation as a largely shared infrastructure platform by providing more private services, says FuseApp’s Mike Maughan. Through virtual private suites, cloud providers give clients a physically and logically isolated private cloud that allows them to retain complete control over the infrastructure and full access to hypervisor-layer resources. In this way, the enterprise can scale its overall cloud presence up or down according to user and data needs. The concept is drawing high interest from mid-size enterprises in particular, which see it as a way to quickly build out private cloud capabilities without substantial up-front capital investment.
But the closer we get to the fully automated “software-defined data center,” the greater the danger that putting data environments on auto-pilot will have a detrimental effect on resource allocation, operational efficiency and overall performance. Data loads suddenly shifted to resources that lack adequate power, cooling and other physical support can bring operations to a halt very quickly. This is where the rising field of Data Center Infrastructure Management (DCIM) will come in handy, according to on365’s Chris Smith. By moving facility-level systems to an on-demand footing and then tying them to data automation tools, enterprises and cloud providers alike can ensure that SLAs are clearly defined and maintained even in highly dynamic environments.
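The DCIM idea described above, letting facility-level data veto an automated workload placement, can be sketched in a few lines. The rack model, the 10% safety headroom, and the `can_accept_load` gate are all illustrative assumptions, not features of any particular DCIM product.

```python
from dataclasses import dataclass

@dataclass
class RackFacilities:
    """Facility-level telemetry a DCIM tool might expose for one rack."""
    power_kw_capacity: float
    power_kw_in_use: float
    cooling_kw_capacity: float
    cooling_kw_in_use: float

def can_accept_load(rack: RackFacilities, load_kw: float,
                    headroom: float = 0.10) -> bool:
    """DCIM-style gate: admit an automatically migrated workload only if
    the rack retains a safety margin (here 10%) of both power and
    cooling capacity after placement."""
    power_ok = (rack.power_kw_in_use + load_kw
                <= rack.power_kw_capacity * (1 - headroom))
    cooling_ok = (rack.cooling_kw_in_use + load_kw
                  <= rack.cooling_kw_capacity * (1 - headroom))
    return power_ok and cooling_ok

rack = RackFacilities(power_kw_capacity=12.0, power_kw_in_use=9.5,
                      cooling_kw_capacity=14.0, cooling_kw_in_use=10.0)
print(can_accept_load(rack, 1.0))  # True: fits within both margins
print(can_accept_load(rack, 1.5))  # False: would exceed the power margin
```

Without a check like this, the automation layer sees only logical capacity; with it, an SLA can be expressed against physical constraints the facility can actually honor.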
It’s tempting to look at state-of-the-art technologies as the culmination of the evolutionary process. But whether we’re talking about technology or biology, evolution has proven time and again to be perpetual. The cloud, then, is simply part of this ongoing process, not its endpoint, which means that as soon as we are safely ensconced in the new data paradigm, the next big change will be right around the corner.