Keeping Control of the Hybrid Enterprise

    Hybrid computing models are infiltrating enterprise data environments as organizations seek to leverage both public and private cloud infrastructure. But while this may seem to threaten the traditional in-house data center, it’s actually the outsourcing industry that has reason to worry.

    According to Gartner, hybrid infrastructure will feature prominently at 90 percent of data-driven organizations by 2020, driving a nearly three-fold increase in the cloud computing market to $68.4 billion. At the same time, spending on data center outsourcing (DCO) is expected to contract from today’s $55.1 billion to $45.2 billion. At the moment, DCO and infrastructure utility services (IUS) make up about half of the $154 billion data center services market, but this share is expected to drop to a third by 2020 as hosting and cloud-based Infrastructure-as-a-Service (IaaS) models gain in popularity.

    What this means is that while organizations continue to reduce their direct management of physical-layer infrastructure, they will reassume control of their higher-level data and services architectures. But this transition is not without its challenges. A recent study by 451 Research noted that management aspects like cost containment, data migration and security are top concerns in the hybrid cloud, and are producing the most divergent responses. Some organizations, for example, pursue multi-vendor strategies to address these difficulties, while others say they have greater success with single-vendor solutions. In addition, hybrid cloud adoption is being driven by distinct challenges within vertical industries and national boundaries, with some organizations vexed by erratic user demand while others are faced with limited compute and storage capacity.

    In many ways, these issues are the same as in the traditional data center, says Informatica’s Greg Hanson. Companies are constantly looking for the greatest level of agility in order to confront emerging business challenges, so the overriding concern in any data environment is how to get systems and architectures to work together effectively. Unfortunately, many cloud-facing organizations are responding to this challenge by recreating the same data silos that inhibit operations in the data center, only this time the data infrastructure is dispersed across the wide area network and is usually managed by business units driven by parochial concerns. This makes it imperative for IT to maintain a central view of all clouds, which should be easier to achieve than in the data center, since cloud silos are abstract, virtual constructs rather than hardware-defined ones.
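    To make the idea of a "central view of all clouds" concrete, the sketch below merges per-cloud resource inventories into one picture keyed by the business unit that owns each resource. Everything here is illustrative: the cloud names, the `Resource` fields and the `fetch_inventory` stand-in are hypothetical, not a real vendor API — in practice each fetch would call the provider's own inventory service.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    cloud: str   # which cloud the resource lives in
    kind: str    # e.g. "vm", "volume"
    owner: str   # business unit that provisioned it

def fetch_inventory(cloud: str) -> list[Resource]:
    # Stand-in for real per-cloud API calls; returns canned sample data.
    sample = {
        "public-a": [Resource("public-a", "vm", "marketing")],
        "public-b": [Resource("public-b", "volume", "finance")],
        "private":  [Resource("private", "vm", "engineering")],
    }
    return sample.get(cloud, [])

def central_view(clouds: list[str]) -> dict[str, list[Resource]]:
    """Merge per-cloud inventories into one view keyed by owner,
    so IT can spot siloed resources regardless of where they run."""
    view: dict[str, list[Resource]] = {}
    for cloud in clouds:
        for res in fetch_inventory(cloud):
            view.setdefault(res.owner, []).append(res)
    return view

inventory = central_view(["public-a", "public-b", "private"])
for owner, resources in sorted(inventory.items()):
    print(owner, [f"{r.kind}@{r.cloud}" for r in resources])
```

    The point of the design is that the merge key is organizational (owner), not physical (cloud): a silo shows up as an owner whose resources all sit in one place, visible at a glance regardless of which provider hosts them.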

    This, in fact, is driving the deployment of the software-defined data center (SDDC), which theoretically can occupy either local or third-party clouds depending on the needs of a given application or service. ZDNet’s James Sanders recently pointed out that as data resources become increasingly dispersed across virtualized infrastructure, SDDC management stacks will emerge as the primary means of corralling this hodgepodge of connected systems. Companies like Cisco and VMware are already reducing their reliance on traditional management systems in favor of SDDC platforms like ACI (Cisco) and the EVO SDDC Manager (VMware), while open solutions like OpenStack are evolving toward hypervisor- and even container-level automation.

    All of this is leading to a world in which the physical location of data resources won’t matter all that much. What counts is performance, which will largely be defined by qualities like agility and scale rather than raw computing power.

    The enterprise finds itself in the enviable position of being on the cusp of a truly responsive, highly adaptable data environment, but this will require a rethinking of the relationships between resources, applications, data and processes.

    In the future, then, the enterprise will be able to outsource its data infrastructure but exert greater control over its data operations.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
