Beyond the Public-Private Cloud Divide


    The cloud is getting bigger, the data center is getting smaller, and it would seem these two trends are destined for one conclusion: migration of virtually all enterprise workloads to third-party infrastructure.

    But with such a broad and diverse IT ecosystem in the world today, is such an absolute transition inevitable? Or is it more reasonable to assume that while most data activities will move to the cloud, the most critical workloads will remain behind the firewall, substantially increasing the value of owned-and-operated infrastructure?

    There certainly is no shortage of voices calling for an all-cloud infrastructure, even for highly regulated industries like banking and health care. As Stephen Garden of consulting firm CorpInfo told SiliconANGLE recently: “Any company that is launching today — they would never consider building a traditional data center.” Since these are the start-ups that are disrupting traditional industries with nimble, data-driven business models, it stands to reason that established firms should get on the bandwagon as well, before they are left in the dust. And indeed, Garden says many of his established clients are reaching the same conclusion: that the cost of building and maintaining on-premises infrastructure simply does not produce an adequate return.

    Contrast that with the views of Dana Epp, CTO of IT management developer Kaseya, who points out that hyperconverged, private cloud infrastructure is turning data center cost models on their heads. While no one is likely to build an old-style, silo-based data center anymore, there are plenty of good reasons for many critical applications to stay on-premises, where it is now entirely possible to provide flexible, dynamic resources at scale without blowing the budget. In Kaseya’s case, the company still operates two main data centers outfitted with technologies like Fusion-io’s flash-based ioMemory modules, which deliver server performance that cannot be had in the public cloud at a reasonable cost.

    Indeed, says Fortune’s Barb Darrow, many smaller companies that begin as all-cloud entities quickly find that the cost-benefit equation starts to break down once scale hits a certain level. Dropbox discovered this not long ago and has now migrated some 90 percent of its operations from Amazon to local infrastructure. This points to one of the more salient risk factors in the cloud: With workloads and data requirements changing so frequently, the cloud that works in year one may not be the best solution by year three, which can force a costly and complicated migration just to maintain an optimal data environment.

    This is part of the reason why debating public versus private cloud misses the point, says WHIR’s Bill Kleyman. What’s needed is an overarching cloud strategy that stresses key operational requirements like resiliency and availability. In this way, business objectives take precedence over infrastructure and technology, allowing IT and business-line executives to define parameters like security, backup and interoperability according to what is needed rather than what is available from a given set of resources. Whether the provider is a third party or the enterprise itself, organizations should insist on flexibility and fully developed feature sets before placing any application or data onto newly provisioned infrastructure.

    As the cloud matures, expect the lines to blur between public, private and even hybrid infrastructure. The entire enterprise stack is migrating to a service-based work environment sitting atop software-defined infrastructure, so the exact location of data resources will not be as important as performance and risk mitigation.

    The best solution will be the one that works, not the one that is close to home or out in the field.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

