
    Which Cloud Is Best? All of Them

    The public cloud is growing at a rapid clip, but then again so are private and hybrid architectures. So what does this mean for legacy data center infrastructure, and exactly how will future enterprises manage their application and data workloads across such a varied and dynamic resource environment?

    According to Gartner, public cloud service providers are on pace to top $260 billion in sales this year, an 18.5 percent increase over 2016. By 2020, the public cloud market is expected to top $411 billion, about a quarter of which will come from enterprises migrating key business applications off of on-premises infrastructure. Perhaps most interestingly, Gartner says that Platform as a Service (PaaS) architectures are finally showing some life, generating some $11.4 billion in 2017, up from $9 billion in 2016. Since PaaS is primarily viewed as a test/dev solution, this suggests the enterprise is shifting its use of the public cloud from bulk storage and application services to the creative development of new revenue channels.

    But is this growth for real, or is it just a blip coinciding with overall economic prosperity? Tim Crawford, of advisory firm Avoa, notes that many enterprises tend to ramp up their cloud deployments at first, only to pull them back once they hit a certain level of scale. This yo-yo effect is driven largely by the mindset governing legacy data center provisioning, in which organizations plan for things like 24/7 availability, peak utilization and redundancy. When these metrics are applied to the cloud, costs can exceed traditional infrastructure by a factor of four. And in many cases, the added latency, compliance and customization burdens in the cloud make on-premises the more appealing solution.
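
    The factor-of-four figure is easy to see with some back-of-the-envelope arithmetic. The sketch below is illustrative only; every price and utilization number in it is hypothetical, chosen simply to show how provisioning always-on cloud instances for peak load compares with paying only for actual demand:

```python
# Illustrative only: all prices and utilization figures below are
# hypothetical, not drawn from any vendor's actual rate card.

HOURS_PER_MONTH = 730

# Legacy data center mindset: reserve enough always-on instances
# to cover peak utilization, 24/7.
peak_instances = 40          # fleet sized for peak load
hourly_rate = 0.20           # $/instance-hour (hypothetical rate)
peak_provisioned_cost = peak_instances * hourly_rate * HOURS_PER_MONTH

# Cloud-native mindset: autoscale, paying only for average utilization.
average_utilization = 0.25   # workload averages 25% of peak
elastic_cost = peak_provisioned_cost * average_utilization

print(f"Provisioned for 24/7 peak: ${peak_provisioned_cost:,.0f}/month")
print(f"Scaled to actual demand:   ${elastic_cost:,.0f}/month")
print(f"Cost multiple: {peak_provisioned_cost / elastic_cost:.0f}x")
```

    Under these assumptions the peak-provisioned bill works out to four times the elastic one; the exact multiple will vary with real rates and workload shapes, but the direction of the gap is the point.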

    One thing that can help equalize this imbalance is containers, says CIO’s Dwight Davis. Since containers do not require a hypervisor or their own operating system, they can be deployed quickly and easily across different platforms and environments. This in turn allows the enterprise to reduce its consumption of cloud resources even as workloads continue to scale. And now that the container management landscape is starting to gel around tools like Kubernetes and Docker Swarm, organizations should have an easier time provisioning applications on best-available architectures rather than within the comparatively rigid constructs of traditional clouds.
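
    As a concrete illustration of the portability Davis describes, a minimal Kubernetes Deployment manifest is enough to run the same containerized workload on any conforming cluster, public or private; the workload name and image below are placeholders, not drawn from the article:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend        # hypothetical workload name
spec:
  replicas: 3               # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: nginx:1.25   # any OCI image; no hypervisor or guest OS needed
        resources:
          requests:
            cpu: 250m       # scheduler packs pods by declared resources
            memory: 128Mi
```

    Because the manifest describes only desired state, the same file can be applied with `kubectl apply -f` to a managed public-cloud cluster or an on-premises one, which is what lets workloads follow the best-available infrastructure.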

    When it comes to cloud vs. on-premises, however, the actual location of the application host is somewhat beside the point, says TechRepublic’s Keith Townsend. The more important question is where to put the data center control plane, and the industry appears to be shedding traditional software solutions in favor of SaaS. Recent platforms like Skyport Systems, HPE’s new Hybrid IT Stack and Platform9 provide highly flexible control over distributed resources, along with rapidly evolving security features and optimization tools. True, these systems come with a learning curve, but that has more to do with the intricacies of hybrid cloud management in general than with the fact that the management stack is hosted outside the data center.

    Regardless of whether architectures are public, private or hybrid, it appears that it won’t be much longer before the entire data stack, except perhaps for the most rudimentary aspects, is lifted onto a virtual/cloud plane. The flexibility of emerging applications and the demands of today’s users all but require a dynamic, scalable approach to infrastructure deployment and provisioning.

    But the decision to deploy any given application or service in-house or on third-party clouds will depend on multiple performance characteristics and operational goals, which means that above all, the enterprise should look to maintain a variety of cloud solutions as it transitions to new digital business models.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
