The New Enterprise: More at Home in the Cloud?


    All of the talk surrounding the transition from traditional data centers to the cloud tends to focus on one thing: When will the enterprise feel comfortable about porting mission-critical applications and workloads to third-party infrastructure?

    This is a valid question, to be sure, but increasingly it seems to be missing the mark. Rather than wondering how the cloud will support today’s enterprise applications, we should be thinking in terms of emerging cloud-facing applications and how they will change the enterprise as we know it.

    This was one of the messages at Amazon’s re:Invent conference in Las Vegas last week. As Senior VP Andy Jassy noted, the company is seeing rapid adoption of key platforms like Aurora, its MySQL-compatible relational database service, which is growing even faster than the company’s Redshift warehousing service. The company has long touted its ability to support enterprise functions like raw storage and business intelligence, but lately it has been focusing on advanced analytics and other emerging functions that are starting to play an increasingly vital role in enterprise competitiveness.

    In this light, it’s hard not to look at large multinational companies like GE and wonder how much longer they will rely on the traditional data center for their constantly evolving business models, says Battery Ventures’ Adrian Cockcroft. GE is planning to shutter upwards of 30 of its own facilities and port their workloads, some of which are mission-critical, to Amazon. Even so, the company is looking at perhaps $100 million in capital expenses for the data centers that remain, even though utilization rates there are hovering around 20 percent. For companies under increasing pressure to trim expenses in order to remain competitive, that is a lot of money with relatively little to show for it.

    Is it reasonable, though, for any organization to push 100 percent of its workload to the virtual layer, where it can then be migrated to the cloud? Probably not. Even VMware ran into problems when it attempted such virtualization over the past year, with issues ranging from software licensing to the sheer size of workloads. This isn’t to say that VMware considers an all-virtual data environment impossible; however, it will take an extraordinary amount of fine-tuning to provide optimal support for emerging applications, particularly when it comes to storage and networking configurations between virtual machines and guest operating systems.

    One thing is certain, though: Whether we are talking about the enterprise data center or the hyperscale cloud mega center, the name of the game going forward will be efficiency. Data loads are simply too large these days to support 4x and 5x redundancy that leaves 70 percent or more of available systems sitting idle all day. But while the financial incentives are obvious, concrete steps toward greater efficiency are a little trickier to discern, says Expedient’s Bryan Smith, and many of the common solutions, such as virtualization and cold aisle containment, have already been implemented. That means the remaining options are often more expensive and produce lower gains on the back end, which further drives many organizations toward thinking that simply porting workloads to the cloud might make more sense.

    Again, though, most of these calculations center on finding the best way to support existing workloads rather than emerging ones. In the end, it may very well be that decisions regarding local data centers versus distributed cloud architectures will resolve themselves, more or less, as the applications that really matter to productivity and competitiveness will be more at home in the cloud than anywhere else.

    At that point, it won’t be a question of who produces the most efficient, cost-effective infrastructure, but who can structure their applications and workloads across all available resources to produce the greatest return on their investment.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
