Why Predictions for the End of the Data Center Are Premature

Arthur Cole

Every once in a while, talk of the all-cloud data center starts to circulate throughout professional IT circles. While most people are quick to dismiss this notion, it is worth distinguishing between a fully cloud-based architecture at a given organization and the outright end of the traditional data center as we know it.

In short, many organizations will likely stick with in-house infrastructure for some time to come, but others could reap tremendous benefits by outsourcing their entire data environment, at least in the short term.

A case in point is Infor Inc., which built its software business entirely in the cloud and now specializes in application-centric business solutions that allow other organizations to do the same. The company claims that its lack of a data center lets it focus more of its energy on development and other business-facing concerns, giving it an edge against well-heeled competitors like SAP and Oracle. Infor utilizes an open framework and public providers like Amazon, and is looking to port some of its Big Data workloads onto Amazon's Redshift platform or possibly the IBM cloud. Company executives say that manpower costs alone are enough to deter them from building their own facilities for the foreseeable future.

But is cost alone enough to spur established enterprises to ditch their local resources? SiliconANGLE's David Coursey thinks so, and sooner rather than later. He predicts that the last American enterprise data center will close its doors by 2020, putting the entire data industry on a utility-style footing, much as manufacturing had switched to the electrical grid by the 1920s. I beg to differ, however, for a simple reason: data is not electricity. Data is personal, and its value varies based on what it represents and how it is aligned with other data. So a wholesale conversion of the entire data industry to the cloud is iffy at best, because there will always be pockets of information that organizations will want to keep very close to the vest.

This paradigm is evident in the latest research from BT Group. The company surveyed data professionals around the world and found that 76 percent cite security as their main concern when provisioning cloud-based services. This represents a 10 percent increase over a similar study in 2012, despite broad efforts among cloud providers to move past a number of high-profile losses earlier in the decade. Perhaps most ominously, 41 percent of respondents feel that cloud infrastructure is inherently insecure, suggesting that no amount of cajoling will convince them otherwise.

Of course, today’s cloud is not necessarily the cloud that will dominate, or even exist, in the long run. The current model has large single-entity providers like Amazon and Google delivering bulk services, with smaller players aiming to lure enterprise users with specialized services and broad support. But while nearly all of these players utilize large, scale-out data centers, researchers at the University of Bologna say they have devised a new set of protocols to enable peer-to-peer cloud networking, allowing workloads to be distributed across multiple resource sets. They say this enhances the reach of infrastructure to put data closer to users and provides greater security and resiliency by lessening dependence on a single cloud center. The system is essentially a tweak to existing file-sharing protocols, except that the sharing is done on the resource level, conceivably allowing cloud architectures to be distributed across legions of consumer compute, storage and networking gear.


So, to answer the question: Will we have all-cloud data centers soon? Yes. In fact, we have them now. But if the question is whether all data environments will be sent to the cloud, the answer is no, not even with private cloud architectures.

For an industry that is still hesitant to lift certain critical applications off of bare metal onto virtual resources, the idea of complete reliance on the cloud is still way ahead of its time.

While all the news coverage of the cloud may make it seem as if everyone and everything is leaving the data center behind, the reality is that many CIOs are just taking their first baby steps in that direction, and many are concluding that it is not necessarily the right infrastructure for every application.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

Oct 1, 2014 6:15 AM  Chip Freund says:
Arthur, you are spot on that not all workloads are going to be on public cloud in the next 5+ years. I believe that many enterprises will look to a combination of cloud services and colocation to address their IT infrastructure needs. Given the uncertainty around the migration of individual workloads to cloud, it is hard to build a reliable data center capacity plan at the enterprise level. By leveraging colo, businesses can still design, manage, and control their IT stack, but allow a service provider to deal with the facility, power, cooling, and connectivity. Not only does colo provide greater flexibility than in-house data centers, but enterprises can also benefit from the experience and deep expertise in facilities management that a top-tier colo provider brings to the table.
