A Better Data Center for One and All


    No matter what happens in the data center over the next few years, whether it is software-defined networking, hyperscalability or a fully virtual data stack, it is comforting to know that the enterprise will come out the winner because infrastructure will be cheaper, more flexible and, in a word, better.

    This isn’t necessarily true for today’s crop of enterprise vendors, however. Fundamental changes like we are seeing now threaten to unleash entirely new concepts of what data is, where it ought to reside, and how best to leverage it. And some of these ideas might not conform to the business models of key suppliers.

    As I mentioned in a previous post, the biggest pain point going forward still looks to be in the server industry. With hyperscale infrastructure poised to become the next growth area, top-end data firms are more willing to purchase commodity ODM solutions than branded systems from HP, Dell or IBM. True, there will still be plenty of action from the traditional enterprise industry, but as more workloads migrate to the cloud, there will be less reason to keep investing in local infrastructure and more reason for regional cloud providers to go hyperscale.

    A ray of hope may still be had for vendors who specialize in unified communications and converged voice/data platforms, however. A recent survey from Nemertes Research indicates that more enterprises are leaning toward single-vendor solutions, an indication of the complexities involved in combining disparate systems under one yoke, says NoJitter’s Eric Krapf. While the desire for mixed solutions is still high, few organizations have the skills to operate in such an environment. The only unknown is whether middleware providers will be able to smooth out some of the wrinkles in multi-vendor environments, but the entire field is still too new to predict whether anyone will be successful.

    Meanwhile, a recent influx of start-ups is promising some interesting changes to the data center in other ways. One is Mesosphere, which has leveraged the open source Apache Mesos cluster manager to devise an open data center OS that the company says will simplify management, accommodate greater scale, and provide a friendlier environment for emerging applications like mobility and social networking. The concept essentially turns the data center into a giant computer capable of networking with other data center computers over the Web. For those not willing to make such a dramatic change, however, the platform can be deployed incrementally on select applications and infrastructure.

    Data Analytics

    Another is Primary Data, which is attempting to leverage data virtualization to lessen storage complexity. The company has snared Steve Wozniak as chief scientist and is planning to launch a system within the year that employs metadata to unify multiple storage media under a single domain without making changes to physical infrastructure. The metadata essentially maintains key information like file names and access control that can be tapped by the system’s custom hypervisor to enable rapid data retrieval and more efficient infrastructure management.

    With the decade nearly half over, the end-game of the great virtual/cloud transition is starting to come into focus. The once-lofty goal of a fully automated, anything-anywhere-anytime data environment is giving way to the messy reality that infrastructure will remain a complex entity with many moving parts, all of which need to be overseen by a human operator.

    But that doesn’t mean data infrastructure is out of fresh ideas. On the contrary, those with vision will continue to devise more efficient, effective means of supporting key applications and architectures, and will be better able to mask the remaining complexity with software.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
