Data Centers: To Build or Not to Build


    Decisions regarding the building and operation of the data center have been the purview of the CIO since, well, since the beginning of IT as we know it. Only lately, however, has the industry confronted the question: Do you want to have a data center at all?

    On the surface, it seems that the data center construction industry is firing on all cylinders. According to ResearchandMarkets, the data center construction market is on pace to see nearly 22 percent annual growth through at least 2018. A quick peek under the hood, however, reveals that much of the boom is driven by heightened demand for colocation and cloud services. It could very well be, then, that the boom of the next few years will produce large, regional, modularized infrastructure that stunts the need for local enterprise infrastructure, ultimately putting the owned-and-operated (O&O) market into a tailspin.

    Indeed, top platform providers are already working toward this future, in part by blurring the line between "own" and "lease." HP, for example, just launched a facility-as-a-service (FaaS) program designed to cut the cost of full construction while allowing the enterprise to continue managing its own infrastructure. Under the program, HP builds a modular data center to client specifications and then leases it back on a monthly basis. The enterprise retains full control of management, security and other operational functions, while HP takes care of maintenance and upgrades.

    Many systems integrators, meanwhile, are turning toward software-defined systems and infrastructure as the next major growth area. Wharfedale Technologies (WFT), for one, recently launched a program aimed at migrating legacy infrastructure to new software-defined data center platforms based on EMC and VMware technology. Much of the transformation will take place on existing infrastructure, although a software footing will likely produce greater ties to external infrastructure while driving efficiency and utilization of local hardware to new levels. Either way, the result is the same: more of the load shifted to the outside, with a smaller internal footprint reserved for sensitive or critical data and applications.

    There is also a renewed push for open source technology in the enterprise data center. Dell and Red Hat teamed up recently to foster the OpenStack platform for the private and hybrid cloud. The companies are also working on a new platform-as-a-service offering based on Red Hat's OpenShift initiative, designed to entice developers into the world of open source app development. The intent is to drive as much commonality as possible across internal and external clouds, making them all but indistinguishable to the end user, particularly as new mobile and social platforms foster greater interoperability among diverse data environments.

    In the end, the enterprise data center is hardly doomed. On the contrary, the in-house infrastructure that remains will likely be of the highest quality, geared toward supporting the most high-value data sets and applications within the organization.

    But the bulk of the market will likely go toward the burgeoning cloud industry, which counts scalability as one of its primary assets.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.
