The data center of the future is on the mind of many a system architect these days. With data volumes on the rise, along with calls for more efficient and effective energy usage, designers must perform a tricky dance to satisfy all the competing interests surrounding infrastructure development.
For many, the obvious answer is modular construction. Whether housed in large shipping containers or traditional brick-and-mortar facilities, modular or converged infrastructure is seen as a convenient means of building low-cost, scale-out data environments.
While modularity is usually pitched as an energy-conscious decision, it bears noting that the bulk of the enterprise industry is still more interested in data-related functions like networking and availability. According to a recent Forrester survey of mid-market enterprises, connectivity, resilience and control are the top priorities in data operations. Specifics like carrier availability and density lead the selection criteria, followed by management and control architectures and access to the cloud. Issues like power and cooling rank lower on the agenda, which isn't to say they should be discounted, but they tend not to be first and foremost in the minds of CIOs. The top priority, as always, is providing reliable service to stakeholders.
So far, at least, modular construction is proving its worth as both a reliable and efficient model. Indeed, the biggest boosters of modular infrastructure seem to be the systems developers themselves. HP, for example, recently launched its Facility as a Service program built on its own Flexible DC and POD product lines. Part cloud service and part colocation platform, the program leases capacity to enterprise customers under five-year rolling service agreements, effectively converting data operations from the capex-heavy standard model that has been HP's bread and butter over the years to a purely opex proposition. The program offers data facilities in 20- or 40-foot containers, with energy loads ranging from 500 kW to 1,500 kW; custom designs are available as well.
Also moving into the containerized space is Schneider Electric, which this month plans to showcase its newest prefabricated data center to the energy-conscious European market at Datacentres Europe in Monaco. The portfolio, built on technology acquired from AST Modular, now numbers 15 separate IT/power/cooling modules and 14 reference designs, including a 45-foot enclosure with 10 full IT racks and the APC Symmetra PX 48 kW power supply. The enclosure also features remote monitoring, secure access, and in-row or block-based water cooling.
Modularity also dovetails nicely with existing data facilities. iFortress offers a panel-based modular data solution designed to fit within virtually any available space, providing a quick and easy answer to scale-out infrastructure challenges. The panels are two feet wide and four inches thick and can be deployed as server-ready components or in a fully sheltered configuration that is both air- and water-tight. They can be arranged in a variety of ways to make the most of available space while integrating seamlessly into a fully contiguous data environment.
As noted above, however, it is the mechanics of the data infrastructure, more than environmental efficiency, that drive most deployment decisions these days. Modular systems offer significant upfront advantages in cost and simplicity, but it remains unclear how well they hold up over the long haul.
By necessity, data infrastructure must become leaner and meaner over the next decade, but that will not come about at the expense of optimal performance.