According to recent reports, the Data Center Infrastructure Management (DCIM) movement is finally poised to hit the enterprise mainstream, driven largely by the need to scale resources to match burgeoning data loads while keeping a lid on operating costs.
The chief culprit on the cost side of scale-up/out architecture, of course, is energy consumption. A key capability of modern DCIM platforms, then, is the ability to direct power toward active components while leaving idle resources just enough to maintain their ready status.
MarketsandMarkets’ latest report on DCIM predicts a $3.14 billion market as early as 2017, which would represent an annual growth rate of 47.33 percent from today’s roughly $307 million. This growth will be driven largely by the banking and financial industries—particularly in North America—although Asia and the Pacific Rim are poised to ramp up adoption in the near future as well. Beyond better management of data and energy needs, DCIM also provides critical insight into the inherent value of existing data center assets and into how facilities and data environments can be modified and upgraded to meet future expectations.
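As a rough sanity check on those figures (a minimal sketch; the variable names are ours, and the base-year value is the article's "roughly $307 million"), compounding at 47.33 percent per year does take the market from $307 million to $3.14 billion in about six years:

```python
import math

current = 307e6   # today's market size, USD (per MarketsandMarkets)
target = 3.14e9   # predicted market size, USD
cagr = 0.4733     # compound annual growth rate quoted in the report

# Solve current * (1 + cagr)^n = target for n
years = math.log(target / current) / math.log(1 + cagr)
print(round(years, 1))  # ~6.0 years of growth at that rate
```

In other words, the 2017 prediction implies the report is measuring growth from roughly 2011 onward.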
Already, many DCIM vendors are adding predictive modeling and other tools not only to help enterprises run the infrastructure they have, but also to provide a more accurate guide to what they will need in the future. Future Facilities’ latest release of the 6SigmaDC platform, for instance, provides up to a six-fold increase in predictive modeling performance, plus connectivity to numerous third-party monitoring tools to enable fluid change management in highly dynamic workflow environments. In this way, organizations are better able to reduce risk and investigate more options when revamping systems and architectures to accommodate new data requirements.
In fact, the ability to plan for the future should be one of the top requirements when evaluating any DCIM solution, says Nlyte’s Matt Bushell. As organizations boost their capabilities to handle Big Data and new mobile data streams, DCIM enables a more proactive approach to infrastructure development, as opposed to the largely reactive posture that the enterprise has employed in the past. In the future, data centers will need to be not just scalable, but also agile, and DCIM provides the means to tweak both data and energy resources as new usage patterns arise.
Still another sign that DCIM is entering a more mature phase is the rise of new solutions targeting key vertical markets. Switzerland’s ABB, for example, recently released the Education Edition of its Decathlon platform, which the company says will help learning institutions meet the growing demand for mobile applications and massive open online courses (MOOCs), many of which feature audio, video and other rich media content. The system provides real-time power and cooling monitoring, as well as regular reports detailing energy-saving opportunities and other suggestions to drive the efficient use of resources. It also comes with a special pricing structure for accredited institutions.
As you would expect, a DCIM solution is most effective in large facilities, but if predictions regarding cloud computing and hosted services hold up, the consolidation of data center resources into large plants and even regional, utility-style operations is the wave of the future. The need for scale, then, will be paramount, but the enterprise can’t simply build out data infrastructure as it is configured today: the costs and complexity would be too great.
Future upgrades must be done with more coordination, and with a sharper eye on total operating costs than anything we’ve seen before.