Making the Most of DCIM

Arthur Cole

It seems something of a paradox that Data Center Infrastructure Management (DCIM) platforms are gaining in stature even as the vast majority of enterprises are supposedly de-emphasizing local resources in favor of the cloud.

But the trend is clear: Run-of-the-mill enterprises are turning to every means necessary to reduce costs and improve efficiency within their on-premises infrastructure, while large cloud providers and hyperscale organizations have no choice but to balance workloads against resource consumption or watch their business models collapse under the weight and complexity of their own IT operations.

The challenge going forward, says International Data Corp. in a new report, is not simply to deploy DCIM but to weigh the various DCIM platforms against emerging goals and technology developments. Not all DCIM solutions are the same; in fact, few of them are. Some focus largely on asset management and connectivity, while others are geared toward critical infrastructure and facilities control. Some are software-only, while others introduce a mix of hosted services. Weighing the pros and cons will require a clear assessment of the nature of current infrastructure (is it converged, distributed or both?), as well as internal skill sets, plus future requirements in terms of scale, integration and automation.


Even today, too many organizations rely on manual processes to drive data center efficiency, according to a recent survey by Intel. The company queried 200 UK and U.S. data center managers and found that 43 percent have yet to introduce an automated resource management platform, including some operations that have scaled up to thousands of servers. The key inhibitor is the high cost of current DCIM platforms and the resources needed to run them, even though more than 40 percent of those surveyed said they already spend 40 percent of their time on tasks like capacity planning. In addition, about 60 percent say heat and other issues are diminishing performance, even though the use of thermal sensors and other tools is widespread.
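That last finding is the crux: sensors alone don't help if nobody acts on the readings. As a minimal illustration of what even basic automation looks like, the sketch below flags racks whose inlet air temperature exceeds a threshold. The function name, data shape and sample readings are all hypothetical, not taken from any vendor's product; the 27°C limit is the upper bound of the ASHRAE-recommended inlet range for typical equipment classes.

```python
# Hypothetical sketch of acting on thermal telemetry automatically.
# All names and readings are illustrative, not from any real DCIM product.

INLET_TEMP_LIMIT_C = 27.0  # upper bound of the ASHRAE-recommended inlet range

def flag_hot_racks(readings):
    """Return the IDs of racks whose inlet temperature exceeds the limit.

    readings: dict mapping rack ID -> inlet air temperature in Celsius.
    """
    return sorted(rack for rack, temp in readings.items()
                  if temp > INLET_TEMP_LIMIT_C)

sample = {"rack-01": 24.5, "rack-02": 29.1, "rack-03": 31.0}
print(flag_hot_racks(sample))  # ['rack-02', 'rack-03']
```

In a real deployment this check would run continuously and feed an alerting or cooling-control loop rather than a print statement, but the principle is the same: the value lies in the action, not the measurement.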

The key takeaway here is that all the data in the world is of little value if there is no way to act on it in a timely manner. This is part of the reason why Schneider and others are launching SaaS versions of their on-premises software platforms, to give organizations an easier way to deploy and manage DCIM capabilities according to operational needs. The company’s StruxureWare for Data Centers will see a hosted, pay-as-you-go version with the next release, version 8.0, due out in early 2016, which the company says will streamline both the deployment process and integration with legacy facility and resource management systems. At the same time, it should help lower costs and produce a better ROI for the enterprise.

The ROI question is a recurring theme throughout discussions on DCIM. According to the Uptime Institute, only 17 percent of enterprises expect a return on their DCIM investment within two years, while fully half don’t expect full ROI at all. Part of this may be due to the way in which DCIM information is leveraged: 75 percent of organizations look to the technology to solve space planning and environmental monitoring issues while only a quarter see it as a way to improve server utilization, oversee chargeback, or contribute to the overall financial analysis of data operations.
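The survey's two-year expectation maps onto simple payback arithmetic. The sketch below is illustrative only: the dollar figures are assumptions for the sake of the example, not data from the Uptime Institute.

```python
# Illustrative payback math; the dollar figures below are assumed
# examples, not figures from the Uptime Institute survey.

def payback_years(upfront_cost, annual_savings):
    """Simple payback period: years until cumulative savings cover the cost."""
    if annual_savings <= 0:
        return float("inf")  # the investment never pays back
    return upfront_cost / annual_savings

# e.g. a $200k DCIM deployment that saves $60k/year in power and staff time
print(round(payback_years(200_000, 60_000), 1))  # 3.3
```

Under these assumed numbers the payback lands beyond the two-year horizon, which is consistent with the survey's finding that most enterprises don't expect a quick return.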

Ultimately, the entire issue could become moot for the vast majority of enterprises. With most applications and services hosted in the cloud, any local infrastructure that remains will be fully modular and targeted toward highly specialized, and most likely extremely critical, functions.

But all data needs a place of its own in the physical world of servers, storage and networking, which in turn must be powered and coordinated to promote reliability, availability and efficiency. DCIM can still deliver on that promise, but the organizations responsible for infrastructure will have to figure out not only the best means of deploying and supporting the technology but the best way to leverage the information it provides.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.


