For much of enterprise history, the data center was a necessary evil that provided a vital function but was not very conducive to efficiency and organization. Businesses expanded their data footprints as best they could, but at the end of the day, the knowledge workforce had to content itself with living within its means.
That paradigm has been thrown out the window with mobile communications and the cloud, which have given workers a sense of entitlement that whatever data or services needed to do their jobs will be available somehow. This means the data center has to kick its capabilities into high gear or risk losing control of the enterprise’s prized possession: its data.
The result is that data centers are becoming even more unwieldy at a faster rate, which is wreaking havoc on capital and operational costs. But it is also leading to the rise of advanced management platforms that seek to drive greater efficiency from both physical and virtual infrastructure, with particular attention being paid to energy consumption.
Data Center Infrastructure Management (DCIM) is in fact a decades-old technology, but it has gained increased traction of late, both because modern platforms are better at matching power loads to data requirements and because the data industry is increasingly seen as a major consumer of fossil fuels. As Nlyte's Mark Gaydos points out, the rise of software-defined infrastructure, data mobility, Big Data and a host of other trends is fueling a perfect storm of demand for sophisticated tools that allow data footprints to expand while keeping costs under control. And as the front office increases its pressure on IT to exercise fiscal discipline, expect DCIM deployments to increase over the coming decade.
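The energy side of this story is usually expressed through Power Usage Effectiveness (PUE), the standard ratio of total facility power to IT equipment power that DCIM platforms track. A minimal sketch of the metric, using purely illustrative numbers rather than figures from any real facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT load.

    1.0 is the theoretical ideal (every watt goes to computing);
    legacy data centers often run well above 1.5, with the excess
    going to cooling and power delivery.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative figures only: a 1,500 kW facility powering a 1,000 kW IT load.
print(pue(1500.0, 1000.0))  # 1.5 -- half the IT load again is overhead
```

A DCIM platform's contribution is essentially to drive this ratio down over time by matching cooling and power provisioning to the actual load.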
While the fundamentals of DCIM will not change, says CA's Lara Greden, some of the techniques are due for an upgrade. DCIM is and will remain focused on optimization, availability, efficiency, capacity and sustainability, but the toolkits within the various platforms are starting to stress things like active management rather than simple monitoring, improved resource control, greater speed to accommodate app-driven architectures and, above all, a sharper focus on the business needs of the enterprise data infrastructure. In that vein, DCIM will not revolve solely around cutting costs, but around improving performance as well.
At the end of the day, however, DCIM has to prove its worth just like any other initiative, says Moiz Vaswadawala of Indian developer GreenField Software. This can be rather difficult to ascertain with a complex product like DCIM because its influence over data infrastructure is so diverse. On the savings side, there are the obvious benefits of reduced energy consumption and greater resource utilization, but many IT executives fail to consider the many productivity benefits that DCIM engenders, such as increased availability, improved planning and development and reduced downtime.
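The point about overlooked productivity benefits can be made concrete with a back-of-the-envelope business case. The sketch below is a hypothetical model, not a vendor's ROI methodology: it simply sums the direct savings the article mentions (energy, utilization) with a productivity line item (downtime avoided) that often gets left out.

```python
def dcim_annual_benefit(energy_savings: float,
                        utilization_savings: float,
                        downtime_hours_avoided: float,
                        cost_per_downtime_hour: float) -> dict:
    """Illustrative DCIM business case: direct savings plus the
    productivity gains (availability, reduced downtime) that many
    IT executives fail to count. All inputs are annual figures."""
    direct = energy_savings + utilization_savings
    productivity = downtime_hours_avoided * cost_per_downtime_hour
    return {"direct": direct,
            "productivity": productivity,
            "total": direct + productivity}

# Made-up numbers for illustration: $120k energy + $80k utilization
# savings, plus six downtime hours avoided at $50k/hour.
case = dcim_annual_benefit(120_000, 80_000, 6, 50_000)
print(case["total"])  # 500000 -- productivity outweighs direct savings here
```

Even with invented inputs, the structure illustrates the article's point: the downtime term can easily dominate the energy term, yet it is the one most often omitted.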
Indeed, many of the top platforms are starting to stress asset management as a key benefit of DCIM, as opposed to simple power efficiency. Geist DCIM, for example, recently launched the Environet Asset module, which delivers full life cycle management of data resources and granular 3D visualization with drag-and-drop connection mapping and configuration. This is backed by real-time reporting that allows data executives to quickly and easily optimize resource and data loads for both efficiency and productivity.
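The Environet Asset data model is not described in detail here, but the asset-management idea itself can be sketched as a simple registry: every device has an identity, a physical location in a rack, and connections to other devices, so that power and capacity roll up per rack. The names and fields below are hypothetical, chosen only to illustrate the pattern.

```python
from dataclasses import dataclass


@dataclass
class Asset:
    asset_id: str
    rack: str
    u_position: int      # rack unit where the device is mounted
    power_draw_w: float  # nameplate or measured draw in watts


class AssetRegistry:
    """Toy DCIM-style asset store (hypothetical model, not the
    Environet Asset schema): tracks devices and the connections
    between them, and aggregates power per rack."""

    def __init__(self) -> None:
        self.assets: dict[str, Asset] = {}
        self.connections: set[tuple[str, str]] = set()

    def add(self, asset: Asset) -> None:
        self.assets[asset.asset_id] = asset

    def connect(self, a: str, b: str) -> None:
        # Connection mapping: both endpoints must already be registered.
        if a not in self.assets or b not in self.assets:
            raise KeyError("both endpoints must be registered assets")
        self.connections.add(tuple(sorted((a, b))))

    def rack_power(self, rack: str) -> float:
        return sum(x.power_draw_w for x in self.assets.values()
                   if x.rack == rack)


reg = AssetRegistry()
reg.add(Asset("sw1", "R1", 40, 150.0))
reg.add(Asset("srv1", "R1", 10, 450.0))
reg.connect("sw1", "srv1")
print(reg.rack_power("R1"))  # 600.0
```

A real platform layers visualization, change tracking and real-time telemetry on top of exactly this kind of inventory-plus-topology core.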
DCIM will remain a significant undertaking for the enterprise because it incorporates not just the data side of infrastructure but the facilities side as well. Naturally, this is challenging on a technology level, but it is compounded by the fact that the people in charge of these areas often come from different educational backgrounds and lack a common set of terms and outlooks.
Many smaller organizations will no doubt look at the requirements of Big Data and other emerging trends and decide that a data center is more trouble than it is worth. But for those who need to keep data infrastructure in-house, it is becoming very difficult to scale resources to the levels needed by modern applications without an overarching management solution to make sense of it all.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.