The data center is on a clear trajectory toward greater abstraction, greater resource distribution, and greater diversity in both the workloads it supports and the technologies it brings to bear.
All of this leads to an increasingly complex management challenge that pits the need for greater autonomy among users and applications against the needs of the enterprise to maintain data availability and security while keeping budgets under control.
According to Shay Demmons, executive VP of BaseLayer’s RunSmart software division, this challenge is compounded by the fact that most organizations are branching into new IoT and service-level data architectures that must reach back to legacy infrastructure for crucial data support. This calls for a “looking forward, looking backward” management approach that utilizes many of the same technologies driving the transition to digital services: sensor-driven data systems, advanced visibility, and intelligent automation that propel workflow management and resource allocation to the speed of modern business.
Data Center Infrastructure Management (DCIM) is likely to play an increasingly important role going forward as the consequences of downtime become more relevant to the bottom line, says Schneider Electric’s Henrik Leerberg. A recent study by IDC noted that the difference in performance between facilities equipped with holistic, dynamic management tools and those relying on traditional static methods is becoming evident. In addition to reducing human error in data operations, which is a leading cause of downtime, DCIM also improves coordination between IT and facilities systems like power and cooling, and it enables a more granular view of internal processes to help guide hardware and software deployment into the future.
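The cross-domain coordination described above can be pictured with a small sketch. The function below correlates per-rack IT power draw with cooling inlet temperature and flags racks drifting out of bounds before they cause downtime; the rack names, telemetry fields, and thresholds are illustrative assumptions, not taken from any vendor's DCIM product.

```python
# Hypothetical DCIM-style check: correlate IT load with cooling telemetry
# and flag racks whose conditions exceed safe thresholds. All names and
# threshold values here are illustrative, not from a real product.

def flag_at_risk_racks(telemetry, max_inlet_c=27.0, max_power_kw=8.0):
    """Return sorted rack IDs whose inlet temperature or power draw
    exceeds the given thresholds."""
    at_risk = []
    for rack_id, reading in telemetry.items():
        if reading["inlet_c"] > max_inlet_c or reading["power_kw"] > max_power_kw:
            at_risk.append(rack_id)
    return sorted(at_risk)

telemetry = {
    "rack-a1": {"inlet_c": 24.5, "power_kw": 6.2},
    "rack-a2": {"inlet_c": 29.1, "power_kw": 5.8},   # running hot
    "rack-b1": {"inlet_c": 25.0, "power_kw": 9.4},   # overloaded
}

print(flag_at_risk_racks(telemetry))  # ['rack-a2', 'rack-b1']
```

A real DCIM platform would pull these readings continuously from sensors and building-management systems rather than from a static dictionary, but the thresholding logic is the same idea at its smallest.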
Some enterprises are turning to data center modeling and visualization software to gain a clearer picture of what is and is not working in their facilities. A company called nuPSYS just released the nuVIZ system, which collates data from building systems, IT equipment and network connections to provide a 3D model of the data environment that operators can navigate to identify trouble spots. The system utilizes data from Excel lists, AutoCAD blueprints and Visio files to build 3D images, and it provides automated propagation of rack devices and a searchable database of IT systems and their properties. It also delivers an integrated view of data systems and related network topologies to support real-time inventory management, capacity planning and environmental monitoring.
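The collation step such a tool performs can be sketched briefly: merge device records from heterogeneous sources into one searchable inventory keyed by asset ID. The field names, source shapes, and helper functions below are assumptions for illustration only, not nuVIZ's actual data model.

```python
# Hypothetical sketch of collating heterogeneous records (spreadsheet
# exports, network scans, etc.) into a single searchable inventory.
# Field names and sources are illustrative assumptions.

def merge_inventory(*sources):
    """Merge device records keyed by asset_id; earlier sources win
    when the same field appears in more than one record."""
    inventory = {}
    for source in sources:
        for record in source:
            entry = inventory.setdefault(record["asset_id"], {})
            for key, value in record.items():
                entry.setdefault(key, value)
    return inventory

def search(inventory, **criteria):
    """Return sorted asset IDs whose records match all given fields."""
    return sorted(
        asset for asset, fields in inventory.items()
        if all(fields.get(k) == v for k, v in criteria.items())
    )

spreadsheet = [{"asset_id": "sw-01", "rack": "a1", "type": "switch"}]
net_scan = [{"asset_id": "sw-01", "ip": "10.0.0.5"},
            {"asset_id": "srv-07", "rack": "a1", "type": "server"}]

inv = merge_inventory(spreadsheet, net_scan)
print(search(inv, rack="a1"))  # ['srv-07', 'sw-01']
```

The real system layers 3D rendering and topology mapping on top of this kind of merged database; the sketch shows only the underlying inventory-and-query idea.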
With so much data migrating to the cloud, however, many organizations are finding that deeper insight into their own data centers is only part of the management challenge. Emerging cloud management stacks are allowing organizations to better determine how their distributed data environments are functioning and what resources they are consuming. For instance, Nutanix recently teamed up with Enterprise Data as a Service (EDaaS) provider Actifio to bring data virtualization to application and service deployments, allowing organizations to optimize their environments in cases where they lack visibility into underlying infrastructure. In this way, the companies say they can speed up the creation of production environments 30-fold, eliminate resource under-utilization, and reduce the spread of non-production data.
The dream of a fully integrated, all-in-one management stack has been a moving target for decades. And now that technology has finally evolved to the point where such a thing is at least conceivable for the data center, data environments are pushing past the cloud and into the distributed edge facilities of the IoT.
But the fact remains that the more dependent the enterprise becomes on digital processes and services, the greater care it must take to ensure its data environment is running properly and that any issues can be corrected before they affect performance.
It’s a never-ending game in which the very same technologies that are exacerbating the problem are also providing the solution.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.