New Tools to Visualize the Data Center

    If a picture is worth a thousand words, what is a good graphic visualization worth?

    For nearly the entire history of the data center, resource management has been a matter of interpreting various data sets regarding throughput and utilization and then making the necessary adjustments when the numbers get too high or too low.

    With the rise of virtualization and abstract layers of infrastructure, however, many organizations are turning to advanced visualization tools that allow them to see what is actually happening in their data environments. And as normal operating workloads give way to Big Data and IoT streams, this approach is starting to supplant earlier alphanumeric platforms.

    Fujitsu Labs recently pushed visualization down to the processor level with a system that calculates energy requirements for various workloads. Due to be presented in Beppu, Japan, next week, the platform is aimed at driving energy efficiency through an advanced power-control mechanism that determines software energy needs on a core-by-core basis. Ultimately, this should lead to the development of advanced power management software stacks that can function on such fine-grained operational levels as clock cycles and cache-hit percentages.
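    Fujitsu has not published the internals of its power-control mechanism, so the following is only an illustrative sketch, with made-up coefficients, of how per-core energy needs might be estimated from fine-grained counters such as clock cycles and cache-hit percentages:

```python
# Hypothetical per-core energy model. The coefficients below are assumed
# values for illustration only, not figures from the Fujitsu platform.

ENERGY_PER_CYCLE_NJ = 0.5   # assumed nanojoules consumed per active cycle
MISS_PENALTY_NJ = 10.0      # assumed extra nanojoules per cache miss

def core_energy_nj(cycles: int, cache_accesses: int, hit_rate: float) -> float:
    """Estimate the energy (in nJ) a core spends on a workload slice."""
    misses = cache_accesses * (1.0 - hit_rate)
    return cycles * ENERGY_PER_CYCLE_NJ + misses * MISS_PENALTY_NJ

def least_costly_core(samples: dict) -> str:
    """Pick the core whose current workload profile is cheapest to run."""
    return min(samples, key=lambda core: core_energy_nj(*samples[core]))

samples = {
    "core0": (1_000_000, 50_000, 0.95),  # busy, but good cache behavior
    "core1": (800_000, 50_000, 0.60),    # fewer cycles, far more misses
}
print(least_costly_core(samples))  # core1's misses make it costlier
```

    The point of such a model is that raw utilization alone misleads: in this sketch the busier core is actually cheaper once cache behavior is priced in.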

    Meanwhile, Puppet Labs has released an update of its data center automation system that gives it the ability to visualize infrastructure as code, in a bid to allow for a more DevOps-style approach to management, says Mike Vizard at Datacenter Knowledge. Using the Interactive Node Graph module, data center managers will be able to dynamically visualize the various infrastructure models that are defined within the Puppet Labs platform. In this way, the system can accommodate changes in workflow and other parameters faster and with far greater accuracy. At the same time, the company has unified the agent software in the open source version of the platform with the commercial version to simplify the transition to open data center automation.
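    The general idea behind visualizing infrastructure as code can be sketched independently of Puppet's own tooling: resources and their dependencies, once defined in code, form a graph that can be rendered. The resource names and relationships below are hypothetical, and the output is plain Graphviz DOT rather than anything from the Interactive Node Graph module:

```python
# Hypothetical code-defined infrastructure: each resource maps to the
# resources it depends on. Names are illustrative only.
infrastructure = {
    "package[nginx]": [],
    "file[/etc/nginx/nginx.conf]": ["package[nginx]"],
    "service[nginx]": ["file[/etc/nginx/nginx.conf]", "package[nginx]"],
}

def to_dot(resources: dict) -> str:
    """Emit a Graphviz DOT digraph of resource dependencies."""
    lines = ["digraph infra {"]
    for node, deps in resources.items():
        lines.append(f'  "{node}";')
        for dep in deps:
            lines.append(f'  "{dep}" -> "{node}";')
    lines.append("}")
    return "\n".join(lines)

print(to_dot(infrastructure))
```

    Feeding the resulting DOT text to any Graphviz renderer produces the kind of dependency picture that turns a wall of configuration code into something an operator can inspect at a glance.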

    Many organizations are also deploying in the data center the same sensor-driven data streams that populate complex environments elsewhere in the economy, so the need for visual representations of this data is strong. RF Code, a specialist in sensor technology, recently released the CenterScape component of its Workplace IoT Platform that, among other things, provides 3D visualization of data activity that can then be tied to a wide range of third-party management systems and business applications. In this way, organizations are able to do away with time-consuming processes like manual audits and resource management and provide more responsive approaches to issues like temperature and humidity fluctuation, power management and asset regulation.
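    Under the hood, responsive handling of temperature and humidity fluctuation reduces to threshold checks over a sensor stream, which a dashboard then surfaces visually. The sensor names, metrics and thresholds below are invented for illustration and have nothing to do with RF Code's actual API:

```python
# Hypothetical environmental thresholds (low, high) per metric.
THRESHOLDS = {"temp_c": (18.0, 27.0), "humidity_pct": (40.0, 60.0)}

def check_reading(sensor: str, metric: str, value: float):
    """Return an alert string if a reading is out of range, else None."""
    low, high = THRESHOLDS[metric]
    if value < low:
        return f"{sensor}: {metric} {value} below {low}"
    if value > high:
        return f"{sensor}: {metric} {value} above {high}"
    return None

# Simulated stream of (sensor, metric, value) readings.
readings = [
    ("rack-12", "temp_c", 24.5),
    ("rack-12", "humidity_pct", 63.2),
    ("rack-07", "temp_c", 29.1),
]
alerts = [a for s, m, v in readings if (a := check_reading(s, m, v))]
for alert in alerts:
    print(alert)
```

    A visualization layer would map each alert back to a rack's physical position, which is what lets operators replace manual audits with a live picture of the floor.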

    Visualization is also expected to be a boon to the emerging field of Data Center Infrastructure Management (DCIM). The latest release of CommScope’s iTRACS platform contains the SimpleView interface that, according to tech journalist Jason Verge, provides for improved role-based management and enhanced privacy and security. The system enables a visual mechanism through which operators can monitor, manage and interact with physical infrastructure, and can even extend to colocation and multitenant environments via a secure-partitioning feature. As a web-based tool, SimpleView not only provides an easier means to gauge data and resource operations, but enables access from a variety of devices as well.

    Graphic visualization of complex data environments is likely to push the role of resource management away from the specialists and into the realm of mainstream data users, which itself presents a problem for the enterprise. How much control is enough? How will resource contention and other conflicts be handled when every knowledge worker can optimize their own data environment?

    In this light, it is important to remember that visualization is simply another tool in the management box, and that its true value will come from the way it is implemented within the emerging dynamic data ecosystem.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
