Turning Big Data into a Better Data Center

Arthur Cole

In a digital version of “physician, heal thyself,” some large data-driven organizations are applying Big Data and other advanced analytics to their own operations, namely, driving greater efficiency and performance in the data center.

It only makes sense, after all, that a construct as complicated as a virtual, dynamic data environment would need all the help it can get, not only to provide an accurate picture of what is going on amid the myriad boxes and wires, but also to determine how best to improve things.

Google, for example, is turning toward advanced machine intelligence at some of its largest facilities with an eye toward fulfilling the twin goals of greater performance and lower energy consumption. Through the use of neural networks and advanced analytics, the company says it is well on the way to the kind of predictive functionality that absorbs everything from IT loads and pump speeds to cooling metrics and hundreds of other data points. With advanced modeling, the company says it can calculate the expected PUE of a properly equipped facility with 99.6 percent accuracy.
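To make the idea concrete, here is a deliberately tiny sketch of that kind of predictive modeling. It is not Google's system: their production models are neural networks trained on hundreds of real telemetry inputs, while this toy fits just two hypothetical inputs (IT load fraction and outside air temperature) to a synthetic PUE signal using plain gradient descent, with no external libraries.

```python
# Toy predictive-PUE model. Illustrative only: the inputs, the assumed
# linear relationship, and the synth_pue() function are all hypothetical
# stand-ins for real facility telemetry.
import random

random.seed(0)

def synth_pue(it_load, outside_temp_c):
    # Assumed "ground truth" for the demo: PUE rises with IT load
    # and (mildly) with outside temperature.
    return 1.10 + 0.25 * it_load + 0.004 * outside_temp_c

# Synthetic telemetry: 200 observations of (load fraction, temperature).
samples = [(random.uniform(0.3, 1.0), random.uniform(5.0, 35.0))
           for _ in range(200)]
# Scale temperature into [0, 1] so gradient descent behaves well.
data = [((load, temp / 35.0), synth_pue(load, temp))
        for load, temp in samples]

# Fit pue ≈ w0 + w1*load + w2*temp_scaled by batch gradient descent.
w = [0.0, 0.0, 0.0]
lr = 0.5
n = len(data)
for _ in range(20000):
    g = [0.0, 0.0, 0.0]
    for (load, temp_s), y in data:
        err = (w[0] + w[1] * load + w[2] * temp_s) - y
        g[0] += err
        g[1] += err * load
        g[2] += err * temp_s
    w = [w[i] - lr * g[i] / n for i in range(3)]

# Predict PUE for a facility at 80% IT load on a 20 C day.
pred = w[0] + w[1] * 0.8 + w[2] * (20.0 / 35.0)
print(round(pred, 2))  # → 1.38, matching synth_pue(0.8, 20.0)
```

The point of the exercise is the workflow, not the model: continuously ingest telemetry, fit a predictor, and compare predicted against measured PUE to flag facilities drifting off their efficiency baseline.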

At the same time, the U.S. General Services Administration (GSA), the agency responsible for a wide range of government operations, is using advanced analytics to pinpoint savings in data facilities across the nation. The Green Button program has already placed usage meters at 100 or so GSA buildings and plans to track 10 million or more data points within the next few years. The program uses analytics software from FirstFuel and Schneider Electric to calculate energy usage and encourage occupants to be mindful of their consumption. So far, the program has identified some $16 million in savings.

For the private sector, startups like CloudPhysics offer ready-made platforms designed to analyze traffic patterns and other data to make smart decisions regarding infrastructure and cloud deployments. The platform provides a validation tool aimed at simulating environments to gauge their performance under real-world conditions, a hazard prevention system that features early warning capabilities, and a Knowledge Base search function that analyzes posted user reports. It also provides an efficiency module that seeks to optimize performance and resource utilization across disparate infrastructure.

Leading Data Center Infrastructure Management (DCIM) platforms are turning toward Big Data as well. The aim, according to CA Technologies’ Dhesi Ananchaperumal, is to provide a more holistic approach to the data/energy conundrum so that gains on one side do not come at the expense of the other. Key capabilities include real-time data gathering and consistency within the analytics modules, ensuring that decisions are based not only on real-world conditions but also on accurate comparisons of dispersed data environments. The last thing you want is a vast information superhighway at your disposal while the data needed to manage and maintain it sits stuck in a back lane somewhere.

In this way, data becomes the driver of the data environment. This shouldn’t come as a big surprise, given that Big Data analytics prides itself on its ability to turn complexity into simplicity. With modular, hyperscale infrastructure ready to take on the brunt of the steadily increasing worldwide data load, only a highly sophisticated analytics stack will prevent the environment from collapsing in upon itself.
