Zombie Servers Still Haunt the Data Center


    In an age when virtualization, the cloud and mobile computing increasingly define the enterprise data environment, it’s important to take a step back every once in a while and focus on the fundamentals of IT operations.

    A key element that is still overlooked at many organizations is server utilization. While much of the focus recently has been on upping the CPU usage on individual servers through virtualization and dynamic load balancing, far less attention has gone to identifying and decommissioning servers that have gone idle.

    These so-called “zombie” or “comatose” servers have become increasingly prevalent in the virtual era. Line-of-business managers have grown accustomed to spinning up their own resources for finite periods of time, but often find the decommissioning process time-consuming and not particularly germane to their objectives. Once decommissioning is skipped, the server simply hums along to no purpose, and it is exceptionally difficult to identify and manage without the right software.
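The kind of detection such software performs can be sketched in a few lines. The following is a minimal, illustrative heuristic only: the inventory records, the 5 percent CPU threshold, and the 90-day idle window are assumptions for the example, not figures from the article or any particular product.

```python
# Minimal sketch: flag servers whose average CPU utilization has stayed
# below a threshold and that have shown no activity for a long window.
# Thresholds and record fields are illustrative assumptions.
from datetime import datetime, timedelta

CPU_THRESHOLD = 5.0               # avg CPU percent below which a server looks idle
IDLE_WINDOW = timedelta(days=90)  # how long it must stay inactive to be flagged

def find_comatose(servers, now):
    """Return names of servers that look comatose under the heuristic above."""
    flagged = []
    for s in servers:
        idle_for = now - s["last_active"]
        if s["avg_cpu_pct"] < CPU_THRESHOLD and idle_for > IDLE_WINDOW:
            flagged.append(s["name"])
    return flagged

if __name__ == "__main__":
    now = datetime(2016, 6, 1)
    inventory = [
        {"name": "app-01",   "avg_cpu_pct": 42.0, "last_active": now - timedelta(days=1)},
        {"name": "batch-07", "avg_cpu_pct": 1.2,  "last_active": now - timedelta(days=200)},
    ]
    print(find_comatose(inventory, now))  # only batch-07 matches both conditions
```

In practice the inputs would come from a monitoring system rather than a hand-built list, and a real tool would weigh network and disk activity as well, since a server can be busy without burning CPU.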

    This isn’t a minor problem at most enterprises, says BaseLayer’s Susanna Kass. A recent McKinsey & Co. report estimated that up to 30 percent of servers across the data center landscape (about 10 million worldwide) are comatose, generating costs in excess of $30 billion per year. These expenses are multi-faceted, ranging from the energy consumed by the server and its supporting systems to the capex burden on organizations that scale up data infrastructure without realizing the capacity they seek is already in place.

    Comatose servers also represent a security threat, particularly in the era of Big Data and the IoT, says BMC’s Bill Berutti. An idle server provides the perfect entry point for malicious code, often because it is excluded from software patches and security updates. As advanced analytics start to stress real-time performance, systems monitoring and maintenance will have to jump to the same level, but this is difficult to do if the enterprise does not have a clear understanding of its infrastructure and operations.

    Somewhat ironically, the cloud is emerging as a key asset in improving visibility across the entire data ecosystem, says InformationWeek’s Charles Babcock. Increasingly, systems management software is shifting from traditional licensing models to SaaS, which helps organizations manage costs better and provides a smoother upgrade path as new services and capabilities are introduced. According to strategic advisors from Parthenon-EY of Ernst & Young LLP, between 5 percent and 15 percent of management solutions are currently delivered by SaaS, but that figure is expected to rise to 40 percent in short order.

    However, technology alone will not be enough to combat the problem of comatose servers, says Julian Kudritzki, COO of the Uptime Institute. What’s needed is a new organizational chart that spreads responsibility for resource consumption across multiple stakeholders. All too often, the decision to stand up a particular server is made without IT’s involvement, which means it continues to run in plain sight of technicians long after the application it supported has fallen out of use. By integrating IT more closely with business processes, organizations stand to recover millions, if not billions, in wasted spending.

    Whether it’s in the home, the office or the government, waste is never good. With the data center industry still under pressure to reduce energy consumption while increasing both service levels and revenue, the need to identify and reclaim non-performing assets is becoming critical.

    A single virtual server sitting idle is of little consequence, but if left unchecked, the process that led to its abandonment can put quite a damper on the ability to grow and innovate in a digital economy.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
