
    The Virtual Journey Is Not Over Yet


    Virtualization may be as common in the data center as seashells at the beach, but that ubiquity doesn’t free IT from worrying about proper management and performance.

    Resource contention remains a key problem: too many VMs competing for too few physical resources leads to delayed access and poor overall performance. New tools and techniques continue to emerge to deal with the problem, but at heart it can be solved only by proper management; data center executives need to remain fully aware of what’s going on within their facilities.

    This is part of the reason why companies like flash specialist SolidFire are seeing increased activity in the enterprise sector. Resource contention has long been a concern among cloud providers, which are eager to prevent one client from interfering with the service of another; lately that same ethos has taken hold in the enterprise, hence the need for more flash storage capacity and the ability to manage storage I/O effectively. As the company’s marketing VP Jay Prassl explained to The Register, internal business units need nannying too, so that one unit’s bid to boost its own productivity doesn’t inhibit everyone else’s.

    Running large databases on virtual infrastructure can often lead to resource contention as well. A recent survey by SolarWinds indicated that more than three-quarters of enterprises have virtualized their databases over hybrid virtual/physical infrastructure. Often, the impact of competing workloads on available resources is not evident until users start complaining. Admins can get ahead of the problem by fostering closer collaboration between database and infrastructure teams, and by deploying query performance, response time and other metrics that measure database output rather than system load.
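
    That output-first approach can be as simple as timing the queries themselves. Below is a minimal sketch in Python; sqlite3 stands in for whatever production database is actually under load, and the table and query are purely illustrative.

        import sqlite3
        import time

        def timed_query(conn, sql, params=()):
            """Run a query and return (rows, elapsed_seconds)."""
            start = time.perf_counter()
            rows = conn.execute(sql, params).fetchall()
            return rows, time.perf_counter() - start

        # Illustrative stand-in: an in-memory database with sample rows.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)",
                         [(i, i * 9.99) for i in range(1000)])

        rows, elapsed = timed_query(conn, "SELECT COUNT(*), AVG(total) FROM orders")
        # Response time is the metric users feel, regardless of system load.
        print(f"result={rows[0]} response_time={elapsed * 1000:.2f} ms")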

    Still, a key element of the resource contention conundrum is the imbalance between virtual workloads and physical resources. Too much virtualization in the server farm leads to I/O conflicts in the storage array. The solution, of course, is to extend virtualization across the data infrastructure, primarily into storage and networking, so that resource flexibility is evenly distributed across the entire environment. SanDisk’s Hemant Gaidhani noted recently that enterprises relying on software-defined storage and networking gain the ability to provision the entire data chain, not just the server component. Resources of all types can then be provisioned, utilized and returned to the available pool at a rapid rate, with little to no hands-on involvement from administrators.
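
    The provision-use-return cycle Gaidhani describes can be modeled in a few lines. The sketch below is a toy illustration only; the pool size, workload names and GB accounting are assumptions for the example, not any vendor’s API.

        from dataclasses import dataclass, field

        @dataclass
        class ResourcePool:
            """Toy model of a software-defined pool: provision, use, return."""
            capacity_gb: int
            allocations: dict = field(default_factory=dict)

            @property
            def available_gb(self) -> int:
                return self.capacity_gb - sum(self.allocations.values())

            def provision(self, workload: str, size_gb: int) -> None:
                # Contention shows up here: requests beyond free capacity fail.
                if size_gb > self.available_gb:
                    raise RuntimeError(f"contention: only {self.available_gb} GB free")
                self.allocations[workload] = size_gb

            def release(self, workload: str) -> None:
                # Returning capacity to the pool keeps utilization fluid.
                self.allocations.pop(workload, None)

        pool = ResourcePool(capacity_gb=1024)
        pool.provision("analytics-db", 400)   # hypothetical workloads
        pool.provision("web-tier", 200)
        pool.release("analytics-db")          # capacity goes straight back
        print(pool.available_gb)              # 824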

    The need to provide adequate resources is also pushing the enterprise to deploy physical infrastructure as quickly and cheaply as possible. That is largely why container-based solutions are gaining such a strong following in the data center, according to Unisys’ Colin Lacey. The idea is to build a secure set of resources that can be brought online in short order, gaining the ability not just to scale infrastructure but to consolidate data footprints as well. Many organizations are also finding that they can migrate workloads from Unix to Linux while retaining many of the security and performance characteristics of their proprietary systems.
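
    As a rough illustration of how quickly a resource-capped environment can be brought online, the Python sketch below shells out to Docker, used here purely as a generic stand-in for container tooling (the article does not name a platform); the image name and resource limits are assumptions.

        import subprocess

        # Launch a short-lived, resource-capped container in seconds.
        # Docker is a stand-in here; image and limits are illustrative.
        result = subprocess.run(
            [
                "docker", "run", "--rm",
                "--memory", "256m",   # cap RAM so one workload can't starve others
                "--cpus", "0.5",      # cap CPU share for the same reason
                "alpine", "echo", "container online",
            ],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout.strip())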

    Server virtualization was indeed a revolutionary development in the modern data center era, but it was only a partial solution. Now that processing can be distributed across a range of software-defined resources, it’s time to bring the rest of the data center up to speed.

    But don’t be fooled into thinking this is just a matter of deploying the right technologies and building all-virtual architectures. Making it happen will take a fair amount of management prowess, along with new skills among data admins and the knowledge workforce in general.

    Eventually, though, we’ll look back and wonder how we ever got along without it.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
