    Virtualization Hits the Enterprise Comfort Zone

    Virtualization is well into its second decade in the modern enterprise era (it actually had a previous life in the earlier mainframe days), and it has now become so ubiquitous that it is getting hard to find server infrastructure that has not been virtualized.

    For Gartner, this is a clear sign that the technology has matured, and may even reach its peak this year with an estimated market value of about $5.6 billion. While this is an increase of 5.7 percent over 2015, the firm notes that new software licenses have posted their first decline in more than a decade and growth is now led by maintenance revenue. In other words, it looks like the industry will sustain itself going forward by shoring up its installed base rather than finding new deployments.

    But as with people, being labeled “mature” is not always a compliment. And since virtualization is practically synonymous with VMware, the company has taken great pains to prove that its best days are still ahead even though its signature technology is not the growth engine it once was. After all, the server is not the only piece of hardware that is ripe for virtualization, and first-quarter deployments of the company’s virtual networking platform, NSX, doubled over the same period last year. While NSX still represents a tiny fraction of VMware’s total revenue, it nevertheless suggests that the enterprise is eager to push virtualization well beyond the server farm as it struggles to build the kind of abstract data architectures needed to compete in an increasingly service-based economy.

    That is, of course, unless something better comes along. As ServerWatch’s Pedro Hernandez points out, both cloud computing and containers stand a chance of undermining further deployment of straight-up virtual architectures even though both can and do fit comfortably within virtualized infrastructure. According to application delivery provider NGINX, 20 percent of enterprises are using containers for production workloads, with a third of those already converting the bulk of their applications to container-based architectures. As more and more organizations gravitate toward full software-defined data centers (SDDCs), traditional virtualization will continue to be challenged by newer developments that excel at consolidation, flexibility, scalability and other critical attributes.

    Virtualization has always been about freeing the data environment from underlying hardware, and this has had a profound effect on the way we view IT infrastructure. But it has never been a complete win for the enterprise, because the data itself has remained locked in a one-to-one relationship with individual users: if multiple users want simultaneous access to the same data set, it has to be copied and stored somewhere.

    This is where the concept of data virtualization, or more specifically copy data virtualization (CDV), comes in. As explained by Ash Ashutosh, CEO of data management firm Actifio, CDV enables the creation of a single “golden master” version of data, which can then be spun into virtual copies of production-quality data for everyone on the organizational chart. Not only does this streamline workflows and speed up production cycles, it also cuts down on storage hardware and storage management costs, both of which are crucial considerations as the enterprise takes on the challenges of Big Data and the IoT.
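
    Actifio’s actual platform operates at the storage layer and is proprietary, but the copy-on-write idea behind a “golden master” is easy to sketch. The toy Python below is purely illustrative (the GoldenMaster and VirtualCopy names are hypothetical, not any vendor’s API): each virtual copy shares the master’s blocks and stores only the blocks it changes, so a dev or QA team gets a writable, production-quality view at almost no extra storage cost.

        # Minimal sketch of the copy data virtualization idea: one golden
        # master, many lightweight copy-on-write views. Illustrative only;
        # not Actifio's implementation.

        class GoldenMaster:
            """The single authoritative copy of the data, stored once."""
            def __init__(self, blocks):
                self.blocks = dict(enumerate(blocks))

            def snapshot(self):
                # A new virtual copy costs almost nothing: no data moves.
                return VirtualCopy(self)

        class VirtualCopy:
            """A writable view that shares the master's blocks until modified."""
            def __init__(self, master):
                self.master = master
                self.overrides = {}  # only changed blocks are stored here

            def read(self, block_id):
                # Serve local changes first; fall back to the shared master.
                return self.overrides.get(block_id, self.master.blocks[block_id])

            def write(self, block_id, data):
                # Copy-on-write: the golden master is never touched.
                self.overrides[block_id] = data

        master = GoldenMaster(["jan-sales", "feb-sales", "mar-sales"])
        dev_copy = master.snapshot()   # for the dev team
        qa_copy = master.snapshot()    # for QA, at near-zero storage cost

        dev_copy.write(1, "feb-sales-masked")
        print(dev_copy.read(1))        # feb-sales-masked
        print(qa_copy.read(1))         # feb-sales (still shared, unchanged)

    Production systems apply the same trick at the block-storage or snapshot level, which is why dozens of virtual copies can be provisioned in minutes without multiplying the storage footprint.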

    Clearly, then, we have a long way to go before we can declare the death of virtualization. Heck, we haven’t even seen the death of the mainframe yet. But as a mature technology, virtualization is unlikely to grab headlines the way it once did. This is not a bad thing, of course, because it means the technology has evolved from a risky, uncertain proposition to an accepted facet of enterprise infrastructure.

    The question going forward, though, is this: do you build advanced application architectures on the traditional, familiar virtual layer, or do you rely on something new that may or may not reach the same level of maturity someday?

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
