The difference between a manageable problem and a crisis is often one of anticipation. The sooner you know something is coming at you, the easier it is to take steps to avoid it or at least minimize its impact.
In that way, virtualization’s influence on the enterprise hardware market, while profound, is manageable, at least for the manufacturers that are most affected by it.
Witness the rapidly changing strategies of the top enterprise platform providers: HP, IBM and Dell. All three are quickly turning production responsibilities for servers and other hardware over to third-party manufacturers while they concentrate increasingly on software, services and advanced cloud architectures. This is not merely a question of survival in a cut-throat market, but an acknowledgment that the new data environment will be radically different from the old, and that those who fail to roll with the changes will be powerless to influence them as they unfold.
But it’s not just hardware makers that are feeling the pinch. VMware, long-time occupier of virtualization’s vanguard, is facing near-saturation of the server market, with some 70 percent of suitable workloads expected to reside in virtual environments by year’s end. That means the company’s flagship vSphere platform could be in for steadily diminishing returns, even as rivals like Hyper-V gather more followers in the enterprise.
Small wonder, then, that the company is diversifying its portfolio with advanced techniques like software-defined networking (SDN) and sophisticated automation technologies designed to bridge the physical and logical differences between traditional enterprise architectures and the cloud. The latest move in this direction comes in the form of a $30 million investment in Puppet Labs, developer of an open-source automation and configuration management stack already in use at Twitter, Disney, NYSE, Cisco and other top-tier enterprises.
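To make the configuration-management piece of that strategy concrete: the core idea behind a stack like Puppet is declarative, idempotent resource management. You describe the state a system should be in, and the tool converges it toward that state, doing nothing on subsequent runs if that state already holds. The short Python sketch below illustrates the model in miniature; the FileResource class and the /tmp/motd path are hypothetical illustrations, not Puppet's actual API.

    # A minimal sketch of the declarative, idempotent model behind
    # configuration-management tools like Puppet. The resource type and
    # path here are illustrative only -- this is not Puppet's real API.
    import os

    class FileResource:
        """Declares that a file should exist with specific contents."""

        def __init__(self, path: str, content: str):
            self.path = path
            self.content = content

        def in_desired_state(self) -> bool:
            # The resource is satisfied if the file already holds the content.
            if not os.path.exists(self.path):
                return False
            with open(self.path) as f:
                return f.read() == self.content

        def converge(self) -> None:
            # Apply a change only when the current state diverges, so
            # running this twice in a row changes nothing the second time.
            if self.in_desired_state():
                print(f"unchanged: {self.path}")
                return
            with open(self.path, "w") as f:
                f.write(self.content)
            print(f"converged: {self.path}")

    if __name__ == "__main__":
        # Declare the desired state once; apply it repeatedly and safely.
        motd = FileResource("/tmp/motd", "Welcome to the managed host.\n")
        motd.converge()  # writes the file on the first run
        motd.converge()  # no-op on the second run: idempotence

The payoff of this model is that the same declaration can be applied repeatedly, across thousands of machines, without accumulating side effects, which is precisely what makes it attractive for bridging traditional architectures and the cloud.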
But while the changes taking place in enterprise infrastructure are certainly real, DataCore CEO George Teixeira cautions against marketing pitches that capitalize on trend words like virtualization and SDN while conforming to the same old hardware/software paradigm. One way to spot this shell game: the software magic only works if you purchase new hardware to go with it. In fact, it should be obvious by now that software architectures will endure regardless of changes to the hardware underneath, even in mission-critical environments, so there is no longer any need to link the two. Software-defined architectures are fine, but not if they lock you into someone’s hardware.
This isn’t to say that hardware is of little consequence, however. Concerns like power consumption, streamlined architecture and system reliability will still be the primary drivers behind hardware deployments. But those concerns will be largely separate from the higher-order service and productivity questions that will drive much of the business activity going forward.
Both layers are vital to the overall health of the data environment, even if they are not so intertwined as they once were.