Ever since the first virtual server went into production more than a dozen years ago, speculation has been rampant that enterprise hardware is doomed. But even though it is clear (to me, at least) that hardware will still play a vital role in the enterprise going forward, that role is changing. So the question for enterprise executives is not whether to give up on hardware altogether, but what functionality hardware should provide and how to deliver that functionality at the lowest price point.
To some, developments like the cloud represent a threat not just to enterprise hardware, but to software as well. Last year closed out with some pretty stark reports indicating that money spent in the cloud translates directly into diminished revenue from enterprise users. The message to IT vendors is clear: Adapt to the new cloud reality, and fast, or face obsolescence within two years.
It is important to note, however, that many of those warnings come from the nascent cloud industry, which would love nothing better than to supplant longstanding tech giants like Cisco and HP as IT makes the transition from static infrastructure to new software-defined architectures. This is not outside the realm of possibility, mind you, as even IBM’s pending sale of its low-end server business shows. On the one hand, it makes sense to spin off high-volume, low-margin product lines to more nimble manufacturers, but on the other, is it wise to shed commodity devices precisely at a time when dumb hardware is likely to become dominant?
It seems, then, that hardware’s role as the key enabler of higher-order enterprise functions is drawing to a close. In the future, servers in particular will become the blank canvas on which software artists create their masterpieces. This puts companies like Quanta in an enviable position, first by getting in on the ground floor of the hyperscale data center movement led by companies like Facebook, and now as a lead supporter of the Open Compute Project, which seeks to bring hyperscale architectures to the rest of the enterprise community. The latest development comes courtesy of Fusion-io, which will incorporate its flash storage modules into the Quanta Rackgo X platform to provide a broadly scalable, modular solution that can serve as the base for new white-box infrastructure.
Still, it would be a mistake to write off traditional enterprise manufacturers from this emerging market completely. HP, for one, has taken its share of lumps lately as it struggles to maintain existing product lines like PCs and top-end enterprise systems in the new virtual/cloud era. But it at least has a white-box-style solution for hyperscale infrastructure built into its Project Moonshot program. True, Moonshot represents a single-vendor solution at a time when many are questioning the wisdom of that approach, but if the company can show a steady record of rapid hyperscale deployment without the often lengthy and complicated integration issues that have plagued traditional open platforms, it could very well cap off its recent nosedive with a spectacular save.
Still, the enterprise executive suite should care less about the fortunes of suppliers and more about the future of infrastructure. And in that vein, it seems that commodity hardware will make up most, but not all, of the foundation for agile cloud-based infrastructure. Critical data and applications may continue to reside on customized platforms for a while longer, at least until economies of scale price those systems out of the general enterprise market.
Turnabout is fair play, after all, and there is some poetic justice in the fact that, after so many years of following vendor leads, the enterprise is in a position to start calling the shots now.