Do you remember the car you were driving 20 years ago? How about the TV set you watched? These and other products were perfectly suited to that era, and with the proper upkeep would probably be fully functional today – although by now you likely would have moved on to newer, better things.
So why do we continue to populate our data centers with seriously aging technology, particularly now that we are on the cusp of a brave new computing world?
According to a recent survey by Brocade, a good number of facilities operate with technologies and architectural designs that date back 20 years or more. While much of that infrastructure has been, or is being, revamped with virtualization and other techniques, much of the underlying hardware and software is simply not up to the task of handling the diverse, dynamic data loads of a mobile, software-defined data ecosystem. Clearly, the data center is in need of substantial modernization, and sooner rather than later.
Fortunately, with modular platforms and converged infrastructure, it is relatively easy to field new infrastructure that is designed from the ground up for software-defined architectures, Big Data and many of the other data requirements confronting the enterprise. I say ‘relatively easy’ because, while it is no walk in the park, it is certainly less confounding than a rebuild of an entire old-style data center.
Indeed, as noted in recent research from QuinStreet Enterprise, a thorough modernization policy will provide not only improved data performance, but better security, greater uptime and availability, and a vastly streamlined and cost-effective data infrastructure. Of course, “modernization” is a broad term, and as the survey notes, enterprise executives use it to describe everything from straight-up virtualization and Flash storage to converged infrastructure and advanced analytical tools.
But while the process may be different for everyone, the goal should be the same, according to HP’s John Bennett, and that is to embrace the “new style” of IT. This is a world in which applications and data are no longer tucked safely behind the firewall, with IT in absolute command of all things data-related. The new data environment needs to be adaptable to change and provide a high degree of autonomy, so users can achieve their goals quickly without worrying about where resources are based or how their data needs are being met. Modernization, then, requires not just new infrastructure, but an entirely new operating framework that incorporates internal, external and hybrid resources.
At the same time, however, don’t overlook the fact that your business model, not technology, is now the primary driver for the data environment you choose to build, says tech consultant Bill Kleyman. Under the old rules, business functions were limited to what the data center could provide. Nowadays, architectures can be built, torn down and rebuilt to suit key objectives, so poor data support is no longer a valid excuse for failure. Do you need massive scalability to suit diverse client devices? There’s an architecture for that. How about massive number crunching for Big Data analytics? There’s an architecture for that, too. A modern, software-defined data environment should be able to accommodate all of these needs and more, but you have to get started now.
To be sure, there is no need to panic. Modernization across the enterprise industry will happen as budgets and capabilities allow, most likely following the standard bell curve: relatively few early adopters at first, then a rapid upswing into the general population, and finally the Luddites.
But while it is fair to say that modernization is a continual process, the next five years or so will likely produce the most profound changes in the data center since the advent of client-server architecture. And the final results will depend very much on the foundations being laid today.