Bigger and faster. Those two words will be the running theme for storage infrastructure in the coming years as the enterprise steps up to the demands of Big Data, collaboration and other advancing initiatives.
But while these two goals are relatively clear-cut, exactly how to accomplish them remains unsettled, and the biggest question is where the bulk of storage infrastructure should reside: on premises or in the public cloud?
The ramifications of these decisions are already playing out in the vendor community in the form of continued consolidation. Following the much-publicized merger between Dell and EMC late last year, NetApp announced an $870 million purchase of all-flash array developer SolidFire. Not only does this bring a scale-out, all-flash array into NetApp’s portfolio, it also adds advanced data management and software-defined storage capabilities that will allow the company to compete more effectively for the highly dynamic data architectures taking hold in the enterprise and in the cloud. NetApp is already targeting web-scale applications like Hadoop and the rising tide of DevOps practices that are poised to remake IT architecture.
Once the Internet of Things kicks into high gear, of course, demand for highly scalable storage is expected to skyrocket, and that is the point at which the enterprise will have to confront the in-house/external debate. Top providers like Amazon, Microsoft and Google are still racing to the bottom on pricing, says The Register’s Chris Mellor, but can they deliver the speed and availability that come with local storage infrastructure? In all likelihood, no single solution will meet all of the increasingly specialized requirements of emerging workloads, so storage infrastructure will likely advance on a number of fronts in the coming years: everything from bulk cloud services to hybrid arrays and on-server memory architectures.
When charting the future of the enterprise, however, it helps to look at the current state of high-performance computing (HPC) applications like genetic sequencing. On that score, top cloud providers like Google and AWS are upping their game in storage, I/O and networking, which are becoming increasingly important as big research facilities gravitate toward sharing and collaboration rather than raw compute power, according to George Washington University senior fellow Kalev Leetaru. Not only can they provide resources at lower cost (hosting a complete human genome can run as little as $3 per month these days), but they are introducing a wide range of specialized services, such as cutting-edge security and HIPAA compliance. And services like the Internet Archive can ensure that these data sets remain available almost indefinitely.
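The economics behind that figure are easy to sanity-check. Here is a minimal back-of-envelope sketch, assuming a roughly 120 GB compressed whole-genome file and representative object-storage prices; both the file size and the per-GB rates are illustrative assumptions, not quotes from any specific provider.

```python
# Rough cost check for keeping one sequenced genome in cloud object storage.
# All figures below are illustrative assumptions, not vendor list prices.

genome_size_gb = 120.0            # assumed size of a compressed whole-genome file
standard_price_per_gb_month = 0.023   # assumed standard-tier price, USD per GB-month
cold_price_per_gb_month = 0.004       # assumed archival/cold-tier price, USD per GB-month

standard_cost = genome_size_gb * standard_price_per_gb_month
cold_cost = genome_size_gb * cold_price_per_gb_month

print(f"Standard tier: ~${standard_cost:.2f} per month")  # ~ $2.76
print(f"Cold tier:     ~${cold_cost:.2f} per month")      # ~ $0.48
```

Even with generous assumptions about file size, the monthly bill lands in the low single digits of dollars, which is why bulk hosting of research data in the public cloud has become an easy sell.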
To the standard enterprise, this may seem like overkill – until you realize that web-scale infrastructure is likely to take hold in organizations of all sizes due to its high efficiency and flexible design. As Information Age’s Chloe Green points out, meeting the demands of Big Data and the IoT will be a budgetary challenge more than anything else, and legacy architectures will simply become too expensive and too difficult to manage once web-scale workloads ramp up. Gartner, for one, predicts that by 2017 half of the global enterprise market will have adopted web-scale architectures as a means of cost-effective growth management.
Whether you’re talking about storage, networking or processing, scale is not just about size. There must be enough resources to handle the load, of course, but they must also be provisioned in a coordinated, automated fashion. The end game, after all, is to support business processes as efficiently and effectively as possible while keeping up with the increasingly rapid pace of the emerging digital economy.
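To make "coordinated and automated" concrete, consider the kind of policy loop that web-scale operations lean on: software reconciles provisioned capacity against measured demand instead of waiting on a ticket queue. The sketch below is purely illustrative; the headroom figure, step size and the notion of calling a platform's expand API are assumptions, not any vendor's actual interface.

```python
# Illustrative policy loop: keep provisioned capacity a fixed headroom above usage.
# The expansion step stands in for whatever array, SDS or cloud API is in use.

def required_capacity(used_tb: float, headroom: float = 0.3) -> float:
    """Capacity target: current usage plus a configurable headroom fraction."""
    return used_tb * (1.0 + headroom)

def reconcile(used_tb: float, provisioned_tb: float, step_tb: float = 10.0) -> float:
    """Grow provisioned capacity in fixed increments until the target is met."""
    target = required_capacity(used_tb)
    while provisioned_tb < target:
        provisioned_tb += step_tb  # hypothetical: invoke the platform's provisioning API here
    return provisioned_tb

if __name__ == "__main__":
    # 82 TB in use against 90 TB provisioned triggers growth to 110 TB.
    print(reconcile(used_tb=82.0, provisioned_tb=90.0))
```

The point is not the specific numbers but the pattern: capacity decisions become a policy that runs continuously, which is what lets a small team manage web-scale volumes of data.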
Scaling storage infrastructure up and out with a mix of on-premises and public cloud solutions will not be simple, but it is the best way to ensure that the right data solution is delivered on time and on budget.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.