Enterprises Challenged by Storage Diversity

Arthur Cole

Despite the tremendous gains it has made over the past decade, storage is still lagging behind its compute and networking counterparts in terms of speed and performance.

This isn’t an indictment of storage itself, mind you, as technologies like Flash and other forms of solid-state infrastructure have done wonders for both speed and throughput in advanced enterprise settings. Rather, it is in the support infrastructure surrounding physical storage where most of the bottlenecks remain.

Latency in the storage farm, in fact, is increasingly seen as an impediment to many higher-order data center functions, such as virtualization and cloud computing. According to a recent survey from PernixData, a vendor of server-side Flash solutions, about half of respondents said storage performance is a higher priority than additional capacity, while only 21 percent cited capacity as a priority. The survey also found upwards of 70 percent of respondents considering storage acceleration software to help boost performance. A key driver of this performance shortfall continues to be the proliferation of virtual machines, which tends to flood storage infrastructure with more requests than it can handle.

The good news is that Flash storage itself is coming down in price to the point where even mid-level enterprises can deploy it without too much trouble. SolidFire, for example, offers a four-node entry-level array, the SF2405, for less than $100,000. The device features forty 240GB SSDs spread across four 1U nodes and is capable of providing upwards of 200,000 IOPS across an effective capacity (raw capacity boosted by dedupe and compression) of 35TB. The company is pitching the setup as an accelerated storage solution for key high-speed applications that also lays the foundation for private clouds.
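A rough sketch of the arithmetic behind those figures (assuming the quoted effective capacity is 35TB, which would imply a data-reduction ratio of roughly 3.6x over the 9.6TB of raw flash): effective capacity on an all-flash array is simply raw capacity multiplied by the dedupe/compression ratio the workload achieves.

```python
# Illustrative sketch: effective capacity of an all-flash array is its raw
# capacity multiplied by an assumed data-reduction (dedupe + compression)
# ratio. The SF2405 drive figures come from the article; the reduction
# ratio is back-calculated and purely illustrative.

def effective_capacity_tb(drive_count, drive_gb, reduction_ratio):
    """Raw capacity (decimal TB, as vendors quote it) scaled by a data-reduction ratio."""
    raw_tb = drive_count * drive_gb / 1000
    return raw_tb * reduction_ratio

raw_tb = 40 * 240 / 1000           # 9.6 TB raw across forty 240GB SSDs
ratio = 35 / raw_tb                # reduction ratio implied by a 35TB effective figure
print(f"raw: {raw_tb:.1f} TB, implied reduction: {ratio:.1f}x")
print(f"effective: {effective_capacity_tb(40, 240, ratio):.0f} TB")
```

Real-world reduction ratios vary widely by workload (databases dedupe far better than pre-compressed media), which is why vendors quote effective capacity as a range or an assumption rather than a guarantee.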

Of course, it’s hard to gauge the success or failure of any one storage technology now that systems and architectures are becoming software-defined. The problem is, there is no clear definition of “software-defined storage,” which gives vendors free rein to define it any way they choose – whether it comes in the form of virtual storage appliances, server-side controller software or advanced orchestration and data management. But as Enterprise Storage Forum’s Christine Taylor points out, different types of software-defined storage will produce different performance characteristics depending on the applications they encounter. For instance, some approaches will work well with diverse, dynamic data environments while others are more adept at unifying and consolidating sprawling storage infrastructure. The key indicator of a successful software-defined storage deployment is an overall improvement in storage quality of service.


Cost and complexity in the storage environment are two of the main reasons why many organizations are pushing more of their infrastructure onto the cloud, says Nasuni CEO Andres Rodriguez. Regardless of how you feel about cloud security and availability, it cannot be denied that owning and operating storage infrastructure is a tremendous burden that diverts valuable resources from business processes and the development of new markets and revenue streams. In the cloud, you get top-notch infrastructure and all the tools needed to manage data and resource consumption at a price point that is dramatically lower than traditional infrastructure. With enterprise data loads nearly doubling every two years, more and more organizations are starting to see the limitations of the typical three-year hardware refresh cycle.
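The refresh-cycle math behind that last claim can be sketched in a few lines: data that doubles every two years grows by a factor of 2^(3/2), or roughly 2.8x, over a single three-year hardware refresh cycle.

```python
# Sketch of the capacity-planning math behind the article's claim: if data
# nearly doubles every two years, each hardware refresh cycle must absorb
# several multiples of growth before the next upgrade arrives.

def growth_factor(years, doubling_period_years=2):
    """Multiplicative data growth over `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

for refresh in (3, 5):
    print(f"{refresh}-year refresh cycle -> {growth_factor(refresh):.1f}x data growth")
```

In other words, capacity bought on day one of a three-year cycle has to be sized for nearly triple the data by the time it is retired, which is a large part of the cloud's elasticity appeal.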

So in the end, what is the right approach to storage going forward? In all likelihood, the industry will pursue a hybrid cloud approach, with much of the in-house infrastructure evolving toward integrated, modular solutions that combine on-server Flash with mixed SSD/HDD storage environments. The bottlenecks are likely to remain in and around traditional storage systems, but the good news is that high-speed applications will find it increasingly easy to avoid them.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.



Oct 23, 2014 3:54 PM, Brian says:
It's not a matter of storage availability -- it's more a matter of convenience and service at this point. Will be interesting to see how that aspect of the tech industry evolves.
