No matter where or how data is captured and processed by the enterprise, the requirements for storage remain the same: Keep it safe, keep it accessible and keep it cheap.
While much of the storage load is heading toward the cloud, where costs are likely to remain historically low for the foreseeable future, many organizations are still eager to maintain their own storage infrastructure based on high-speed Flash systems. But all Flash is no longer created equal, and with many workloads imposing highly specific storage requirements, selecting any one Flash solution is not as simple as it used to be – and that assumes Flash is the appropriate choice in the first place.
The future of Flash in the enterprise could, in fact, become a make-or-break development for the storage industry, particularly for market leader EMC as it seeks a new existence as part of Dell. The company anticipates that by the end of this year, 40 percent of enterprises will have shifted to all-Flash for their primary storage operations. This means the company has a lot riding on its VMAX platform, which recently underwent a substantial software recode to make it more durable and more manageable as customers look to scale their storage infrastructure into the cloud.
Indeed, much of what people are calling the fourth generation of Flash has little to do with the medium itself and much to do with the software surrounding it, says Computer Weekly’s Bryan Betts. While initial iterations of Flash served as point solutions or simple replacements for hard disk infrastructure, the latest platforms are starting to deliver functionality that is not available on other forms of storage. Earlier systems, for example, avoided writing to disk as much as possible, preferring to cache data in memory to sidestep I/O latency and bottlenecks. Flash does away with this artifact, and as such can be applied more directly to high-speed analytics and transactional workloads.
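The architectural shift Betts describes can be illustrated with a toy sketch (the class names and flush threshold here are invented for illustration, not any vendor's design): a disk-era store buffers writes in RAM and flushes them in batches to hide slow I/O, while a flash-oriented store can afford to commit every write directly.

```python
# Illustrative sketch only: contrasts a disk-era write-back cache with
# the direct writes that low-latency flash makes practical.

class CachedStore:
    """Disk-era pattern: buffer writes in RAM, flush in batches
    to amortize the cost of slow, seek-bound disk I/O."""
    def __init__(self, flush_threshold=4):
        self.cache = {}          # pending writes held in memory
        self.backing = {}        # simulated slow disk
        self.flush_threshold = flush_threshold

    def write(self, key, value):
        self.cache[key] = value
        if len(self.cache) >= self.flush_threshold:
            self.flush()

    def flush(self):
        self.backing.update(self.cache)   # one batched, expensive I/O
        self.cache.clear()

    def read(self, key):
        # must check the cache first or risk serving stale data
        return self.cache.get(key, self.backing.get(key))


class FlashStore:
    """Flash-era pattern: low write latency makes it practical to
    commit each write immediately, simplifying consistency."""
    def __init__(self):
        self.backing = {}        # simulated flash medium

    def write(self, key, value):
        self.backing[key] = value   # durable on return, no flush step

    def read(self, key):
        return self.backing[key]
```

The trade-off shows up in the read path: the cached design must reconcile two copies of the data, while the flash design keeps one, which is part of why flash-native platforms suit transactional workloads.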
Flash is also moving ahead on some of its key deficiencies, namely endurance, says The Register’s Chris Mellor. A small Irish startup called NVMdurance says it can extend the life of SSDs using an online, autonomic controller that applies advanced algorithms and data navigation to determine optimal storage patterns and maximize NAND lifecycles. The system incorporates a high degree of machine learning so it can continuously adapt to changing performance characteristics as the modules progress through their lifecycles. The company says it can improve NAND endurance by a factor of 10, and has inked a supply deal with Altera to deploy the system in its FPGAs, meaning it will likely show up on an Intel chipset before long, considering Intel purchased Altera for a cool $16.7 billion late last year.
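NVMdurance's controller is proprietary, but the underlying principle of steering writes by observed wear can be sketched with a deliberately simplistic wear-leveler (the class name and erase-count heuristic are assumptions for illustration; real controllers tune NAND parameters with far more sophisticated, learned policies):

```python
# Minimal sketch of adaptive wear-leveling: route each erase/write
# cycle to the least-worn block so no single block hits its endurance
# limit early. This only shows the principle, not a real controller.

class WearLeveler:
    def __init__(self, num_blocks):
        self.erase_counts = [0] * num_blocks   # observed wear per block

    def pick_block(self):
        # choose the least-worn block to even out endurance
        return min(range(len(self.erase_counts)),
                   key=lambda b: self.erase_counts[b])

    def record_erase(self, block):
        self.erase_counts[block] += 1


wl = WearLeveler(num_blocks=4)
for _ in range(8):                 # eight erase cycles
    b = wl.pick_block()
    wl.record_erase(b)

print(wl.erase_counts)             # wear spreads evenly: [2, 2, 2, 2]
```

The adaptive element the article describes goes further: instead of a fixed rule like "pick the least-worn block," a learned policy updates its model of each module's behavior as the cells age.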
Of course, Intel is already experimenting with advanced memory solutions that have Enterprise Storage Forum’s Paul Rubens wondering if any investment in Flash technology will be obsolete in a few years. The company recently teamed up with Micron to develop something called 3D XPoint, which is said to be 1,000 times faster and longer-lasting than Flash and 10 times denser than DRAM. This leads some to question whether it will be better suited as a faster but more expensive alternative to NAND or a slower, cheaper but non-volatile replacement for DRAM. In all likelihood, the decision will come down to user requirements. Micron, for one, anticipates initial use cases to revolve around data analysis and warehousing, as well as online transaction processing and virtual infrastructure support.
It’s been clear for some time now that storage is no longer just a question of cost vs. capacity. Modern workloads require highly specialized functionality and architectural support in order to provide optimal user experiences. In many cases, decisions regarding the exact type and quantity of storage will be made by automated systems, with users simply concerning themselves with service levels.
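Service-level-driven placement of that kind might look something like the following sketch (the tier names and microsecond latency figures are invented placeholders, not real product parameters):

```python
# Hypothetical sketch: an automated policy maps a requested service
# level (maximum acceptable latency) onto a storage tier. The tiers
# and latency figures below are illustrative assumptions only.

TIERS = [
    ("nvme-flash",       100),      # fastest, most expensive
    ("sata-flash",       500),
    ("disk",          10_000),
    ("cloud-archive", 1_000_000),   # slowest, cheapest
]

def place(max_latency_us):
    """Return the cheapest tier that still meets the latency SLA.

    Tiers are listed fastest (most expensive) first, so the last
    one that satisfies the bound is the cheapest acceptable choice."""
    chosen = None
    for name, latency in TIERS:
        if latency <= max_latency_us:
            chosen = name          # keep scanning for a cheaper fit
    if chosen is None:
        raise ValueError("no tier can meet the requested SLA")
    return chosen

print(place(600))        # -> sata-flash
print(place(2_000_000))  # -> cloud-archive
```

The point of the article stands either way: the user states a service level, and software, not a storage administrator, decides which medium backs it.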
But while much of the bulk storage activity will take place on the cloud, the enterprise will likely reserve key storage infrastructure for itself to keep critical data close to the vest. On this tier, of course, the enterprise will need to consider carefully what type of storage media to deploy and how it should be supported by ancillary hardware and software.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.