The big news in storage circles for the past few years has been Flash technology. Now that it has been perfected to the point where it can withstand the rigors of enterprise-class workloads, Flash increases storage productivity by orders of magnitude, bringing storage speed and performance tantalizingly close to its processing and networking counterparts.
Cost-wise, however, Flash is still on the pricey side, which makes it less of a candidate for long-term, bulk storage applications. For those, we turn to disk and tape, which offer both low cost and reliability. Lately, however, a third solution has started to encroach on the relatively staid world of low-performance storage architectures: optical disks.
Sony and Panasonic recently made a big splash with the Archival Disc, a Blu-ray-based system that will range from 300 GB all the way to 1 TB in support of Big Data, cloud computing and a range of other applications. The chief advantage the Archival Disc has over traditional means of storage, however, isn’t cost or capacity, but longevity. The companies say the disks provide a 50-year lifespan and, unlike tape and disk drives, do not require special temperature or humidity controls. Ultimately, then, the format is expected to lower overall archiving costs even if the drives themselves are pricier.
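The longevity argument can be made concrete with some back-of-envelope arithmetic. The sketch below uses the capacity and lifespan figures quoted above; the 100 TB archive size and the 10-year tape refresh cycle are illustrative assumptions of mine, not figures from the article.

```python
import math

def media_count(archive_tb, media_capacity_gb):
    """Number of media units needed to hold an archive of archive_tb terabytes."""
    return math.ceil(archive_tb * 1000 / media_capacity_gb)

def migrations(retention_years, media_lifespan_years):
    """Full media migrations needed over the retention period
    (one fewer than the number of media-lifespan windows)."""
    return math.ceil(retention_years / media_lifespan_years) - 1

archive_tb = 100  # hypothetical 100 TB archive

print(media_count(archive_tb, 300))   # first-generation 300 GB Archival Discs -> 334
print(media_count(archive_tb, 1000))  # planned 1 TB discs -> 100
print(migrations(50, 50))             # 50-year optical media life -> 0 migrations
print(migrations(50, 10))             # tape at an assumed 10-year refresh -> 4
```

Even at a higher per-drive price, zero forced migrations over a 50-year retention window is where the claimed cost advantage would come from: each tape migration carries drive, labor and verification costs that optical would avoid.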
Archiving is more than just storage, however, so it’s worth a look at other aspects of a Blu-ray infrastructure to see if it provides a viable solution for the enterprise. According to Infostor’s Henry Newman, a big problem is the fact that Blu-ray only supports SATA or USB, neither of which is optimal for archival purposes. If you value features like end-to-end data protection and broad data search capabilities, it would be better to stick with traditional SAS and Fibre Channel solutions.
And as I mentioned a few weeks ago, traditional media is still seeing a healthy development cycle, with companies like Fujifilm pushing LTO technology up to 35 PB in formats that integrate smoothly into legacy NAS architectures. The company is also pushing new barium-ferrite (BaFe) structures that improve durability and capacity and produce better signal quality.
Still, optical storage is attracting the attention of some pretty creative minds in the IT industry, so it is entirely possible that lingering issues with networking, interfaces and the like will be resolved in short order. Frank Frankovsky, the leader of Facebook’s Open Compute Project, recently left the company to join a startup focused on optical storage technology. In fact, optical storage could become a prime component in the kinds of modular, commoditized infrastructure that the OCP is looking to perfect, provided key functions like read/write speeds and overall capacities can be improved.
The question for enterprise-class optical storage technology isn’t whether it has the right stuff to dominate the industry, but whether it offers enough value to add yet another tier to an already highly specialized storage hierarchy. Flash is clearly the solution for on-server or near-line architectures where speed and flexibility are top requirements. And the storage array will likely hold a mix of Flash and disk based on the needs of key applications like database support or various back-office functions.
Optical can certainly find a home in the enterprise, but its backers will need to illustrate key use cases that are uniquely suited to its strengths. And that will be a much harder task than simply perfecting the technology for modern-day production environments.