Easing Up on Archival Cost and Complexity


    Archiving has always been one of those functions that pulls the enterprise in two different directions. Increased data volumes, of course, require more storage capacity, but as data sits in the archives for longer periods of time, it loses its value. So in the end, the enterprise must devote more resources to constantly diminishing assets.

    Of course, this is the lifeblood of the archival management industry as numerous companies work up sophisticated algorithms and other tools to analyze data and then shift it from one set of resources to another based on its intrinsic value. The real purpose behind Big Data management, after all, is not to accommodate increasing volumes but to mine existing stores for gold and then store the rest at the lowest possible cost—or discard it altogether.
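The tier-shifting logic these vendors sell can get elaborate, but the core idea is simple enough to sketch. The Python snippet below is purely illustrative — the tier names and age thresholds are assumptions for the example, not any vendor's actual policy — and assigns data to a storage tier based on how long it has sat untouched:

```python
from datetime import datetime, timedelta

# Illustrative thresholds (assumptions, not a real product's policy):
# recently touched data stays on fast disk, colder data moves to
# cheaper disk, then tape, and data past retention is discarded.
TIERS = [
    (timedelta(days=30), "performance-disk"),
    (timedelta(days=365), "capacity-disk"),
    (timedelta(days=365 * 7), "tape-archive"),
]

def pick_tier(last_accessed: datetime, now: datetime) -> str:
    """Return the storage tier for an object based on its access age."""
    age = now - last_accessed
    for threshold, tier in TIERS:
        if age < threshold:
            return tier
    return "discard"  # past retention: no longer worth storing

now = datetime(2014, 1, 1)
print(pick_tier(datetime(2013, 12, 20), now))  # touched 12 days ago
print(pick_tier(datetime(2005, 1, 1), now))    # well past retention
```

Real archival managers weigh far more than age — access frequency, compliance holds, business value — but the shape of the decision is the same: classify, then migrate to the cheapest tier the classification allows.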

    Naturally, part of this process requires the development of low-cost media, such as tape, which offers stable, long-term storage for data that is accessed infrequently. Disk-based archiving is also gaining in popularity, although primarily within tiered solutions, given disk's relatively weak long-term reliability.

    Until now, perhaps. Sony and Panasonic have just introduced an optical storage platter that they claim has a life span of at least 50 years, which is nearly twice as long as even the most advanced linear tape-open (LTO) technology. Initial versions will offer capacities of 300GB, although the plan is to eventually field 500GB and 1TB versions that use multi-level recording and other techniques. The disk itself is about the size of a standard Blu-ray disc and does not require special temperatures, low humidity or other conditions to remain viable.

    The device is due to be released in 2015, which means it could be a year or more after that before it makes a real dent in commercial data circles. In the meantime, companies such as DataDirect Networks are upping the performance of archival appliance solutions with new software and beefed-up storage capabilities. The company’s new Web Object Scaler (WOS) 360 software, for example, now sports the Global ObjectAssure function that enables erasure coding and other functions for geographically distributed archives. At the same time, the WOS 7000 Archive Node holds up to sixty 4TB SAS drives, plus 64GB of memory, with clustering capability of up to 256 units.
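Erasure coding, the technique Global ObjectAssure applies across distributed sites, protects data by storing computed parity alongside the data fragments so that a lost fragment can be rebuilt from the survivors. A minimal single-parity sketch in Python — a toy illustration of the principle, not DDN's implementation, which uses far more robust codes:

```python
def encode(fragments):
    """Compute a parity fragment as the bytewise XOR of all data fragments."""
    parity = bytearray(len(fragments[0]))
    for frag in fragments:
        for i, b in enumerate(frag):
            parity[i] ^= b
    return bytes(parity)

def rebuild(surviving, parity):
    """Recover one lost data fragment from the survivors plus parity."""
    lost = bytearray(parity)
    for frag in surviving:
        for i, b in enumerate(frag):
            lost[i] ^= b
    return bytes(lost)

# Split an object across three sites; keep parity at a fourth.
fragments = [b"archi", b"ve-da", b"ta..."]
parity = encode(fragments)

# The second site goes offline; rebuild its fragment from the rest.
recovered = rebuild([fragments[0], fragments[2]], parity)
assert recovered == b"ve-da"
```

Production object stores use Reed-Solomon-style codes that survive multiple simultaneous fragment losses; this XOR scheme, like RAID 5, survives only one, but the rebuild-from-parity idea is the same.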

    Meanwhile, Fujifilm is doubling down on tape with the Dternity series, a NAS storage appliance that leverages the LTO-6 format to drive capacity up to 35PB. The system uses the Linear Tape File System (LTFS) to present itself as just another LAN-connected NAS device. This allows content to be accessed, shared and managed like any other network file, essentially creating a blend of online, near-line and off-line capabilities, depending on application and user needs. The system can also be backed up to the cloud using the Dternity Media Cloud service.

    Fujifilm is also forging ahead with entirely new tape designs. The company recently released a white paper describing its barium ferrite (BaFe) tape structure, which is said to be more durable and provide greater capacity than standard metal particle (MP) solutions. BaFe particles are about a third smaller than MP particles, which means more data can occupy the same space, with the added benefit of an improved signal-to-noise ratio. The tape is currently available under the NanoCubic brand name and can be found in the Ultrium 6 cartridge.

    At the moment, most archival loads are text-based, primarily email. As collaborative environments and mobile sharing technologies permeate the enterprise, however, expect to see more audio, graphics and video in the mix, which will push storage requirements to an entirely new level.

    Enterprises that have been loath to address their archiving needs or that simply push them off to the cloud may want to take a fresh look at what’s available in in-house solutions. A decent archiving platform will still cost a bit, but the price per gigabyte is steadily eroding while capacities and capabilities are on the rise.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.
