The Data Deduplication Revolution
Data deduplication has evolved into an efficient, high-performance technology.
It may seem like deduplication is old hat for enterprises pushing toward virtual and cloud environments. But while the technology may be firmly established, deployment and configuration issues remain very much unsettled.
Research from the Taneja Group indicates that nearly all enterprises employ dedupe somewhere in their backup infrastructure, even though fewer than half of all data is subject to it. That leaves a large untapped market for dedupe products and services, provided vendors begin to address the issues of data/application priority, wide-area functionality and overall cost that still cause many enterprises to resist building out their dedupe capabilities.
A key issue is where and how to implement dedupe for optimal performance. Various approaches focus on the data source, the backup software, target appliances or even the archive itself, each promising its own price/performance trade-off. For consistency's sake, dedupe should be as uniform as possible throughout the data environment and should require as little restoration, or "rehydration," as possible when data crosses the various transition points.
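The basic mechanics being debated here can be sketched in a few lines. The following is a minimal illustration of block-level dedupe and rehydration, assuming fixed-size chunks and SHA-256 fingerprints; production systems typically use variable-size chunking and far more elaborate index structures, so treat this purely as a conceptual sketch.

```python
import hashlib

CHUNK_SIZE = 4096  # assumed fixed-size chunking for simplicity


def dedupe_store(data: bytes, store: dict) -> list:
    """Split data into chunks, keep one copy per unique fingerprint,
    and return the list of fingerprints (the 'recipe' for the data)."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)  # store the chunk only if unseen
        recipe.append(fp)
    return recipe


def rehydrate(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its fingerprint recipe."""
    return b"".join(store[fp] for fp in recipe)


store = {}
payload = b"A" * 8192 + b"B" * 4096  # two identical chunks plus one unique
recipe = dedupe_store(payload, store)
assert rehydrate(recipe, store) == payload
print(len(recipe), len(store))  # 3 chunk references, 2 unique chunks stored
```

Note that every read back out of the store is a rehydration step; this is why uniformity matters, since each transition between differently deduped tiers forces a full reassembly and re-chunking of the data.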
One interesting development of late is HP's StoreOnce system, which can present itself as a source or target within the Data Protector environment, or as a target appliance to third-party backup software. HP says this kind of integrated platform should greatly simplify data backup architectures and is likely to be extended to primary storage solutions in the new year.
And while it seems that The Great Dedupe Wars of the past two or three years are over, acquisitions are still going on among mid-tier storage vendors. Imation, for one, recently purchased an online backup provider called Nine Technology primarily for its dedupe technology. Imation has been building up a repertoire of scale-out storage solutions for small and medium-sized businesses, so a block-level dedupe solution should dovetail nicely.
For the most part, dedupe is viewed as a means to more efficient storage utilization. However, some are arguing that it can enhance other functions as well. Gartner's Dave Russell, for one, says dedupe in primary architectures can improve virtual machine and virtual desktop performance due to the commonality of multiple images. Storage reductions of 99 percent are not beyond the realm of possibility, Russell says.
Dedupe may not be the most exciting technology on the drawing board, but it stands to make enormous contributions in the drive to scale up storage architectures for the cloud.
An update on last Friday's blog on the Defense Department's data center consolidation plans: It seems that the 2012 National Defense Authorization Act adds a range of reporting and compliance requirements that could slow things down significantly in the coming year. There is also language instructing agencies to begin migrating toward commercial services rather than government facilities like the Defense Information Systems Agency (DISA), in place of the "DISA First" approach that Defense favored. The bill has cleared both houses of Congress and is on its way to the president, who is expected to sign it. So it seems the program won't be as dramatic as it first appeared.