Deduplication is rapidly expanding from its roots as a backup and archiving solution into a broader role in the data center. In the process, it is helping enterprises overcome one of the major challenges of the virtual era: steadily climbing network data loads.
Largely through source-based dedupe, which implements data reduction before it hits the network, the latest systems are working hand-in-hand with data acceleration and other technologies to improve network performance and scalability while still fulfilling dedupe's initial role as a storage capacity optimizer.
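The core idea behind source-based dedupe can be illustrated with a minimal sketch: chunk the data, fingerprint each chunk, and transmit only chunks whose fingerprints have not been seen before. This is a simplified illustration only (fixed-size chunks and SHA-256 hashes are assumptions here; commercial engines typically use variable-size chunking and their own fingerprint schemes):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking for simplicity; real systems often chunk by content


def dedupe_stream(data: bytes, known_hashes: set) -> list:
    """Split data into chunks and return only chunks whose content hash
    has not been seen before. known_hashes is updated in place so that
    subsequent calls skip chunks already transmitted."""
    new_chunks = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in known_hashes:
            known_hashes.add(digest)
            new_chunks.append(chunk)
    return new_chunks


# A repetitive 10-chunk payload reduces to its 2 unique chunks before
# anything crosses the network.
seen = set()
payload = (b"A" * CHUNK_SIZE) * 8 + (b"B" * CHUNK_SIZE) * 2
to_send = dedupe_stream(payload, seen)
print(len(payload) // CHUNK_SIZE, "chunks in,", len(to_send), "sent")  # 10 chunks in, 2 sent
```

Because duplicate chunks are filtered before transmission, highly redundant workloads such as repeated VM images or daily backups see the largest traffic reductions.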
A case in point is a new company called Infineta Systems, which this week unveiled the Velocity Dedupe Engine. It employs parallel processing and programmable logic across deduplication, migration, replication and a host of other functions to boost performance on 10 Gb SANs. The company claims a five-fold reduction in high-speed network traffic, even in rapidly changing virtual or cloud environments.
For some applications, however, the speed of deduplication itself is crucial. That's why EMC has turned to a pre-processor architecture for disk-based operations. The DD Boost system resides on a separate media server, acting as a library that cross-checks incoming data against what is already in the array. With only new data traversing the network, the company says it can boost throughput by 50 percent. The system is compatible with Symantec's NetBackup and Backup Exec platforms, with plans to add the technology to the NetWorker line later this year.
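The cross-check step described above can be sketched as a simple fingerprint-query exchange: before shipping a chunk, the client asks the array-side index which fingerprints it is missing, and only those chunks travel the wire. The class and method names below are hypothetical (EMC does not publish the DD Boost wire protocol); this is just the general pattern:

```python
import hashlib


class ChunkIndex:
    """Stand-in for the array-side fingerprint catalog (hypothetical API,
    not EMC's actual interface)."""

    def __init__(self):
        self._store = {}

    def missing(self, digests):
        """Given candidate fingerprints, return the subset not yet stored."""
        return [d for d in digests if d not in self._store]

    def put(self, chunk):
        self._store[hashlib.sha256(chunk).hexdigest()] = chunk


def backup(chunks, index):
    """Cross-check fingerprints first, then ship only the chunks the
    array lacks. Returns the number of chunks actually transferred."""
    digests = [hashlib.sha256(c).hexdigest() for c in chunks]
    needed = set(index.missing(digests))
    sent = 0
    for chunk, digest in zip(chunks, digests):
        if digest in needed:
            index.put(chunk)
            sent += 1
    return sent


idx = ChunkIndex()
first = backup([b"mon", b"tue", b"wed"], idx)   # full backup: all 3 chunks cross the wire
second = backup([b"mon", b"tue", b"thu"], idx)  # next run: only the new chunk is sent
print(first, second)  # 3 1
```

The fingerprint exchange is tiny compared to the data itself, which is why offloading the cross-check to a media server can raise effective network throughput.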
Quantum is also hunting for Symantec customers, targeting smaller firms with its new DXi4500 disk-based system. The system is undergoing qualification for NetBackup and Backup Exec through the OpenStorage API. The package offers up to 4 TB of capacity for as little as $22,500.
Even the more traditional data center environments are tapping into some of the latest dedupe advances. Luminex recently enhanced its Channel Gateway mainframe VTL lineup with dedupe and other features aimed at improving remote DR and business continuity capabilities. The system works in tandem with a new Synchronous Copy tool that ensures data consistency across replicated storage by validating write operations at secondary sites, guaranteeing that all tape catalogs are consistent should the need for recovery arise.
Before long, it will be pretty hard to find an enterprise environment that does not deploy deduplication in one form or another. With the amount of data that's coming our way, there won't be much other choice.