Five Top Data Center Protection Challenges and Best Practices for Overcoming Them

Joe Forgione
SEPATON recently conducted a survey of large enterprise IT managers to understand the data protection challenges that most concern them. Not surprisingly, the survey showed that the volume of data under protection continues to skyrocket. As a result, large enterprises are looking for ways to mitigate the cost, risk and complexity of data protection throughout their enterprises - including data centers, disaster recovery sites and branch locations. That data growth is driving them to re-examine their current data protection environment. Here is a look at the top five challenges facing large enterprise IT managers and the best practices for overcoming them.

Challenge #1: Control skyrocketing data growth

Enterprise data volumes continue to grow at a staggering rate - more than 30 percent compounded annually - making data deduplication an essential component of any data protection strategy. However, deduplication technologies fall into distinct categories: those that deduplicate at the source and those that deduplicate at the target. The 'target'-based options are further divided into those that deduplicate inline, before data is written to disk, and those that deduplicate concurrently.

To be effective, enterprises need to match the type of deduplication to the environment. For example, source deduplication is a good choice for small data volumes in low-bandwidth environments where wholesale data recovery will never be needed. It offers a simple, 'lightweight' strategy for achieving a modest measure of capacity reduction. Inline deduplication technologies can handle medium-sized backup volumes in environments with more bandwidth. However, inline deduplication slows performance over time and cannot scale to handle larger data volumes or to enable simultaneous replication or recovery. It also provides limited capacity reduction because it cannot identify duplicate data in large databases or other applications that store data in chunks smaller than 8KB.

Concurrent deduplication technologies are best for large data volumes (more than 5TB of daily backup). This type of deduplication enables the target system to use multiple processing nodes to back up, deduplicate, replicate and restore data simultaneously. As a result, it can deterministically back up and deduplicate very large data volumes without a backup performance penalty. It also restores the most recently backed-up data without needing to reassemble it, enabling fast data recovery and vaulting to tape.
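To make the chunking concept concrete, the following is a minimal sketch of hash-based, target-side deduplication over fixed-size chunks. It is illustrative only - the chunk size, function names and sample data are assumptions, and production systems generally use variable-size (content-defined) chunking backed by a persistent index.

```python
import hashlib

def deduplicate(stream: bytes, chunk_size: int = 8 * 1024, index=None):
    """Split a backup stream into fixed-size chunks and store only unique ones.

    Illustrative sketch: real deduplication engines use content-defined
    chunking and keep the hash index on disk, not in memory.
    """
    index = {} if index is None else index    # maps chunk hash -> stored chunk
    unique_bytes = 0
    for offset in range(0, len(stream), chunk_size):
        chunk = stream[offset:offset + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:               # first time this content is seen
            index[digest] = chunk             # store it exactly once
            unique_bytes += len(chunk)
    ratio = len(stream) / unique_bytes if unique_bytes else 1.0
    return index, ratio

# Hypothetical backup stream with heavy repetition deduplicates well.
block_a, block_b = b"A" * 8192, b"B" * 8192
backup = block_a * 8 + block_b * 8 + block_a * 8      # 24 chunks, only 2 unique
_, ratio = deduplicate(backup)
print(f"deduplication ratio: {ratio:.1f}:1")           # 12.0:1
```

In this toy example the duplicates align perfectly with the chunk boundaries; when applications store data in smaller or misaligned units, the hashes no longer match, which is why chunking granularity matters so much for large databases.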

Challenge #2: Eliminate data center 'sprawl'

Many enterprises have had to add numerous disk-based backup target systems to stay ahead of growing data volumes. Every one of these 'silos of storage' not only consumes more space, power and cooling in the data center, it also adds significant administration time, increases complexity and risk of human error, and reduces deduplication efficiency.

To avoid data center sprawl in the data protection environment, IT managers should look ahead 3-5 years and choose a data protection 'target' system that will scale to accommodate the performance and capacity they will need without adding new system images. This best practice eliminates the need to over-buy by enabling IT managers to modularly add performance and/or capacity as needed. It also saves thousands of dollars in administration time by eliminating the need to load balance and tune the new systems that scaling would otherwise require. Finally, by consolidating data protection in one system, this strategy enables significantly more efficient deduplication.

Challenge #3: IT administration and staff time is at a premium

Efficient automation and management control are becoming critical requirements for today's large enterprise data center.

Data protection administrators have more data to protect and more complex data protection standards to meet while staying within a limited budget. They need to invest in systems that automate disk-subsystem management, reduce complexity, and provide effective dashboards and reporting. Minimum requirements for a large enterprise data protection platform include:

Automatic load balancing and tuning. As enterprise IT managers add capacity or performance, the system should bring it online and perform load balancing and tuning automatically.

Automatic system monitoring and 'phone home' functionality. Choose systems that pre-emptively identify potential hardware issues and notify administrators before a problem arises. The system should also automatically send system efficiency data to the vendor for ongoing analysis.

Dashboards and reporting. Choose a system that enables IT managers to monitor the status of all data protection operations - backup, deduplication, replication and restore - throughout the enterprise from centralized dashboards. The system should also provide detailed reporting that enables administrators to track and project future capacity and performance needs, as sketched below.
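As a rough illustration of what that capacity reporting enables, here is a minimal sketch that projects future backup capacity from historical usage samples with a simple least-squares trend. The sample figures and field names are hypothetical, not taken from any particular reporting tool.

```python
from datetime import date

# Hypothetical monthly "capacity used" samples (in TB) pulled from system reports.
samples = [
    (date(2012, 1, 1), 210.0),
    (date(2012, 2, 1), 224.0),
    (date(2012, 3, 1), 239.0),
    (date(2012, 4, 1), 255.0),
]

def project_capacity(samples, months_ahead: int) -> float:
    """Least-squares linear projection of used capacity (TB)."""
    xs = list(range(len(samples)))            # month index: 0, 1, 2, ...
    ys = [tb for _, tb in samples]
    n = len(xs)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (len(samples) - 1 + months_ahead)

print(f"Projected capacity in 12 months: {project_capacity(samples, 12):.0f} TB")
```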

Challenge #4: Enterprises increasingly rate their own disaster recovery and branch office data protection as inadequate

In our survey, enterprise IT managers rated 'data will be unrecoverable in the event of a disaster' as their greatest backup/data protection fear. They also expressed concerns about their company's ability to protect data in branch locations. Large enterprises that have been using physical tape backup systems and disparate disk-based solutions in branch offices are particularly vulnerable to downtime and data loss in the event of a disaster. Enterprise IT managers should consider using a consistent platform and a 'hub and spoke' topology for their data protection environment. In a typical implementation, disk-based backup systems in branch locations replicate data to a large central data protection platform for disaster protection. The central system may then replicate to a disaster recovery location for centralized DR protection or centralized tape vaulting.

This strategy enables IT staff to manage remote-office backup, deduplication, replication and restore operations from the data center headquarters. It ensures that the same data protection and recovery objectives (RTO and RPO) are implemented across the enterprise. It also minimizes the burden on IT administrators in branch locations and provides administrators with a company-wide view of data protection efficiency.
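The topology itself is simple to describe. The sketch below models a hypothetical hub-and-spoke layout - branch offices replicating to a central data center, which in turn replicates to a DR site - so the whole replication fabric can be reviewed from one place. Site names and RPO figures are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Site:
    name: str
    replicates_to: Optional[str]   # next hop in the hub-and-spoke topology
    rpo_hours: int                 # recovery point objective for that hop

# Hypothetical topology: branches -> central data center -> DR site.
sites = [
    Site("branch-east", replicates_to="datacenter", rpo_hours=24),
    Site("branch-west", replicates_to="datacenter", rpo_hours=24),
    Site("datacenter",  replicates_to="dr-site",    rpo_hours=4),
    Site("dr-site",     replicates_to=None,         rpo_hours=0),
]

# The centralized view an administrator would manage: every replication hop,
# its target and its recovery point objective, in one place.
for site in sites:
    if site.replicates_to:
        print(f"{site.name:12s} -> {site.replicates_to:12s} (RPO {site.rpo_hours}h)")
```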

Challenge #5: Adopting new data protection technologies without costly, risk-prone 'rip and replace' migration

With limited budgets and resources, IT managers have worked to protect their investment in existing technologies. The cost and risk of migrating to a new technology - particularly when that migration requires a wholesale 'rip and replace' - often outweigh the potential benefits. IT managers should look for enterprise-class data protection solutions that mitigate these costs and risks with features such as robust tape emulation and storage pooling. Solutions that emulate a wide range of physical tape drives and tape libraries, as well as non-tape formats such as NetBackup OST, can be added to an existing environment without disrupting ongoing operations. Storage pooling allows IT managers to separate existing backup environments (such as Fibre Channel/tape backup) from new technologies (10 Gigabit Ethernet/OST). New backups can be directed to the new technologies while older backups expire off the older technologies over time in a seamless, low-risk approach.
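In practice the pooling policy can be as simple as a cutover date: new backups route to the new pool, and anything in the legacy pool simply ages out under normal retention. The sketch below is a hypothetical illustration of that routing logic - the pool names, cutover date and retention period are assumptions, not any vendor's configuration.

```python
from datetime import date

CUTOVER = date(2013, 1, 1)       # hypothetical date the new pool went live

def choose_pool(backup_date: date) -> str:
    """Route new backups to the new pool; the legacy pool only ages out."""
    return "pool-10gbe-ost" if backup_date >= CUTOVER else "pool-fc-vtl"

def is_expired(backup_date: date, retention_days: int, today: date) -> bool:
    """Legacy backups expire off the old pool under normal retention."""
    return (today - backup_date).days > retention_days

print(choose_pool(date(2013, 3, 15)))                        # pool-10gbe-ost
print(is_expired(date(2012, 6, 1), 180, date(2013, 3, 15)))  # True
```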

Summary

As data volumes continue to climb, enterprise IT managers need to rethink their current strategies for providing data protection throughout their organizations. New technologies can deliver a higher level of data protection while saving significant administration time and reducing the opportunity for human error. IT managers should avoid technologies that cause system sprawl, add complexity or require 'rip and replace' migrations.