Data backup is a fundamental part of nearly every organization’s disaster recovery and business continuity plans. But long gone are the days of a simple tape drive connected to a single computer, storing days’ or weeks’ worth of data on a single tape.
The volume of data organizations back up is enormous and growing all the time, and companies have adopted different strategies to handle it. Larger, more sophisticated tape backup systems are common. More progressive organizations use remote data backup services, where data is backed up over the Internet to a secure facility. Regardless of the method, Perimeter CTO Kevin Prince says there are common pitfalls that can severely affect a company’s processes and bottom line.
An up-to-date remote data backup solution with features such as file de-duplication, in-file de-duplication and compression not only solves the problems of traditional tape backup and older remote backup solutions, it can save you thousands of dollars while delivering a far superior result. If you got burned trying a previous iteration of remote data backup, or still use a tape backup system, now is the time to evaluate a better solution and avoid these common pitfalls.
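To see why de-duplication saves so much space, consider that file-level de-duplication amounts to hashing each file’s contents and storing every unique blob only once. A minimal sketch of the idea (the function and file names are illustrative, not any vendor’s implementation):

```python
import hashlib

def dedup_store(files):
    """Store each unique file content exactly once, keyed by its SHA-256 digest."""
    store = {}   # content hash -> bytes (one copy per unique content)
    index = {}   # filename -> content hash
    for name, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        store.setdefault(digest, data)  # only the first copy is kept
        index[name] = digest
    return index, store

# Two identical reports are backed up, but only one copy is stored.
files = {
    "report_mon.doc":  b"quarterly numbers",
    "report_copy.doc": b"quarterly numbers",
    "memo.txt":        b"hello",
}
index, store = dedup_store(files)
# len(store) is 2: the duplicate report costs no extra storage.
```

In-file (block-level) de-duplication applies the same hashing trick to fixed- or variable-size chunks within files, so even partially changed files share most of their storage.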
Be sure to check out these other Perimeter slideshows:
- 8 Elements of Complete Vulnerability Management
- Data Breach Trends of 2009
- Tracking Data Breaches by Industry
- Top Security Threats for 2010
Click through for common pitfalls found in traditional data backup methods.
Data backup tapes are built on magnetic media, which is extremely susceptible to corruption and environmental exposure. Most systems that rely on magnetic media are designed to detect corrupted areas, mark them as “bad,” and avoid them in the future. With backup tapes, however, the data is written first and the damage happens afterward: a fingerprint, proximity to a magnetic field, or any of several other factors can render the data unusable. Worse, you won’t know it until you try to restore and find the data isn’t there. Magnetic media has a significant failure rate, yet people largely don’t realize it because they simply assume the data stored on their tapes is safe and sound.
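This is why an unverified backup can’t be trusted: the only defense against silent corruption is recording checksums at backup time and re-checking them before you need a restore. A minimal sketch, using in-memory blocks purely for illustration:

```python
import hashlib

def record_checksums(blocks):
    """At backup time, record a SHA-256 digest for each data block."""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def verify_backup(blocks, checksums):
    """Later, return the indexes of blocks that no longer match their digest."""
    return [i for i, (block, digest) in enumerate(zip(blocks, checksums))
            if hashlib.sha256(block).hexdigest() != digest]

blocks = [b"payroll", b"invoices", b"email archive"]
sums = record_checksums(blocks)

blocks[1] = b"invoicez"          # simulate silent media corruption
bad = verify_backup(blocks, sums)  # -> [1]: block 1 is unreadable garbage
```

A scheduled verification pass like this turns “discover the failure during a disaster” into “discover the failure during routine maintenance.”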
Another poor assumption IT administrators often make concerns how long it will take to restore data. Some backup systems make this process slow because of the need to recatalog data sets and determine which tape holds the data. Only after these steps does the restore occur (you hope; see Pitfall #1). And that assumes the tape you need is on hand. Disaster recovery plans usually call for tapes to be stored offsite, so there are times, especially after a business disruption, when it may take days just to retrieve the tapes and begin a restore.
The number one cause of data security breaches in the U.S. is theft, and a significant percentage of theft involves lost or stolen backup tapes. Several weak links commonly lead to tape theft. First, tapes often sit on or near the server or backup unit itself, where they can be snatched. Second (and by far the bigger issue), tapes are exposed in transit to offsite storage locations, especially when handled by third-party providers; several known data breaches, and the class-action lawsuits that followed, began when a provider ‘misplaced’ backup tapes. Third is the storage itself: many smaller organizations, and some larger ones, have an employee take the tapes home rather than pay for third-party transport and storage.
Most organizations do not properly size their data backup system when it is initially purchased. They don’t (and frankly can’t) anticipate how much the data they must archive will grow, so the initial capital expenditure quickly becomes outdated and new investment is required.
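A quick back-of-the-envelope calculation shows how fast a fixed-capacity purchase is outgrown. With purely illustrative numbers (4 TB of data today, a 10 TB system, 40% annual data growth), only two full years of growth fit before the system overflows:

```python
def years_until_full(current_tb, capacity_tb, annual_growth):
    """Count the full years of growth that still fit in the purchased capacity."""
    years = 0
    data = current_tb
    while data * (1 + annual_growth) <= capacity_tb:
        data *= 1 + annual_growth
        years += 1
    return years

# Illustrative sizing: 4 TB today, 10 TB capacity, 40% growth per year.
print(years_until_full(4, 10, 0.40))  # -> 2
```

Year one reaches 5.6 TB and year two reaches 7.84 TB, but year three would need roughly 11 TB, so the hardware is obsolete in its third year.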
Natural disasters and business disruptions occur every day. When a major disaster strikes, especially one affecting an entire region, your data may be safe and sound in an offsite storage facility many miles away. You might get the backup tapes back within hours, but can you restore the data from them? If the backup unit itself was destroyed or damaged in the disaster, you must find a replacement unit before you can restore the tapes, and after a disaster, compatible backup units are often very difficult or impossible to procure.
Many data backup strategies are adopted without a clear understanding of the real costs. This is especially true of remote backup services, where providers charge by the gigabyte stored even though the amount of data fluctuates and usually grows over time. Early on, few remote backup providers permitted using local storage as part of the overall strategy. Combine that with high prices for remotely stored data, and many companies were turned off by the early implementations of these services.
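Per-gigabyte pricing makes budgeting tricky because the bill compounds with data growth. A rough projection sketch, with purely illustrative numbers (500 GB stored today, 3% monthly growth, $0.50 per GB per month):

```python
def projected_annual_cost(start_gb, monthly_growth, price_per_gb):
    """Sum 12 monthly bills as the stored volume grows each month."""
    total, gb = 0.0, start_gb
    for _ in range(12):
        total += gb * price_per_gb
        gb *= 1 + monthly_growth
    return round(total, 2)

# Illustrative budget: starts at $250/month but totals ~$3,548 for the year.
cost = projected_annual_cost(500, 0.03, 0.50)
```

A flat “first month times twelve” estimate of $3,000 would undershoot the real bill by hundreds of dollars, which is exactly the surprise that soured many companies on early per-GB services.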
Early remote data backup solutions also managed bandwidth poorly. They would attempt to push so much data through the Internet connection that other mission-critical applications were disrupted; the process meant to facilitate business continuity became the very thing causing a business disruption. A lack of scheduling, bandwidth throttling, good compression, and de-duplication all contributed to degraded Internet performance.
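Modern backup clients address this with throttling: cap the backup stream’s throughput so other traffic keeps flowing. A simple fixed-window rate limiter sketches the idea (the class name and rate are illustrative, not any vendor’s implementation):

```python
import time

class Throttle:
    """Cap upload throughput at max_bytes_per_sec using one-second windows."""
    def __init__(self, max_bytes_per_sec):
        self.rate = max_bytes_per_sec
        self.sent = 0
        self.window_start = time.monotonic()

    def send(self, chunk, transmit):
        now = time.monotonic()
        if now - self.window_start >= 1.0:
            # A new one-second window has begun; reset the budget.
            self.window_start, self.sent = now, 0
        if self.sent + len(chunk) > self.rate:
            # Budget exhausted: wait out the rest of the window before sending.
            remaining = 1.0 - (time.monotonic() - self.window_start)
            if remaining > 0:
                time.sleep(remaining)
            self.window_start, self.sent = time.monotonic(), 0
        transmit(chunk)
        self.sent += len(chunk)

# Usage: all chunks arrive, but never more than 10 bytes in any one second.
sent_chunks = []
t = Throttle(10)
for c in [b"12345", b"67890", b"abcde"]:
    t.send(c, sent_chunks.append)
```

Real clients combine a limiter like this with scheduling (heavier throttling during business hours) so the backup never starves interactive traffic.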
Not enough organizations encrypt the data on backup tapes. If tapes are stolen or lost, data breach disclosure and notification laws in 46 states require the company to publicly announce the incident, and four states require disclosure even if the data is encrypted. Remote data backup solutions, meanwhile, often lack flexible encryption key management.
Both tape backup and remote data backup solutions are notorious for difficult, decentralized management. Often the backup software must be installed on each system you are trying to back up, creating individual system vaults that are hard to track and use during a restoration. Keeping track of failed backups across multiple machines makes management nearly impossible.
With first-generation remote data backup systems, as well as tape backup systems, many operating systems, databases and applications required an agent installed on the system to capture complete backups. Sometimes this software caused problems because it conflicted with other critical programs. And there was always some critical system that wasn’t supported and therefore couldn’t be backed up through the enterprise system, requiring a special out-of-band manual process that introduced its own risks of error and omission.