Ten Ways to Improve Data Backup
Every aspect of the data center environment can stand a little improvement. But if your backup capabilities are like most, they are in dire need of an upgrade.
I've written a number of blogs in the past on the topic of data backup, such as "SMBs' Common Mistakes with Backup" and "Implementing Backup for Your SMB." I'm acutely aware that best practices do evolve over time, however, and have always been open to new ideas or suggestions when it comes to protecting one's business to achieve maximum data survivability.
Well, earlier this week Matthew Dornquast, CEO and founder of Code 42 Software, sent me a number of tips for preventing data loss. Code 42 Software is the creator of CrashPlan PRO, an award-winning onsite, offsite and cloud backup solution for small and mid-sized businesses. The tips caught my attention for the refreshing perspective and ideas they offer on the data backup front. Beyond obvious causes of data loss such as accidents, theft and the breakdown of IT equipment, Dornquast believes human error is a huge factor that is often overlooked. Accordingly, his tips focus on ways to mitigate this problem.
I sum up the key suggestions below.
Make multiple, independent backup copies
Dornquast recommends making multiple backup copies of data, preferably stored at different geographical locations. He warns against the temptation to "make backups of backups" out of convenience, however, observing that any corruption introduced during transmission would simply be replicated in subsequent copies. "We advocate a system where data is backed up separately to each destination, whether it's a local hard drive, another computer or the cloud," said Dornquast in his email to me. He elaborated: "That way if something is wrong with one backup, you've got the others to fall back on."
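The idea of independent copies can be sketched in a few lines of Python. This is a minimal illustration, not CrashPlan's implementation: the paths and the `backup_to_destinations` function are hypothetical, and the key point is that each destination receives a fresh copy from the source, verified against the source's checksum, rather than a copy of another backup.

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_to_destinations(source: Path, destinations: list[Path]) -> None:
    """Copy `source` independently to each destination and verify each
    copy against the original, so corruption in one copy can never
    propagate into the others."""
    expected = checksum(source)
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest = dest_dir / source.name
        shutil.copy2(source, dest)          # always copy from the source
        if checksum(dest) != expected:      # verify against the original
            raise IOError(f"Backup to {dest} failed verification")
```

Because every destination is written and verified from the original, a corrupted local drive backup has no way of contaminating the cloud copy.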
Back up multiple versions of each file
Multiple versions of a file should be backed up, observed Dornquast, a capability that is now common in backup software. For those who are unaware, versioning allows users to roll back to any prior version of a document, which can be invaluable for work documents. Dornquast added that the best systems let users specify how frequently a new backup is made, as well as how many revisions are retained. Having accidentally deleted sections of my own work in the past, I'm fully in favor of this capability.
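A bare-bones version of this idea is shown below. The `backup_version` function and its paths are my own hypothetical example, not any vendor's API: each run stores a timestamped copy, and a `max_revisions` parameter caps how many revisions are retained, mirroring the configurable retention Dornquast describes.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_version(source: Path, archive: Path, max_revisions: int = 10) -> Path:
    """Store a timestamped copy of `source` in `archive`, then prune
    the oldest copies so at most `max_revisions` remain."""
    archive.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S%f")  # fixed width, sortable
    dest = archive / f"{source.name}.{stamp}"
    shutil.copy2(source, dest)
    revisions = sorted(archive.glob(f"{source.name}.*"))
    for old in revisions[:-max_revisions]:   # keep only the newest N
        old.unlink()
    return dest
```

Rolling back is then just a matter of copying the desired timestamped revision back over the working file.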
The value of data retention
Most backup systems retain copies of deleted files for some period of time before purging them, with 30 days being a typical configuration. Dornquast considers this inadequate: data is still irrecoverably lost if an accidental deletion goes unnoticed until after the window closes. A better approach, he argues, is to retain deleted files forever - unless they are specifically tagged to be purged, that is.
Syncing is not the same as backing up
While sync is great from a usability point of view, Dornquast is of the opinion that this is prone to data loss as a result of human error. I think nothing underscores this point better than the Gmail outage earlier this year; a bad software update inadvertently corrupted and deleted email messages belonging to thousands of Gmail users. The presence of multiple synchronized data repositories proved no defense, necessitating that Google tap into offline data backups to recover affected data.
As Dornquast puts it: "The bottom line is that sync can be a great complement to backup, but is not a substitute because it cannot protect you from data loss that is the result of human error." Moral of the story: No matter how good your sync software is, remember to make backups, too.
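The distinction is easy to see in code. The one-way mirror below is a hypothetical sketch of what sync tools fundamentally do (the `mirror_sync` function and paths are my own): the replica is made identical to the source, so a file deleted by mistake on the source is faithfully deleted from the replica too, which is precisely why sync alone cannot protect against human error.

```python
import shutil
from pathlib import Path

def mirror_sync(source: Path, replica: Path) -> None:
    """One-way mirror: make `replica` identical to `source`. A file
    deleted from the source - even accidentally - is also deleted
    from the replica on the next sync."""
    replica.mkdir(parents=True, exist_ok=True)
    src_names = {p.name for p in source.iterdir() if p.is_file()}
    for p in source.iterdir():
        if p.is_file():
            shutil.copy2(p, replica / p.name)
    for p in replica.iterdir():
        if p.is_file() and p.name not in src_names:
            p.unlink()   # the human error propagates here
```

A true backup, by contrast, would keep the deleted file (ideally forever, per the retention tip above) instead of mirroring the deletion.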
Do you have any comments or tips of your own? Feel free to voice your opinion in the comments section below.