Automated, Hands-Free Backups
We're all lazy, but we all want to be sure that our important documents are safe from damage. Here are a few solutions to that problem.
Update: March 15, 2005
I've found a backup solution that meets all of my criteria. Read: SyncBackSE
You know the drill: everyone has important information on their computer, and everyone wants it to be as safe as possible, but nobody wants to sit there for hours every week (or minutes every day) with CD-ROMs, or even be forced to remember to click an icon every other day. Here are some programs that will remember to do it for you. Sorry, Windows only.
SyncBack (Free)

This freeware program will copy files from one place to another on your computer (or across a network), and even ZIP them in the process. Plus, it will FTP files onto a server for you - something I haven't found in any other program.
I have two profiles in this program. One of them takes some of my most important files (based on their directory) and copies them to another folder, zipping them in the process. The second takes those zipped files and, once every eight hours, FTPs them onto my server for me, into a web-accessible but password-protected directory.
This means that in case irreparable damage occurs to my computer, and even to my CD-ROM collection of backups, I can still retrieve my most important files from anywhere in the world.
You can set it up to run via Windows Scheduler, or simply to run every xx minutes/hours/days within the program.
12Ghosts Backup ($30)
I've been using this program for years. Not only will it take any file or directory and back it up to another directory (again, either on your computer, or across a network) on a schedule, but it will create a running archive of your files so you can retrieve a previous version - without you having to think about it.
It will wait for you to modify a file and (after waiting a certain period of time so it doesn't make frivolous backups if you're saving every 2 seconds) make a copy of that file, adding a time stamp to it so you can retrieve an old version.
Now this sounds like it would take up a ton of disk space, but the brilliant part is, it will save one copy per minute for an hour, one copy per hour for a day, one copy per day for a month, and one copy per month forever. So yes, it takes up a lot more disk space than a single backup, but it slowly gets rid of older versions until you're left with only one version per month. This way, you can retrieve just about any recent file, and you can go back several months.
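To make that scheme concrete, here's a rough Python sketch of the thinning logic - my own reconstruction of the idea, not anything 12Ghosts actually ships. Given the timestamps of your saved copies, it keeps the newest copy in each minute, hour, day, or month "bucket" depending on how old the copy is:

```python
from datetime import datetime, timedelta

def thin_backups(timestamps, now):
    """Keep one copy per minute for the last hour, one per hour for the
    last day, one per day for the last month (taken as 30 days here),
    and one per month beyond that."""
    keep = {}
    for ts in sorted(timestamps, reverse=True):  # newest first
        age = now - ts
        if age <= timedelta(hours=1):
            bucket = ("minute", ts.year, ts.month, ts.day, ts.hour, ts.minute)
        elif age <= timedelta(days=1):
            bucket = ("hour", ts.year, ts.month, ts.day, ts.hour)
        elif age <= timedelta(days=30):
            bucket = ("day", ts.year, ts.month, ts.day)
        else:
            bucket = ("month", ts.year, ts.month)
        keep.setdefault(bucket, ts)  # newest copy wins each bucket
    return sorted(keep.values())
```

Run periodically, this slowly "forgets" old copies exactly the way described above: anything older than a month collapses down to one copy per month.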
I bought this software inexpensively years ago, but the price has gone up. To purchase this product individually costs around $30.
I've seen this program recommended more times than I can remember. I've only played with it a little, but it looks excellent. It will compress files for you, and maintain up to 25 previous versions. Unfortunately, it won't do both at the same time, but you could work around this using multiple profiles. Like 12Backup, it costs around $30.
12Backup's backup scheme is probably overkill for most people (unless you're constantly modifying files and absolutely must be able to retrieve a version from 5 minutes or 5 months ago), and wastes a lot of space because it doesn't compress the files. Simply keeping the previous 25 versions is probably enough.
Unfortunately, this program won't FTP the files for you like SyncBack, but it will allow you to run a program before/after running a profile, so you could - if you were technically adept - set up a script, or configure a program to FTP to your server based on some command line input. Or you could combine it with SyncBack (which is free, after all) to FTP for you.
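If you'd rather roll that FTP step yourself, the "run a program after a profile" hook could point at a little script like this one. This is just a sketch: the server name, login, and remote directory are placeholders, so substitute your own details.

```python
# upload_backup.py - run after a backup profile finishes, e.g.:
#   python upload_backup.py backup1.zip
# The host, user, password, and directory below are placeholders only.
import os
import sys
from ftplib import FTP

def remote_name(path):
    # Just the file name, so "C:\backups\backup1.zip" becomes "backup1.zip"
    return os.path.basename(path.replace("\\", "/"))

def upload(path, host="ftp.example.com", user="username",
           password="password", remote_dir="/backups"):
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        with open(path, "rb") as f:
            ftp.storbinary("STOR " + remote_name(path), f)

if __name__ == "__main__" and len(sys.argv) > 1:
    upload(sys.argv[1])
```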
Karen's Replicator (Free)
The simplest of the tools I've come across in this area: it will simply copy files to another directory, another drive, or (of course) a network drive on a schedule.
If you're just looking for a simple backup program, this one may be the one for you.
Unison (Free)

This one is a real "geek only" tool. Similar to rsync, this command-line tool will synchronize files across computers or drives. Unlike rsync, Unison is cross-platform and works on *nix variants (including OS X) as well as Windows.
Thoughts on automated backups
In his book Human Error, James Reason discusses catastrophic failures in battle and at nuclear power plants. Backup systems that depend too much on everything going just right (tightly coupled systems) are destined to fail; loosely coupled systems that will still do what they're supposed to when Murphy's Law rears its ugly head are much better.
Automated backups are a good thing, but many of these programs suffer from some major design flaws.
If your backup program maintains a single backup (no versioning), then corruption of your original file means automatic corruption of your backup. This is just as bad as having no backup at all: unless you notice a file is bad between the time it goes bad and the time it gets copied, you're screwed.
Similarly, if your backups are not kept on a separate computer, or at another physical location, one catastrophe could wipe out everything you've created. Losing your computer to fire, water, theft, power surge, etc. leaves you without an archive of your essential data, and again, is just as bad as not having any backup at all. Even across thousands of miles, a virus could infect files on a network drive backup.
On a really paranoid level, x previous copies might not be enough if you don't realize your data is corrupted through all x cycles, but if you're worried about that, it may be time to look at a more industrial-strength solution.
The ultimate automated backup tool doesn't seem to exist (and if it does, please tell me). This ultimate tool would have a combination of features found in the above programs.
- Compresses backups.
- FTPs the backups so you can get them off site.
- Keeps several versions.
- Local versions should focus on the most recent changes rather than archival. This way you can quickly grab a copy if you see something nasty happen (like deleting several paragraphs just before hitting save). If you need an old version, you shouldn't mind waiting for an FTP download.
- Remote versions should have current, previous, end of last week, end of last month, and perhaps monthly if you have the space to spare.* The files should have the date in their names, and be FTPed up that way, so that a single failed upload won't corrupt the backup. A log file should be kept so the program will recognize if something went wrong last time and not assume the previous files are valid. (* This feature should be highly configurable; I don't want to dictate to anyone else when I think backups should occur.)
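As a sketch of that remote schedule, here's how the slot logic might look in Python. It assumes the week ends on Sunday and that "current" is rotated into "previous" by the upload step; both are assumptions of mine and should of course be configurable:

```python
from datetime import date, timedelta

def remote_slots(today):
    # "current" is refreshed on every run; the weekly copy only on the
    # last day of the week, the monthly copy only on the last day of the
    # month. Rotating "current" into "previous" before uploading is left
    # to the upload step.
    slots = ["current"]
    if today.weekday() == 6:  # treating Sunday as week's end - an assumption
        slots.append("week")
    if (today + timedelta(days=1)).month != today.month:  # month's last day
        slots.append("month")
    return slots
```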
Some of you might think this is overkill, but I think versioned, off-site backups are your best assurance against catastrophic failure.
This system could be fairly reasonably mimicked using SyncBack. The more clever among you can distill this to fewer steps, and if bandwidth isn't a concern, it can be distilled even further, but the basic concepts are as follows:
Create four profiles:
- Copies your files to backup1.ZIP every other day.
- Copies your files to backup2.ZIP on days the first profile doesn't run.
- FTPs backup1.ZIP soon after profile 1 runs.
- FTPs backup2.ZIP soon after profile 2 runs.
You can expand this as much as you want, creating 7 daily backups (monday.ZIP, etc.), weekly backups (weekly.ZIP), monthly (monthly.ZIP), or even 12 monthly backups (january.ZIP), but at this point, if this data is that mission-critical, "automatic" and "hands-free" shouldn't be terms you're using to describe your backup process.
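If you'd rather script the daily-archive idea than click through profiles, a minimal Python sketch might look like this. The folder paths are yours to fill in; the weekday-named archives simply get overwritten once a week:

```python
# Zip a folder straight into monday.zip, tuesday.zip, and so on.
import os
import zipfile
from datetime import date

def backup_name(day=None):
    # monday.zip through sunday.zip, one per weekday, overwritten weekly
    day = day or date.today()
    return day.strftime("%A").lower() + ".zip"

def make_backup(source_dir, dest_dir, day=None):
    archive = os.path.join(dest_dir, backup_name(day))
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                full = os.path.join(root, name)
                # store paths relative to the backed-up folder
                zf.write(full, os.path.relpath(full, source_dir))
    return archive
```

Schedule it daily (Windows Scheduler works fine) and you have seven rolling archives; the same function with "weekly.zip" or "monthly.zip" names covers the other tiers.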
Of course, none of this protects you from the simplest of failures - for some reason your backup software dies or doesn't start up, or gets manually closed and never opened again. It's all a matter of knowing what you're comfortable with and planning accordingly.
- DFIncBackup ($12.95) looks interesting. It does incremental backups by zipping only the files that have changed. The DFIncBackup Home version is free.
page first created on Sunday, October 03, 2004
© Mark Wieczorek