Channel: Geekzone forums: Cloud, SaaS

Musings on rolling your own remote "cloud" backups and general backups

Thinking out loud. Comments welcome.

CrashPlan

My annual CrashPlan subscription is coming up for renewal. It costs something like NZ$70 for unlimited storage, which seems like a good deal, especially coupled with my unlimited broadband. CrashPlan also lets you back up to local disks. Support is good, it's reliable, it doesn't noticeably slow my PC, and they have a data center in Australia so it's pretty quick.

The client gives you quite fine-grained control of versioning. For example, you can keep versions every 2 hours for the first week, drop to daily versions once a version is a week old, then weekly after a month, and so on. That keeps the key information while minimising the disk space required.
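That kind of thinning schedule can be sketched as a simple pruning rule. A minimal sketch in Python (the thresholds are illustrative, not CrashPlan's actual defaults):

```python
from datetime import datetime, timedelta

def versions_to_keep(timestamps, now):
    """Thin a list of backup version timestamps: keep every version under
    a week old, one per day under a month old, one per week beyond that.
    Thresholds are illustrative, not CrashPlan's actual defaults."""
    keep, seen = [], set()
    for ts in sorted(timestamps, reverse=True):
        age = now - ts
        if age <= timedelta(weeks=1):
            keep.append(ts)                          # keep every version
        elif age <= timedelta(days=30):
            bucket = ("day", ts.date())              # one per calendar day
            if bucket not in seen:
                seen.add(bucket)
                keep.append(ts)
        else:
            bucket = ("week", ts.isocalendar()[:2])  # one per ISO week
            if bucket not in seen:
                seen.add(bucket)
                keep.append(ts)
    return keep
```

A real client would then delete the versions not in the returned list.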

Cloud Storage Pricing

It occurred to me that I store around 100GB of data in online backups, growing by maybe 50GB per year (family photos and video). 200GB of storage in AWS Glacier in the cheaper US regions is about US$0.83/month, which is roughly US$10 per year or about NZ$12/year. 1TB for a year costs about the same as CrashPlan. AWS Sydney costs about 20% more than the US regions. AWS stores three copies of the data, checks it for consistency, and replaces bad blocks, so it's reliable. Getting the data there is the main issue.
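The back-of-envelope arithmetic is easy to check. A quick sketch, assuming a Glacier rate of roughly US$0.004 per GB-month (my reading of the cheaper US region pricing implied by the figures above):

```python
# Assumed Glacier rate for cheaper US regions, per GB per month (USD).
GLACIER_USD_PER_GB_MONTH = 0.004

def annual_cost_usd(gigabytes, rate=GLACIER_USD_PER_GB_MONTH):
    """Yearly storage cost in USD for a given data volume."""
    return gigabytes * rate * 12

print(f"200GB: ~US${annual_cost_usd(200):.2f}/year")   # roughly US$10/year
print(f"1TB:   ~US${annual_cost_usd(1000):.2f}/year")  # in CrashPlan territory
```

At that rate 1TB works out to roughly US$48/year, which is indeed in the same ballpark as a NZ$70 CrashPlan subscription.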

Backup Integrity / Trust in Tools

One of my main backup tools is Cobian Backup. I like that Cobian copies whole files that you can access directly in the backup file system, so restores are trivially easy. You don't get compression, and incremental backups just make another copy of the file, wasting space. It's also pretty long in the tooth, with no active development and no releases since 2012.

Also, I have terabytes of files, images and videos, and incremental backups are essential to protect against viruses and cryptoware. A small defect in one file in the chain could potentially mean you can't restore your backups. You have to hope your tool can tolerate small errors, and ideally use a reliable file system.
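One cheap mitigation, independent of the backup tool, is to record a checksum for each backup file when it's written and verify them periodically, so a corrupt link in the chain is caught early. A minimal sketch (the manifest format is my own invention, not any tool's):

```python
import hashlib
import json
import os

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 so large archives don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def write_manifest(paths, manifest_path):
    """Record a checksum for each backup file in a JSON manifest."""
    manifest = {p: sha256_of(p) for p in paths}
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

def verify_manifest(manifest_path):
    """Return the files whose current hash no longer matches the manifest."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    return [p for p, digest in manifest.items()
            if not os.path.exists(p) or sha256_of(p) != digest]
```

Run the verify step on a schedule against both the onsite and offsite copies; an empty list means the chain still matches what was originally written.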

Backup Tools

Duplicati and CloudBerry Backup both back up to Amazon S3 easily, and can back up to a huge variety of targets (AWS Glacier, RackSpace, FTP, etc).

You can of course store data directly in S3 or Glacier, but the user interface isn't so good.

Duplicati

Duplicati doesn't have direct support for backing up to Glacier - you can fudge it using lifecycle rules and some settings, but it's not ideal. So with Duplicati you need to use the S3 Infrequent Access storage class, which costs about 3X what Glacier costs, but gives you instant access to your files.
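For reference, the lifecycle-rule fudge looks something like this: an S3 lifecycle configuration that transitions objects to Glacier as soon as they land (the `backups/` prefix is an assumption - use whatever prefix your tool writes to). The catch is that Duplicati can't read the archived blocks back without an extra restore step, which is why it's not ideal:

```json
{
  "Rules": [
    {
      "ID": "archive-backups-to-glacier",
      "Filter": { "Prefix": "backups/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 0, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```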

I had a play with Duplicati 2.0 experimental yesterday, backing up to both S3 and local disks, before I realised it didn't support Glacier directly. It seems like a nice tool, runs on multiple platforms, and gives you fine-grained file/folder selection. It's also under active development, unlike one of the backup tools I use, Cobian Backup.

CloudBerry Backup

This seems like a nice piece of software. It can back up to just about anything - S3, Glacier, Azure, Google Cloud, FTP, local disk, etc. It's commercial with a free tier for home use, so problems may be fixed more promptly than in open source systems. It supports file versioning with plenty of options, though not quite as many as CrashPlan. It runs as a service, so it can back up when you're not logged in.

With CloudBerry you need to pay the $30 license fee to get compression (comparison chart); the free version backs up without it. Given most large files (mp4/jpg/RAW) are already compressed, compression doesn't seem essential.

If you use AWS S3 as your backup target you can choose between compressed, incremental backups in large archives, or individual files stored in S3 that you can access directly. Individual files are more convenient, but archives reduce costs through compression and fewer requests. You could back up some files to Glacier and some to S3, depending on whether you need to access the files from multiple locations.

Rolling Your Own Cloud Backups

With current data volumes I could save some money by moving to AWS Glacier, with some files in S3's cheaper Infrequent Access (IA) tier. However, it would take some time to set up, test, and maintain. If I wanted to save $50 a year, I think CloudBerry Backup with AWS Glacier would be a good way to go.

CrashPlan at $70 with unlimited data storage and versioned backups is probably good value, taking setup and maintenance time into account.

Local Backups

I think I might give CloudBerry a try for my local backups, which I store both onsite and offsite. It seems like a nice tool.

Question

What's your favourite software for backups to local disks? Incremental backups are essential IMHO, to protect against viruses and cryptoware. Mirrors aren't a backup.
