So far, in my look at Linux compared to Mac and Windows, I've covered music players, photo organizers, and video editors. But all those apps – and all the documents they create – are lost if your hard drive crashes, your laptop takes a spill, or some other catastrophe strikes. If you have documents, you must have a backup …
Simple Backup Suite
You've missed one of the best -- Simple Backup Suite (sbackup). Especially good for backing up your personal machine to a USB hard disk (although it does work with network storage as well).
It's written and tuned for Linux, so restoration works differently than a disk-image backup does on Windows. Rather than restoring a disk image (which can be problematic if you've just replaced a failed hard disk), you reinstall Linux and then use SBackup to restore your settings and files. Not copying acres of operating-system files that are identical on every machine makes for much faster backups. SBackup does save your package database, so you know what to reinstall.
It's available for Ubuntu, although of course, having a competing product, Canonical don't promote it (which pretty much summarises the qualms the Linux community has about Canonical).
If you like rsync but dislike the command line then you might take a shine to Lucky Backup.
SBackup works fine.
Most of the time.
The thing that threw me was that it creates big tar files. I was backing up to a FAT-formatted USB drive, so files over 4 GB were truncated and invalid. I discovered this before anything nasty happened. It also takes a long time to view or extract a large tar file.
Now I simply rsync video and image files (which don't compress anyway), and use SBackup for the rest of the home folder data. That keeps the tar file sizes down.
Works for me.
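For anyone hit by the same FAT32 surprise: here's a minimal pre-flight check (the function name is my own invention, not part of SBackup) that flags files too large to land intact on a FAT32-formatted drive:

```python
import os

# FAT32 caps individual files at 4 GiB minus one byte
FAT32_LIMIT = 4 * 1024**3 - 1

def oversized_for_fat32(paths):
    """Return the files that would be truncated on a FAT32 volume."""
    return [p for p in paths if os.path.getsize(p) > FAT32_LIMIT]
```

Running this over an archive directory before copying to the USB drive would have caught the truncation problem up front.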
The Flyback project is dead, I believe. Back in Time works, but I had some kind of weird permissions issue that would hose everything up, and I lost confidence. Simple Backup works fine, but I can't choose hourly backups; it only allows daily, as far as I can find. So I'm stuck with Simple Backup. If anyone can suggest another Back in Time or Time Machine look-alike for Ubuntu, please do.
"Trust" is not the only reason
"To keep things simple, we'll divide backup options into two camps: those that backup to a disk, and those that backup to web servers. For the latter, there is some degree of trust involved. While all of the options I've outlined offer secure encrypted connection, if you still aren't comfortable with the idea, then web-based backup services are not for you."
Web-based backups work fine so long as:
(a) The data you need to back up is relatively small; and
(b) You have an Internet feed that can handle the bandwidth required.
Personally, my new tinker-box was an exercise in creating a RAID array to rip my CDs/DVDs to. It holds 4 TB of data so far (14 TB maximum as RAID 5), so there is no way I will be doing a cloud backup any time soon.
4 TB of data so far? You have archived nearly 7,000 CDs?
Reading for comprehension
The OP said CDs *and* DVDs. I also back up my DVDs to my server, for exactly the same reason that people rip CDs to MP3: ease of access. My DVDs remain in MPEG-2 format and average about 4.5 GB per film, or 6 GB per disc of a series. Hence it's very easy to hit 4 TB with a mix of DVDs and CDs. If you add in Blu-ray rips at approximately 30 GB per film, your disk space is rapidly consumed.
So much for the "freetard" label...
> 4 TB of data so far? You have archived nearly 7,000 CDs?
I have archived somewhere north of 1600 DVDs.
Just today I slapped in another 2TB drive. I shuffled some things around between my 2 "arrays" so that I would have the free bay in my main "Video Jukebox Array".
And if you really need a working backup - Amanda
Well, the article misses Amanda and Bacula. Both have their religiously fanatical camps. As a follower of the former, I have used it both on my home system and in my day job for network-wide backup since around 2000 or so.
It does not have a GUI, and it has its own set of paradigms for how to back up data to maximise the probability of recovery. However, once you get used to it (and once it saves your b**t a few times) you do not want anything else. It can back up to tape, disk, removable media, media changers and probably even a dead badger. It also supports encrypted backups, so you do not need to worry about who has access to your disks and cartridges.
Amanda irks me and so does Bacula. Neither is as simple as Back in Time. But like I said, BIT had some issues I can't get past, so I'm stuck with Simple Backup.
Re: Amanda and Bacula
Both have Webmin modules. Not quite full-blown GUI, but better than nothing if you don't grok CLI.
I think I need an eye test...
...I read 'encrypted backups' as 'encrypted badgers'.
Ubuntu One + Windows
I think they've currently got an alpha of a Windows version of Ubuntu One in private testing.
I think it will go into beta sometime soon...
In my ~40 years of using un*x ...
... I have never (that's NOT EVER!) lost anything I have created. I have archives dating back to the very early '70s to prove it.
But then, I'm a sysadmin. I grok how (most) systems work.
"music players, photo organizers, and video editors" aren't part of the system ... they are toys for consumers that sit on top of the system ... and in the great scheme of things, are clearly useless when it comes to longevity of data.
I still have shell scripts based on tar & dd that I wrote a quarter century ago ... to date, they haven't failed me. More complex solutions (in my mind) demonstrate a lack of knowledge in the user-base.
please don't tell us how much you like Linux.
Back in my day....
.... nothing ever failed and we used tar and WE LIKED IT.
LOL, too funny. It's hard to drop a PDP11 when you can't even pick it up.....
>> "music players, photo organizers, and video editors" aren't part of the system ... they are toys for consumers that sit on top of the system ... and in the great scheme of things, are clearly useless when it comes to longevity of data.
Wow! I guess that you must not realise that here in the real world there are people that:
a) Have all of their music held digitally and for most of that the digital copy is the only version they have;
b) Use digital cameras and have thousands of pictures that have great value, either personal value (because your kid only ever had one 1st birthday) or commercially (because they are a professional photographer);
c) Make films and television programmes using computers - and losing one of those during production is more than a minor oops!
That's exactly what the person you quoted means. You're talking about files: these are easy to back up using a variety of utilities that have been around for decades. The fact that they're music, video or photo files is entirely irrelevant to a discussion of backup.
Drop a PDP11 ...
... off the tailgate of the delivery lorry!
Luckily it was brand new so no data loss, and DEC's insurers dealt with the rest. Happy days.
It's not that nothing ever failed ... it's just that we took precautions with data. Which was my point.
And LSI-11 based kit was actually portable. For small values of portable.
Who is trolling? And where did I mention Linux?
"Luckily it was brand new so no data loss, and DEC's insurers dealt with the rest."
And it wound up back at DEC ... and in my Lab, where I didn't give a rat's ass about a little bent metal & chipped paint. New kit that had already been written off was considered an asset to everyone but the bean counters ...
@AC at 12:14
Then you misunderstood my post. What I was objecting to was the arrogant and offhand way that the applications, which are the very purpose of the systems that the quoted poster claims to administer, were dismissed as "useless toys". Contrary to the impression that many sysadmins seem to have (so wonderfully lampooned by the BOFH) very few business computers are actually bought simply for the pleasure of their sysadmins. They are bought to do income generating jobs (such as processing music, still imagery or video data).
What confuses me is why the poster saw fit to lay into the applications when they were only used to provide context in the very first paragraph of the article:
"So far, in my look at Linux compared to Mac and Windows, I've covered music players, photo organizers, and video editors. But all those apps – and all the documents they create – are lost if your hard drive crashes, your laptop takes a spill, or some other catastrophe strikes."
Backup? Um... No.
I'm not sure it's in any way accurate to call Ubuntu One usable for backup.
Ubuntu One synchronises multiple PCs, and synchronises your content into the cloud for access anywhere. But if you delete or overwrite something accidentally on your machine whilst it's connected to the internet, then a minute or so later it's toast on Ubuntu One as well. That's hardly a good backup solution...
It seems to me that you've fallen into what I like to call the "RAID trap" - people buying RAID arrays often mistake RAID 1 or RAID 5 for backup, when in fact it's merely providing redundancy against hardware failure.
The bottom line is that if your laptop died, Ubuntu One would keep your data for you. But it can't protect you against your own stupidity...
(And a clarification - I'm not railing against Ubuntu One, and use it myself for syncing machines - but I wouldn't use it for backup! But it's a good sync service and it's looking promising in terms of projects using it - for bookmark sync, data sync, configuration sync, etc., which is why I'm a paying customer of Ubuntu One.)
Dropbox is slightly better for backup - it holds previous versions for about 7 or 14 days (I don't recall which, exactly) and they offer an option for subscribers called "Packrat" which keeps previous versions of files indefinitely.
In the end, I picked SpiderOak for backup. They seem to be the most secure, and give me 100 GB for $100 a year, which is a pretty good price. It keeps multiple versions, and is cross-platform with Windows, Mac and Linux clients. I was surprised not to see it mentioned here.
For local or local-network backups, check out the rdiff-backup package. Rsync and diff, combining to give space-efficient backups - even to SSH hosts across the network. I've had no problems with it.
And for those who love rsync but hate commandline, there is grsync :)
deja-dup: where is it?
deja-dup is pretty much the standard for doing this sort of thing - why is it missing?
Nice. For "enthusiasts" (is there any other kind of Linux user?) there are a thousand command-line tools. The ancient "cpio" does it for me. Rsync is an archive/mirror tool rather than a backup tool: it won't keep different versions of the same file.
rsync & versions
While rsync by itself doesn't keep multiple versions of files, it makes doing so easy enough, and a number of tools built on it do it automatically. A few have already been mentioned; others include rsnapshot (which needs very little besides rsync itself and Perl, but requires editing configuration files manually) and BackupPC (which provides a web interface).
I'm also an rsync fan
You need a bit of scripting experience to make it easy to use, but if you can read man pages and write simple scripts, rsync will let you build exactly the kind of backup system that suits you best. The one thing that it doesn't do, as parent poster mentioned, is track different revisions within your backups, but that's easily done using other standard unix commands.
My current script creates a directory name with an embedded timestamp (yyyymmdd format suffices), finds the most recent timestamped directory in my backup directory (using the embedded timestamp rather than the directory's modification time, naturally), then does a cp -l to link all files from the old backup into the new backup directory, effectively copying the previous "snapshot". This only takes about a second for a large set of files. After that, an rsync from the directories I want backed up into this new directory only copies files that have changed, so it's as quick as can be. Thanks to the hard-linking, I can keep a good few complete "snapshots" in the backup directory without needing much extra space (just directory overheads). If space becomes an issue, I can manually prune old snapshots simply by deleting their directories.
With a bit of logic to make sure the initial cp -l is only done once (and rolling back the creation if it fails), I know that I can interrupt the script if I need to, whereupon the subsequent run will continue backing up files where the last run stopped, since that's the way the underlying rsync call works. To be honest, maybe point-and-click backup solutions also do the "right thing" like this, but since rsync does everything I need it to do, and it can be trusted, I've never seen the need for them.
One last piece of praise for rsync: it has built-in support for doing backups over ssh, so if I wanted to, it wouldn't be too difficult to do all my backups securely over the LAN or Internet. In fact, though, with sshfs, it's not even necessary to update the script for simple user account backups... just mount it remotely and run the local backup script with the correct parameters. Simples!
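The hard-link snapshot scheme described above can be sketched in a few lines of Python. This is a hedged illustration of the idea rather than the poster's actual shell script: unchanged files are hard-linked from the newest previous snapshot (the cp -l step) and new or modified files are copied for real (the rsync step), using size and mtime as the change test:

```python
import os
import shutil
from datetime import datetime

def snapshot(src, backup_root):
    """Create one timestamped snapshot of `src` under `backup_root`.

    Unchanged files are hard-linked from the most recent snapshot,
    so they cost no extra disk space; changed files are copied.
    """
    os.makedirs(backup_root, exist_ok=True)
    previous = sorted(
        d for d in os.listdir(backup_root)
        if os.path.isdir(os.path.join(backup_root, d))
    )
    prev = previous[-1] if previous else None
    stamp = datetime.now().strftime("%Y%m%d%H%M%S%f")
    new = os.path.join(backup_root, stamp)
    for dirpath, _dirs, files in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        dest_dir = os.path.normpath(os.path.join(new, rel))
        os.makedirs(dest_dir, exist_ok=True)
        for name in files:
            s = os.path.join(dirpath, name)
            d = os.path.join(dest_dir, name)
            st = os.stat(s)
            old = os.path.join(backup_root, prev, rel, name) if prev else None
            if old and os.path.isfile(old):
                ost = os.stat(old)
                if (ost.st_size, int(ost.st_mtime)) == (st.st_size, int(st.st_mtime)):
                    os.link(old, d)   # unchanged: share the inode
                    continue
            shutil.copy2(s, d)        # new or changed: full copy
    return new
```

Pruning old snapshots is then just deleting a directory, exactly as described: the hard links mean each file's data survives until the last snapshot referencing it is gone.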
Encrypted online backups
For those wanting to encrypt their backups and store them online, CrashPlan works very well.
It's cross-platform (Linux, Mac, Windows and Solaris), has various levels of encryption, and supports multiple backup and restore options (web, DVD/HD by mail, etc.). And you can send them your data on a hard drive (@neoc might require more than one). It supports snapshots, continuous backup and tons of other options. Plus you can choose to back up to CrashPlan's servers, another machine on your network, a friend in another city, or all of the above.
Best of all, it's under $5/month for unlimited backups and under $9 to back up all the machines in your house. And, if you are uber-paranoid, you can generate your own keys, up to 448 bits.
No, I don't work for them, just a happy customer - http://www.crashplan.com
I spent several months evaluating most of the top online backup offerings. There are really only a small number that deserve any attention, and CrashPlan was definitely at the top of the list.
Amazon's S3 (Simple Storage Service) stores your data across three geographically diverse locations, so it's no problem if one of the AWS data centres burns down. There is a Firefox extension to manage your files, although I use a command-line tool called 's3-bash', created by Google, to get and put files, run from a cron job.
Yes, it's not very user-friendly in comparison to Grsync or Back In Time, but its advantages are cost and redundancy: $0.14 per GB of storage and $0.10 per GB of data transfer.
I back up my web servers to it as well, which is ideal. So if a customer messes up their website or deletes all their IMAP email, I have all the historical data and can restore it for them. Used in conjunction with 'tar' to package files and folders first, it's a winner.
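The tar-then-upload half of this workflow can be sketched as below. The function name is my own, and the actual S3 upload (via s3-bash, s3cmd, boto3, or similar) is deliberately left as a comment, since it depends on credentials and tooling:

```python
import datetime
import os
import tarfile

def make_site_archive(site_dir, out_dir):
    """Package a site directory as a dated .tar.gz, ready to push to S3.

    The upload step itself is left to whatever S3 client is in use,
    e.g. the s3-bash `s3-put` call mentioned above, run from cron.
    """
    stamp = datetime.date.today().strftime("%Y%m%d")
    name = os.path.join(
        out_dir, f"{os.path.basename(site_dir)}-{stamp}.tar.gz"
    )
    with tarfile.open(name, "w:gz") as tar:
        tar.add(site_dir, arcname=os.path.basename(site_dir))
    # upload step (not shown): push `name` to the S3 bucket
    return name
```

Keeping the archive names dated means each cron run leaves an independent restore point in the bucket, which is what makes the historical-restore scenario possible.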
If it has to alliterate
I'd go for prudent, not paranoid.
That's not to say there isn't plenty of other stuff to be paranoid about :-).
FWIW my weapon of choice is Bacula...it comes in the night and sucks the essence out of your pc. Slightly geeky, but once you have it running you don't have to worry about anything. Especially the bootstrap file is really powerful in case things ever completely break down. And we all know they will at some point, don't we? Hmmm, maybe paranoid after all.
Penguin, because...what else?
Glossing over the details vs. providing tools...
Stuff like Time Machine is nice and fancy, but it's ultimately more version control than a proper backup. A proper backup includes multiple copies, with some of them residing off-site. Genuine redundancy ensures the survival of at least one of your backups and also makes it more likely that one of them will be usable.
My "small stuff" is replicated to every machine I have that has the space. Given the ever growing size of drives, that's pretty much any device in the house that doesn't have an Apple logo.
The "small stuff" even ends up on the Archos.
And let's not forget ...
While some purists will be spitting their lunch at the screen, there are commercial backup packages that support Linux. Retrospect does so very nicely, although they officially support only a limited number of distros.
homebrew solution for large archive data
Some time ago, I developed some utilities for creating RAID-like "shares" from a single file, with the property that any m of the n shares created (m <= n) could be recombined to recover the initial file. I was initially most interested in using it for security (since if you have fewer than m shares, you can't read the file), but latterly I've been more interested in applying it to create RAID-like data integrity.
Although I haven't yet built an app to do this, the basic utilities (and RAID-like techniques in general) could be useful for splitting up large archives of relatively static data (eg, video and sound files) into shares and distributing each of those shares to different locations, whether that's just on different disks or computers or different sites. In the event that the original file is lost, so long as m of the n shares still exist and can be accessed, you can recover the original file.
There are a few advantages to this kind of technique...
a) shares take up 1/m of the original data file size locally (and n/m globally) so each backup site doesn't need to bear the full weight of the backup costs (whether in terms of storage or bandwidth);
b) there's inherent data security in that breaking into one of the backup sites doesn't get you access to the data--you'd need to break into m of the sites;
c) there's inherent data integrity--you'd have to lose (n-m+1) of the shares before your backup was completely lost;
d) as opposed to regular RAID systems where all your disks are physically in the same location, you can distribute shares to distinct sites, improving physical security since a single controller failure or office fire doesn't destroy the entire backup.
I still find it fascinating to think about how the fundamental idea of RAID--creating redundant information that lets you recover the original data even if some number of the shares are lost--works. I think we're missing out on some tricks by having it only work at the level of the disk controller, and I think in future we're bound to see some more clever net-based distribution/backup models. You can find my (relatively modest) contribution to this idea (the Perl/C scripts/libraries for splitting and combining files) at http://sourceforge.net/projects/gnetraid/. I hope some of you may find it interesting and/or useful.
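As a toy illustration of the share idea, here is the simplest possible case: single XOR parity, so m = n - 1 and any one lost share can be rebuilt. This is my own sketch, far simpler than a general m-of-n erasure code like the gnetraid tools use, but it shows the same principle of redundant shares recovering the original:

```python
def split_raid5(data: bytes, n: int):
    """Split `data` into n-1 data shares plus one XOR parity share.

    Any n-1 of the n shares suffice to recover the original, i.e.
    m = n - 1 in the m-of-n terminology above.
    """
    m = n - 1
    chunk = -(-len(data) // m)                    # ceiling division
    padded = data.ljust(chunk * m, b"\x00")
    shares = [padded[i * chunk:(i + 1) * chunk] for i in range(m)]
    parity = bytearray(chunk)
    for share in shares:
        for i, byte in enumerate(share):
            parity[i] ^= byte
    return shares + [bytes(parity)], len(data)

def recover(shares, orig_len, missing):
    """Rebuild share `missing` by XOR-ing the survivors, then reassemble."""
    chunk = len(next(s for s in shares if s is not None))
    rebuilt = bytearray(chunk)
    for i, share in enumerate(shares):
        if i != missing:
            for j, byte in enumerate(share):
                rebuilt[j] ^= byte
    whole = list(shares)
    whole[missing] = bytes(rebuilt)
    return b"".join(whole[:-1])[:orig_len]
```

Note that single parity also gives none of the security property described in (b): each data share here is plaintext. Real schemes tolerating the loss of n-m shares for arbitrary m need proper erasure coding.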
I use Dropbox on a few different PCs (one Linux, two Windows), as well as my Android phone. It works flawlessly for me... but I could always do with some more space. Cue shameless referral link plug.
dump 0f /path /dev/disk
Am I the only person that uses dump/restore instead of tar or cpio?
I feel a bit dirty for reading this
Seriously, reading the article and the comments I wonder how no-one suggested emailing your files to your Gmail account as a form of backup.
There are online backup solutions, but dropbox ain't one of them. It's primarily a sync/share type of thing for people who can't use ftp. Of course you CAN use it to "backup" data, the same way as you can email said data to your Gmail or Yahoo! Mail account.
Same goes for a lot of things mentioned in both the article and the comments. Basically, if you have routine write access to files on your "backup", then it's not a backup but a sharing/syncing thing, as pointed out by jake and Philip.
Me, personally ...
I find tar suits me just fine, but BackInTime seems to work too.
Prefer Dropbox over Ubuntu One but things may change as UO progresses.
Dropbox IS good, though. Very handy, and its restore/undelete feature is a real lifesaver. It's also nice to have my Linux laptops and OS X machines able to access common files and sync with each other.
Wi$$$h I'd invented it.
If anyone has heard of DVD-Rs
Once your files get to that 4 gig limit..... burn to DVD
Let's face it, once you've ripped and saved the best of Black Sabbath, they aren't going to issue new versions of those albums
Same goes for photos, burn them to disc and label the discs
And for those really really unlosable files (ie your pr0n collection) buy another hdd