Tuesday, April 28, 2009

File Managers

I'm planning to update to the latest just-released version of Ubuntu. Since there is a small but significant chance it will not be an improvement, a sensible thing to do is to back up my personal files to make it easier to go back. The easiest brute-force method is to hook up an external drive and just copy everything over. Time-consuming, but my part of it is relatively short--hook up the drive, make sure there is room, and copy. Come back in a couple of hours. In theory.

My home folder has more than 50,000 files and is over 50 GB. Some of this is waste--I've got a copy of an old home folder, which nearly doubles the size. With this many files, it is nearly inevitable that there will be at least one that the computer can't copy for one reason or another. I have no problem with this. My problem is that the computer will invariably use that as an excuse to stop doing anything and ask me how to handle the error.

Linux is better than Windows here--I've never figured out exactly what "yes to all" or "no to all" means in Windows, since it would often ask the same question over and over even when selecting one of the "to all" answers. I suspect it was something to do with the directory a file was in, but I never bothered to investigate much. Linux at least has the same definition of "to all" as I do.

The strictly linear flow of all the file managers I've used bugs me. When one runs into a problem with a particular file it should of course throw up an error, but I don't understand why it can't continue on with the files it can deal with, and handle the error later, once I've given an answer.

What I should do is use proper backup software--I'll likely do that once I decide whether the new version is acceptable. I'm already looking into which ones to use, but from the descriptions it is hard to tell whether any of them meets my needs. I need the ability to do either incremental or differential backups (only what has changed since the last backup, or since the last full backup, respectively) and the ability to back up to a USB drive. Backing up my wife's Windows box would be nice but not essential, and backing up TO it automatically would also be nice but not essential. I don't need enterprise features, I would like a GUI, and I want some notification if there's a problem. (Apparently Sbackup fails this last step--if a backup fails, it doesn't notify you.)
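For what it's worth, rsync can fake a serviceable incremental backup to a USB drive from the command line, though with no GUI and no failure notification it only partly fits the wishlist above. A minimal sketch, assuming the drive is mounted at /media/usbdrive (both paths here are placeholders to adjust):

```shell
#!/bin/sh
# Snapshot-style incremental backup with rsync.
# SRC and DEST are hypothetical paths -- adjust to your own setup.
SRC="$HOME/"
DEST="/media/usbdrive/backups"
STAMP=$(date +%Y-%m-%d)

mkdir -p "$DEST"

if [ -d "$DEST/latest" ]; then
    # Hard-link unchanged files against the previous snapshot, so each
    # run only stores what actually changed since last time.
    rsync -a --delete --link-dest="$DEST/latest" "$SRC" "$DEST/$STAMP"
else
    # First run: plain full copy.
    rsync -a --delete "$SRC" "$DEST/$STAMP"
fi

# Point "latest" at the snapshot we just made.
rm -f "$DEST/latest"
ln -s "$DEST/$STAMP" "$DEST/latest"
```

Notably, rsync also behaves the way I wish file managers did: it reports files it can't read, keeps going, and only signals the failures in its exit status at the end.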


  1. NFS mount the remote disk and do something like:

    $ tar -cvf /remotefilesystem/filez.tar ~

    There's a switch to force the file into the archive even under error (or continue with other files, I can't remember) but my tar-fu is old and weak. But man will tell you.

    BTW, we got a Western Digital 1 TB "My Book" network drive. Plug the Ethernet in, and it's up and running. Ubuntu works more cleanly with it than Vista does.

  2. I know there are ways around this stupid behavior, I just don't think I should need them. I understand the real world is different though....

  3. i use a fancied-up twist on the "tar" method, a program called "dar". it's in synaptic, of course.

    advantages over the old "tar": automatic, per-file, configurable compression; backs up extended attributes (very handy if you're backing up a Fedora/SELinux disk); handles incremental and differential backups sensibly; faster seeking through multi-gigabyte backup files; splits archives into several files if they reach a configurable size, automatically, and handles such files seamlessly; better testing of backup integrity. oh, and it handles errors reasonably, just as it should.

    advantages it conspicuously fails to have: a GUI; a menu-based UI; in fact, any interface even slightly friendlier than "tar" had back in the 1970s.

    it could probably back up windows, too, at least if you can samba-mount the drives to be backed up. i haven't checked to see if there's a windows-native version or not.
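The full/differential workflow the commenter describes can be sketched roughly as below: dar's `-c` creates an archive, `-R` sets the root directory, `-z` enables compression, `-s` sets the slice size, and `-A` points a differential run at the previous archive's catalogue. The paths and archive names are placeholders, not anything from the post.

```shell
# Full backup of the home folder, compressed, split into 1 GB slices.
dar -c /media/usbdrive/full_backup -R "$HOME" -z -s 1G

# Later: back up only what changed since the full run, using the full
# archive's catalogue as the reference point (-A).
dar -c /media/usbdrive/diff_backup -R "$HOME" -z -A /media/usbdrive/full_backup
```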