
One step User folder backup
This is just a couple of tips from the Mac OS X Solutions Guidebook, combined in a way that some friends and I have found rather useful for quickly backing up our home folders (requires the Developer Tools installation).


1. If necessary, make the ~/Library/init/tcsh directory (the -p flag creates the intermediate init directory as needed):
mkdir -p ~/Library/init/tcsh
Then, change to that directory:
cd ~/Library/init/tcsh
2. Using Pico, open "aliases.mine"
pico aliases.mine
3. Type in (with quotation marks)
alias backup "sudo ditto -v -rsrcFork ~ blah"
... where "blah" is the location you would like to store the backup. On my computer, it's:
alias backup "sudo ditto -v -rsrcFork ~ '/Volumes/Storage Space/Backup'"
(The inner single quotes keep the space in the volume name from splitting the path into two arguments.)
4. Ctrl-x to exit Pico, save the buffer when prompted.

5. Open a new Terminal window and type "backup" (without quotes) at the prompt. It will copy your user folder, invisible files and all, to the specified place. It will overwrite existing files at the destination, so exercise appropriate caution.

Now, anytime you want to make a backup of your user folder, just open up a terminal window and type backup.
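For anyone who wants a little more safety, here's a minimal sketch of the same ditto invocation as a standalone script, with a guard so a missing backup volume doesn't silently back up to the wrong place. The destination path is just this hint's example; substitute your own.

```shell
#!/bin/sh
# Sketch of the "backup" alias as a standalone script.
# DEST is an example path -- edit it to match your backup volume.
DEST="/Volumes/Storage Space/Backup"
if [ -d "$DEST" ]; then
    # Volume is mounted: copy the home folder, resource forks and all.
    sudo ditto -v -rsrcFork "$HOME" "$DEST"
    result="backed up to $DEST"
else
    # Volume is missing: refuse to run rather than copy somewhere wrong.
    result="backup volume not mounted: $DEST"
fi
echo "$result"
```

Save it somewhere like ~/bin/backup.sh, chmod +x it, and the alias can point at the script instead.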

Authored by: Anonymous on Dec 18, '01 11:45:30PM

Using /Volumes/Storage Space/Backup unquoted is similar to the "iTunes hosed my drive" bug, since you're overwriting files there.

Make sure you use /Volumes/Storage\ Space/Backup (escaping the space with a backslash) or put the path in quotes to have the desired effect.

Unless ditto works differently, of course...I've never used it before. In that case, just ignore me. :-)

rsync is another option
Authored by: randydarden on Dec 19, '01 12:18:43AM
I can't remember where I found out about rsync - it may very well have been here. Anyway, it will let you back up a collection of files but won't bother copying files that haven't changed since your last backup. It's also easy to do two-way backups (aka "synchronize"). However, I don't know how it handles Mac files, especially those with resource forks. That may not be an issue, but I've been using it in OS X to sync my work to a Windows drive at the office without any problems.

Check it out online, or type man rsync in your Terminal.


Only half the solution
Authored by: Chas on Dec 19, '01 01:43:11AM

Have you actually tried to restore files from this backup? A backup is useless unless the restore works correctly. I think you'd be better off with Unix tar; archiving is what it was designed for.
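In the spirit of "a backup isn't a backup until you've restored from it", here is a quick round-trip sketch with tar (throwaway temp paths, plain files only - resource forks are a separate question):

```shell
#!/bin/sh
# Round-trip test of a tar backup: archive a directory, restore it
# somewhere else, and check the restored file survived intact.
SRC=$(mktemp -d)
OUT=$(mktemp -d)
mkdir "$SRC/docs"
echo "important data" > "$SRC/docs/note.txt"
# -C makes the archived paths relative, so the archive restores anywhere.
tar czf "$OUT/backup.tar.gz" -C "$SRC" docs
tar xzf "$OUT/backup.tar.gz" -C "$OUT"
restored=$(cat "$OUT/docs/note.txt")
echo "$restored"
rm -rf "$SRC" "$OUT"
```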

Only half the solution
Authored by: patashnik on Dec 19, '01 02:42:42AM

tar wreaks havoc on your precious resource forks, which you really don't want. My suggestion is to look for hfspax (use Google), a patched version of pax which knows about resource forks and Finder information. I use it to back up my hard disk to my iPod, for example (my backup script is available for anyone who's interested).

Only half the solution
Authored by: el bid on Dec 19, '01 03:05:54AM
tar wreaks havoc on your precious resource forks, which you really don't want

If you're saying you don't really want resource forks, I'd agree. But I don't think this is what you mean.

tar doesn't wreak havoc with HFS+ strangenesses, it just ignores them, which of course creates a problem with software that relies on the multistream cuteness of HFS+ and its type/owner eccentricities.

Thanks for alerting us to hfspax. I just felt the need to step in and defend tar, which in the GNU version is IMHO the boss when it comes to cross-platform backup.

el bid

Authored by: thatch on Dec 19, '01 02:14:59AM

I couldn't get this to work because ditto keeps on crashing with this:

Exception: EXC_BAD_ACCESS (0x0001)
Codes: KERN_INVALID_ADDRESS (0x0001) at 0xbff7fe70

Terminal reports a segmentation fault.

I have used ditto to do this manually before without any problem. But now when I try the full command in Terminal, it also crashes, just as with the alias backup method described by this tip.

% sudo ditto -v -rsrcFork 'backup source' 'backup destination'

I have recently installed the latest dev tools, FWIW.

I also read recently that ditto doesn't like a locked file and will skip everything thereafter and go on to the next directory. Some have said that it can actually just puke on a locked file. How do I find out if I have acquired a locked file? Is there a way with locate or ls to show locked files?

Anybody have an idea of what has happened to ditto for me? TIA

As Root?
Authored by: el bid on Dec 19, '01 03:09:01AM

Does it strike anyone else as completely daft that you need to run ditto as root? Just to back up your own home directory?

But you do -- see "man ditto" (where at least it's admitted to be a BUG).

el bid

As Root?
Authored by: robh on Dec 19, '01 11:53:18AM

I'm hoping that the "as root" info is now out of date. I have run some ditto jobs as non-root, but I tend to use it to back up privileged files, so I use sudo by default.

Ditto not optimal for backup
Authored by: el bid on Dec 19, '01 03:22:51AM

I'm really uncomfortable about ditto -- it seems to be an under-maintained utility with the kind of bugs you just don't need in a backup tool. I'm also uncomfortable about the whole HFS+ thing, which doesn't sit at all well in the UNIX landscape (which is where we are these days). But I'm aware that this is an ongoing debate among the Mac community, and I'm not trying to fan any flames here.

Here's what I currently think is the best solution: use the GUI Disk Copy tool for backup. Create a suitably sized empty disk image and copy your HFS+ bouillabaisse into it. You now have a bog-standard file which you can back up to tape or whatever in the conventional way. It will probably compress nicely.

The interactivity required by this technique is a downer, but presumably a suitable AppleScript could be constructed.

Anyone got a better idea?

el bid

Disk Copy not optimal for scripting
Authored by: sjonke on Dec 19, '01 03:13:15PM

Scripting Disk Copy would be a great idea. Only one problem - Disk Copy isn't scriptable. Neither are many other standard OS X applications. If Apple is going to trumpet scripting (as they should) then they need to get their butts in gear and make more standard OS X applications scriptable.

Disk Copy not optimal for scripting
Authored by: el bid on Dec 19, '01 03:49:27PM

But see man hdiutil...

Looks pretty good if it does what it says.
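For what it's worth, a non-interactive session along those lines might look like this (a sketch only; the image size, volume name, and paths are made up, but the hdiutil verbs are from its man page):

```shell
% hdiutil create -size 100m -fs HFS+ -volname Backup backup.dmg
% hdiutil attach backup.dmg
% ditto -rsrcFork ~/Documents /Volumes/Backup/Documents
% hdiutil detach /Volumes/Backup
```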

el bid

hfspax and anacron
Authored by: vasi on Dec 19, '01 08:31:12AM
hfspax is a nice utility to archive HFS+ directories without losing resource forks or file attributes. Anacron is available via Fink; it basically runs programs every X days like cron does, except it doesn't depend on your computer always being on at the time the program is set to run.
This will backup your home folder:
cd /Users
hfspax -wx cpio -f /backup/destination myUserName
I like it more than ditto since it puts everything in one file, so I can gzip it afterwards. It also lets you filter files. Since I know that all the MP3s in my Sound folder are already burned to CDs, I don't have to back them up:
hfspax -wx cpio -s '|.*/Sound/.*.mp3||' -f /backup/destination.hfspax ~
In case you're wondering, that's just a regex that rewrites anything matching "*/Sound/*.mp3" to an empty string, which tells pax not to archive it. I can restore by expanding the gzip archive, then running
hfspax -r -pe -f vasi.hfspax
which preserves resource forks, permissions, everything. Another thing I find useful to do when backing up is to run
find . -type f -print0 > /some/file
find . -type d -print0 > /some/other/file
This gives you a nice list of all the files and directories in your home folder. You can use it to restore the entire directory structure (xargs -0 mkdir -p < /some/other/file), so even the things you filtered out of the hfspax archive still have a place to go when you restore them. You can also use it if you lose just a couple of files and only want to restore those: search through the listing for the files you lost, so you know the exact names, and then use hfspax to extract just them.
I have all this in a script that runs automatically every day via anacron (does that make it a "zero step backup"? :-). One thing to be careful of when you back up is not to overwrite your backup location directly, like the backup system in this article does; if something goes wrong in the middle of the backup, you could be SOL. Instead, move the old backup aside, create the new backup, and then, if and only if that was successful, delete the old one.
Hope this helps someone!
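That move-then-delete rotation can be sketched in a few lines of sh (make_backup here is a placeholder standing in for the real hfspax/gzip step, and the destination is a temp directory):

```shell
#!/bin/sh
# Safe rotation: keep the previous backup until the new one succeeds.
DEST=$(mktemp -d)               # stand-in for the real backup volume
make_backup() {                 # placeholder for the real archive step
    echo "archive contents" > "$1"
}
# 1. Move the old backup aside instead of overwriting it.
if [ -e "$DEST/backup" ]; then
    mv "$DEST/backup" "$DEST/backup.old"
fi
# 2. Create the new backup; 3. delete the old one only on success.
if make_backup "$DEST/backup"; then
    rm -rf "$DEST/backup.old"
else
    # Backup failed: put the previous one back so we still have a copy.
    [ -e "$DEST/backup.old" ] && mv "$DEST/backup.old" "$DEST/backup"
fi
status=$(cat "$DEST/backup")
echo "$status"
rm -rf "$DEST"
```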

Is anacron what you want here?
Authored by: el bid on Dec 19, '01 04:18:16PM
it basically runs programs every X days like cron does, except it doesn't depend on your computer always being on at the time the program is set to run.

Er, of course this doesn't mean that anacron will run a backup when the computer isn't on... :-)

I don't ever switch computers off, so I'm not entirely clear about the implications, but if I'm guessing right, the difference with anacron is that when you switch on, it will say "Heavens, is that the time? I should have run this backup three hours ago", and start backing up right away.

But this probably isn't what you want, is it? You've probably set your backup to run at 3:00 am each night, when there's a good chance you won't be using the machine -- you don't want backups running while you're actually creating and changing files in your home directory.

My feeling is that cron does a better job here. If it's skipped a backup because the machine was switched off, no big deal. It'll catch up tonight. And it won't suddenly try backing up while you're creating spreadsheets just because it has a pang of guilt.
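For comparison, the cron version is a single crontab line (edit with crontab -e; the 3:00 am time and the script path are just examples):

```shell
# min hour day month weekday  command
0     3    *   *     *        /Users/me/bin/backup.sh
```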

el bid

Yup, it is
Authored by: vasi on Dec 20, '01 01:33:42AM

Anacron actually runs as a cron job; I set it to run every hour. So yeah, it's fairly likely that it will run while I'm using the computer, but since I "nice" the backup command I don't notice a slowdown, just some disk churning.

I don't leave my computer on all the time, partly to save power and partly cuz the humming bugs me when I sleep. I'd much prefer that anacron really run the command every day, rather than have cron want to run daily but only actually run weekly. I've had too many unfortunate accidents to risk my data; a bit of disk churning isn't bad at all.


Yup, it is
Authored by: el bid on Dec 20, '01 03:16:21AM
but since I "nice" the backup command I don't notice a slowdown

I wasn't raising the issue of resource contention. The point I was making is that it's not a good idea in general to back up files that might be in use. You can't necessarily rely on the integrity of the backup.

el bid

hfspax and anacron
Authored by: loren_ryter on Dec 20, '01 12:41:19PM

Boy, for those of us who are not Unix wonks but aren't totally uncomfortable with the command line, a simple step-by-step or pre-built script to back up folder A to location B - one that can be run with crontab and retains all you'd expect (resource forks, invisible files, locked files, etc.) - would be so much appreciated...

I hope someone can put this together as a new tip!

hfspax and anacron
Authored by: thatch on Dec 20, '01 05:50:12PM

Almost exactly what you want is available from Mike Bombich's OS X tips site. Check it out; it's listed in the links section here.

aliases not showing up
Authored by: sjonke on Dec 19, '01 11:15:52AM

I tried this, but even though I created the ~/Library/init/tcsh/ directory and put an aliases.mine file in it as described, I do not get the alias. Anyone have any idea why it's not running the aliases.mine file? Note that I recently did a clean install from scratch and updated my way up to 10.1.1. Most recently I installed the December 2001 Developer Tools (I did not previously have the Developer Tools installed).

Never mind
Authored by: sjonke on Dec 19, '01 03:16:43PM

Now it is working. Weird.

Perl wrapper
Authored by: robh on Dec 19, '01 11:44:32AM
I've written a Perl wrapper for ditto that I saved as a script called 'backup'. I use this script to maintain backups of various directories (e.g. /Users, /private) on DVD-RAM. The script is here:

I'd run it like this:
cd /Volumes/DVD
/Users/rob/bin/backup /Users/rob rob
Note that this script will REMOVE files and folders from the backup device that no longer exist in the source folder. Use at your own risk. You may need to run it with sudo to back up privileged directories.
