
Checking Very Large Time Machine Volumes
Checking very large disk volumes with Disk Utility, especially Time Machine backup disks, can be painfully slow, taking many hours to complete, if it completes at all. This Terminal script vastly speeds up checking big volumes.

The tool behind Disk Utility's volume checking is fsck_hfs, which can also be run from the command line. The key to fast checking is giving fsck_hfs a sufficiently large in-memory cache for the volume structures, which Disk Utility apparently doesn't do. This example uses a 2.2 GB cache:

sudo fsck_hfs -f -c 2200m /dev/disk2

For a full 1TB Time Machine backup disk with many millions of files, this completes in about 10 minutes. A nice side effect is that this also puts less stress on the disk, as most reads are served from the cache.
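The 2200m figure is just one data point; the right value depends on how much RAM the machine has. As a rough sketch (the quarter-of-RAM ratio is my own rule of thumb, not from the original hint), you could size the cache from the physical memory the machine reports:

```shell
# Sketch: size the fsck_hfs cache at roughly a quarter of physical RAM.
# The hard-coded 8 GB below is a stand-in; on a real Mac you would use:
#   ram_bytes=$(sysctl -n hw.memsize)
ram_bytes=8589934592
cache_mb=$(( ram_bytes / 4 / 1024 / 1024 ))
echo "sudo fsck_hfs -f -c ${cache_mb}m /dev/disk2"
```

On an 8 GB machine this yields a 2048m cache, in the same ballpark as the 2200m used above.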

Adding the little shell script below to your command line tools can make your life a lot easier. It takes the volume name as the single argument. The drive is unmounted during the check and remounted when finished.
#!/bin/bash
# Run a fast volume check on a large Time Machine backup disk.
# Usage: pass the volume name (without the /Volumes/ prefix) as the argument.
VOLUME="/Volumes/$1"
echo "Determining disk device of $VOLUME"
DISK=$(diskutil info "$VOLUME" | sed -n '/Device Node:/s/.*\(\/dev\/disk[0-9s]*\).*/\1/p')
if [ -z "$DISK" ]; then
  echo "Unable to determine device name!"
  exit 1
fi
echo "Performing filesystem check on $DISK"
diskutil unmountDisk "$DISK"
sudo fsck_hfs -f -c 2200m "$DISK"
diskutil mountDisk "$DISK"
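To try it, save the script to a file, make it executable, and run it with the backup volume's name as the argument. A quick sketch of those steps with stand-in names (the /tmp/fastfsck path and the echo-only body are placeholders, not the real script above):

```shell
# Save a script, make it executable, and run it with the volume name
# as the single argument. The body here is a stand-in that just echoes.
cat > /tmp/fastfsck <<'EOF'
#!/bin/bash
echo "would check /Volumes/$1"
EOF
chmod 755 /tmp/fastfsck
/tmp/fastfsck MyBackupDisk
```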
I hope this will be useful for you.

[crarko adds: I haven't tried this, but the script looks sound.]

Checking Very Large Time Machine Volumes | 24 comments
The following comments are owned by whoever posted them. This site is not responsible for what they say.
Checking Very Large Time Machine Volumes
Authored by: systemsboy on Sep 06, '11 07:47:25AM

I tried this and it seems to work well. Nicely done!

-systemsboy



Authored by: david-bo on Sep 06, '11 08:24:02AM

I need a similar hint for Disk Warrior. The problem with DW is that it has a Carbon GUI, which limits it to addressing around 2.6 GB of memory. Even moderately large Time Machine volumes exceed this (that is, the file allocation table, etc., is too big to be kept in memory).

DW is split into at least two parts, one GUI and one command line utility. The GUI seems to create a file/send commands to the CLI depending on the options selected by the user in the GUI. The command line utility is 64 bit and does not have the problem with lack of addressable memory.

Does anyone know how to use DW from the CLI? That is, what options are available and how are they used?

---
link



Authored by: drudus on Sep 06, '11 09:28:12AM

Are you sure Disk Warrior has a command line interface? Does it have a manual ('man diskwarrior')?

I can't see any mention of it on the Alsoft site or in the manual. Perhaps a message to support would get the answer; if you find out, post it here, since it would be great to get some extra speed from such a great disk tool.

I suspect any binaries will be inside the Disk Warrior application bundle, so perhaps you need to call one directly with a -h or --help option to see any arguments it supports.



Authored by: david-bo on Sep 06, '11 12:36:28PM

Yes, I dug into the bundle, and you can launch some of the binaries in there, but they quit immediately because they need input. This is completely undocumented; I ran a quick strings on the different binaries but couldn't find anything useful. However, a combination of strings and decompilation would probably reveal some interesting stuff.

No man page or help switch.

Try it yourself.

---
link



Authored by: drudus on Sep 06, '11 12:53:38PM

I tried with Disk Warrior 4; the DiskWarrior binary just launched the GUI app from Terminal. There seem to be some helper daemons and server binaries too, but I don't know how they could be run correctly without the app setting them up.

Contact the developer; it would be easiest if they checked the disk size and available RAM and used appropriate values in the GUI.

The same RAM-usage argument may not even be relevant, depending on how the scanner/repair processes run.



Authored by: david-bo on Sep 07, '11 12:39:26AM

I have talked to Alsoft about this and it is not supported - which I think sucks. They really need to update to a more modern GUI and/or support memory intensive operations at least from the command line.

---
link



Authored by: steresi on Sep 06, '11 09:40:34AM

If my machine freezes up, I boot into single user mode and run "fsck -fy" on it to check for and repair any disk damage.

Should I also run fsck_hfs? Or are fsck and fsck_hfs practically the same thing?



Authored by: fracai on Sep 07, '11 06:41:41AM

The 'fsck' command is just a proxy to a format specific command such as 'fsck_hfs'. There's no need to run both.

---
i am jack's amusing sig file



Authored by: earthsaver on Sep 06, '11 10:29:04AM

Can someone please explain what I should do with this shell script and how to run it on a particular volume? Should I paste it into an Automator workflow that has specified or asked-for Finder items (the desired volume)?

---
- Ben Rosenthal
MacBook Pro 2.8GHz - Lion
iPad 3G 32GB



Authored by: systemsboy on Sep 06, '11 11:15:44AM

To make this into a script, copy and paste the script text into a plain text file. Save the file. Make the file executable:
chmod 755 /path/to/script

The easiest way to run the script is to simply drop the path into the Terminal and supply the Time Machine volume's name:
/path/to/script [VolumeName]

This will run the script on the specified volume. The script checks large Time Machine volumes using a larger-than-normal buffer to work around the fact that Time Machine disks have such a huge store of files.

-systemsboy



Authored by: earthsaver on Sep 06, '11 12:02:38PM

Thanks! Glad to learn it's that straightforward. It follows, then, that I can put a copy of the script in my /usr/bin folder and run it without indicating its full path. I called the script fastfsck. It works!

---
- Ben Rosenthal
MacBook Pro 2.8GHz - Lion
iPad 3G 32GB



Authored by: steresi on Sep 06, '11 10:35:24AM

Is there a practical difference between "fsck" and "fsck_hfs"? I use the former to check for disk corruption whenever my machine locks up and I have to restart.



Authored by: fracai on Sep 07, '11 06:43:47AM

The 'fsck' command is just a proxy to a format specific command such as 'fsck_hfs'. There's no need to run both.

---
i am jack's amusing sig file



Authored by: auntchilada on Sep 06, '11 05:41:12PM

holy crizappy! this is ten kinds of awesome! thanx!



Authored by: fmonsour on Sep 07, '11 07:01:02AM

I tried this, including the directions provided by systemsboy, and get an "Unable to determine device name!" error. My Time Machine drive, named "Time Machine", is one of three partitions on an external drive.

Any ideas on what I'm doing wrong? I'll admit to being an OSX shell newbie (or whatever's one level down from that), and appreciate any guidance provided.



Authored by: pappy on Sep 07, '11 06:57:08PM

I must be missing something. I saved the script as fastfsck, and when I run it I get this error:

$ fastfsck /Volumes/Drobo2
Determining disk device of /Volumes//Volumes/Drobo2
Unable to determine device name!

Thanks



Authored by: fracai on Sep 09, '11 07:08:55AM

The script already includes the "/Volumes/" part. You should either remove that from the script, or not include that part of the path when invoking the script.

---
i am jack's amusing sig file
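One way to make the script tolerant of both invocation styles, sketched here as an illustration rather than a change to the original hint, is to strip any leading /Volumes/ from the argument before building the path:

```shell
# Accept either "Drobo2" or "/Volumes/Drobo2" as the argument by
# stripping an optional /Volumes/ prefix (stand-in values for illustration).
arg="/Volumes/Drobo2"
name="${arg#/Volumes/}"
echo "/Volumes/$name"
```

If the argument has no /Volumes/ prefix, the substitution leaves it unchanged, so both forms produce the same path.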



Authored by: llee on Sep 08, '11 09:41:36AM

I think the problems people are having are related to spaces in the volume names of the disks they want to check. I tried various ways of quoting the target volume's name to escape the spaces when passing it to an executable made from your script, but I couldn't find one that worked. I finally had to just take the spaces out of the volume name. That's a little undesirable for a Time Machine backup, because the spaces are put there by default when a Time Machine backup volume is configured, I think. So how can I escape the spaces in my big volume's name so that your script can work? Thanks.



Authored by: PatrickS on Sep 12, '11 12:37:09AM

Enclosing the first and third occurrences of $VOLUME in " " (double quotes) should do the trick.



Authored by: llee on Sep 12, '11 06:25:48AM

Thanks. Because I could only find 2 occurrences of "$VOLUME" in the script as submitted, I'd have to say I'm somewhat confused. Would you please elaborate? Thanks again.

Edited on Sep 12, '11 06:28:39AM by llee



Authored by: davidmorr on May 27, '13 05:13:26AM

The first export line needs to be:

export VOLUME="/Volumes/$1"

Edited on May 27, '13 05:14:14AM by davidmorr
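To see why those quotes matter when the volume name contains a space, here is a small demonstration (using "Time Machine" as a stand-in name): without quotes, the expansion is split on the space into two words when handed to a command such as diskutil, so the device lookup fails.

```shell
set -- "Time Machine"              # simulate the script's $1 argument
VOLUME="/Volumes/$1"
# Unquoted, the expansion splits into two arguments;
# quoted, it is passed through as a single argument.
printf '%s\n' $VOLUME | wc -l      # 2 lines (split)
printf '%s\n' "$VOLUME" | wc -l    # 1 line (intact)
```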



Authored by: fmlogue on Sep 08, '11 01:56:43PM

Thanks for this hint. However, a comment and a request for further help.

First, since my Time Machine is on a Partition, I found that I had to use <sudo fsck_hfs -f -c 2200m /dev/disk2s2> (note I added s2 to disk2)

Second, the result was: "The volume Time Machine was found corrupt and needs to be repaired." I believe this is related to the first message put up: "Verifying volume when it is mounted with write access. Journal need to be replayed but volume is read-only."

Is the volume in need of repair or is that only an artifact of it being read-only?
And how do I allow fsck_hfs write access?

TIA



Authored by: sjk on Sep 26, '11 01:12:08AM
"Is the volume in need of repair or is that only an artifact of it being read-only?"
Likely it needs repair. Mine checks okay when read-only.

"And how do I allow fsck_hfs write access?"
Unmount the volume before checking it.

Authored by: pappy on Sep 09, '11 07:49:12PM

Thanks


