Experiment with GIMP's new single window mode UNIX
Are you curious, like me, about the new Single Window Mode (most excellent; see this article at Ars for more details) available in the newest unstable 2.7.x GIMP releases? Well, sadly, the final and stable GIMP 2.8 release won't come out before the end of this year, and there are still no experimental 2.7.x binary releases available for Mac OS X (via X11). One could always try to compile everything from source, but that might be quite complicated and time-consuming.

So, let's look at another, definitely easier way of running GIMP 2.7.x on Mac OS X: not (semi) natively through X11, but through virtualization. First of all, we need a virtual machine with the latest Ubuntu 9.10 (Karmic), or even 10.04 (Lucid, in Alpha at this time) installed: you can create a 32- or 64-bit Ubuntu VM in VMware Fusion, Parallels Desktop or Sun VirtualBox (which is free). On the Mac OS X integration front, Fusion currently offers the best tools for Linux VMs, including Unity, an integrated Linux/OS X window mode that really works.

After having set up the Ubuntu VM (also with the latest updates, etc.), we can begin with the essence of this hint.
Clean .DS_Store, .Trash, and ._resources files prior to copy UNIX
Frequently we need to clean a directory before zipping it or copying it to an external USB drive to be used by Windows or Linux users.

Apple Finder has the custom of populating directories with those unavoidable .DS_Store files, volumes with .Trashes, and some files (especially pictures) with ._resources. The following interactive script will safely remove these files prior to copying.
#!/bin/bash
# bash script to clean (delete) Finder .DS_Store, .Trashes and ._* resource files
# Usage: pass the directory to clean as the single argument
# juanfc 2010-03-06

if [ $# -ne 1 ]; then
  printf 'ERROR:  use\n\t%s dirtoclean\n' "$(basename "$0")"
  exit 1
fi

res=`find "$1" \( -name ".DS_Store" -or -name ".Trashes" -or -name "._*" \) -print`

if [[ -z $res ]]; then
  echo "nothing to delete"
  exit 0
fi

echo "Going to delete:"
echo "$res"
read -p "Ok (yYsS1)? " ok

case $ok in
  [yYsS1] )
    find "$1" \( -name ".DS_Store" -or -name ".Trashes" -or -name "._*" \) -exec rm -rf "{}" \; -prune ;;
  * )
    echo "aborted." ;;
esac

exit 0
[robg adds: I haven't tested this one. This older hint contained a one-liner to clean up just the .DS_Store files, and a commenter there notes the existence (beginning in 10.5) of the dot_clean command to simplify the task.]
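Incidentally, the dot_clean route mentioned above can be this short (a sketch; as I understand it, -m tells dot_clean to delete each ._* file after merging its contents back into the parent file):

```shell
# Merge AppleDouble ._* files into their parent files and delete them
# (dot_clean ships with Mac OS X 10.5 and later; the path is an example)
dot_clean -m /path/to/dirtoclean
```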
Copy a file to all subfolders of a folder with one command UNIX
Over the weekend, I was installing a demo of some web-based help desk software, and this particular package used encrypted PHP files. With my hosting company, I have to place a php.ini file in every directory that contains encrypted files, or the site won't work. This particular package had hundreds of directories, buried in folders and subfolders, and I was dreading the task of copying this one file into each of them.

A tip from my friend James pointed me to the solution, in the form of Smiling Dragon's reply in this thread over on the Unix/Linux forums. Assuming the file exists in the top-level directory where you run the command, this command does the trick:
find . -type d -exec cp php.ini {}/ \;
This worked perfectly, and incredibly quickly -- at first I thought it had failed, but a quick sample of a deeply-buried folder showed that the file was now in each and every sub-directory on the site. Note that if something goes wrong when you try this, you may create quite a mess to clean up; I tried it first on my local machine, just to be sure it worked. Only then did I run the command on my web server.
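If you'd rather rehearse it somewhere safe first, here's a self-contained sketch in a scratch directory (the directory names are made up; the cp complaint about copying php.ini onto itself in "." is harmless and is suppressed here):

```shell
# Demo in a scratch directory: put php.ini at the tree's top level,
# then copy it into every directory below it
cd "$(mktemp -d)"
mkdir -p site/admin/reports site/css
echo "memory_limit = 64M" > site/php.ini
cd site
find . -type d -exec cp php.ini {}/ \; 2>/dev/null
ls admin/php.ini admin/reports/php.ini css/php.ini
```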
How to use 'cp' as a simple but reliable backup tool UNIX
While looking for the perfect product to keep my photos safe, I discovered that sometimes simple is best. My requirements were simple: ensure that all my digital photos, stored on a locally attached USB drive, were duplicated to another drive attached to my AirPort Extreme. My photos are in RAW format (specifically DNG files) and will never change, so I only need to concern myself with new files.

I checked out numerous commercial and free products for backup, synchronizing and more, and nothing quite fit the bill. Whilst rsync could probably do the job, I couldn't get my head around the terminology to be sure I wasn't risking the original files. Then I discovered the solution. So mind-bogglingly simple, and no third-party software required. In Terminal, I run this command:
cp -npRv "/Volumes/LocalUSB/Photos/" "/Volumes/RemoteUSB/Photos/"
Yes, it is the standard Unix copy (cp) command with a few options:
  • n - Do not overwrite an existing file
  • p - Preserves attributes, including resource forks
  • R - When the source file is a directory and the path ends with a slash (/), the entire contents of the directory are copied recursively
  • v - Causes files to be listed when copied
The n and R ensure that all new files are copied from the directory tree. Files already there are not re-copied. In short, a quick and efficient means of getting just the new photos copied over.
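For those who do want to brave rsync, I believe the equivalent new-files-only copy is the following, though (as noted above) the trailing-slash semantics deserve a careful test on scratch data first:

```shell
# Copy only files missing at the destination; -a preserves attributes,
# --ignore-existing skips anything already copied on a previous run
rsync -av --ignore-existing "/Volumes/LocalUSB/Photos/" "/Volumes/RemoteUSB/Photos/"
```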
Search Address Book from Terminal UNIX
I have a need (like for Mutt's query_command) to search for email addresses from OS X's Address Book, but from the command line. After trying various hints, code, etc., and getting frustrated, I found that the data is just stored in an SQLite database. Everything you need is already built into OS X; you just have to figure out the SQL structure. I've only tested this on 10.6, but I imagine it will either work as is, or be easily adapted to work, on 10.5.

I created a shell script in /usr/local/bin with these contents:
sqlite3 -separator '    ' ~/Library/Application\ Support/AddressBook/AddressBook-v22.abcddb "select e.ZADDRESSNORMALIZED,p.ZFIRSTNAME,p.ZLASTNAME,p.ZORGANIZATION from ZABCDRECORD as p,ZABCDEMAILADDRESS as e WHERE e.ZOWNER = p.Z_PK;" | grep $1
Note: The -separator '    ' bit is a Tab character (use Control-V, Control-I to insert one in bash). You can make it a few spaces or whatever you want, but if you are using this as Mutt's query_command, then it must be a Tab character.

The above code dumps more than one email address per contact (if there is more than one), and one email address per line -- so this isn't useful for importing into anything else. To mess about with the SQL, use sqlite3 and try the .dump command to get a feel for what the schema is like. That's it! I hope this helps someone out there in the world.
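For example, you can poke at the schema from sqlite3's interactive shell like this (the table name is what I found on my 10.6 system; it may differ on other releases):

```shell
# List all tables, then show how email addresses are stored
sqlite3 ~/Library/Application\ Support/AddressBook/AddressBook-v22.abcddb <<'EOF'
.tables
.schema ZABCDEMAILADDRESS
EOF
```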

[robg adds: If you create the above shell script, remember to make it executable (chmod a+x scriptname), and call it with the name you'd like to find, e.g. scriptname Smith. We covered some other sqlite tricks for Address Book in this previous hint.]
Explaining odd entries in 'df' output UNIX
The other day I was using df -h to check disk space usage, and was surprised to see some odd entries in the output:
$ cd /Volumes
$ df -h
/dev/disk2s3    75Gi   51Gi   24Gi    69%    /Volumes/datafiles
map -hosts       0Bi    0Bi    0Bi   100%    /net
map auto_home    0Bi    0Bi    0Bi   100%    /home
I had no idea what the map entries were there for; they don't show if you simply do ls /Volumes. After inquiring around, a friend managed to find the answer for me: they're related to the autofs feature in 10.5 and 10.6. You can read all about it, if you care, in this Apple PDF. I was more curious than alarmed, as I hadn't remembered seeing those entries before (I don't run df -h all that often).
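The two map entries come straight from the automounter's master map, which you can inspect directly (this is where my 10.6 machine keeps it; yours may vary):

```shell
# autofs reads its maps from /etc/auto_master on Mac OS X;
# the /net and /home lines match the "map" entries df reports
cat /etc/auto_master
```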
Create a clean Terminal for a serial line connection UNIX
I looked everywhere on the net to find something that I could afford with which to connect my USB serial port to a Sun machine. I couldn't install Fink or MacPorts, and screen was garbling all of the text for the Solaris install. Finally, I started looking in general UNIX support, which led me to the cu utility.

Use the cu command to get a (more-or-less) clean line. It's part of OS X's BSD heritage, and was originally used to allow UUCP batches to dial modems and link with each other. Here's how to set up a connection:
sudo cu -s [bitrate] --nostop -l /dev/cu.[serialdevice]
This could also be used with /dev/tty.[serialdevice], I believe, though I have not tried it.

Caveat: If you need to send a literal ~ character, make sure you type it twice, since cu treats a lone ~ at the start of a line as its escape character. Also, the --nostop parameter stops cu from interpreting XON/XOFF software flow control. If you don't use it, and the system receives a Control-S character for whatever reason, you will need to type Control-Q to get the output moving again. If you're using the Terminal, you know how to use man(1), so read the man page for cu(1).
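A concrete session might look like this (the 9600 bitrate and device name are just examples; list /dev/cu.* to find your adapter's name):

```shell
# Find the serial device, then open a 9600 bps session with
# software flow control disabled
ls /dev/cu.*
sudo cu -s 9600 --nostop -l /dev/cu.usbserial
# Type ~. at the start of a line to hang up; ~~ sends a literal tilde
```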
Add timestamps to Unix commands that run at intervals UNIX
For a little project I was recently working on, I wanted to track some memory usage statistics over time. I figured that the Unix command vm_stat would be a good way to do that, as it includes the four basic memory usage types (free, active, inactive, and wired). My intent was to put this in a shell script that would run the command at a specified interval, dumping the output to a text file each time.

However, the basic output of vm_stat is less than ideal for dumping to a file via a shell script:
$ vm_stat
Mach Virtual Memory Statistics: (page size of 4096 bytes)
Pages free:                         486663.
Pages active:                       140886.
Pages inactive:                      92592.
Pages speculative:                  189887.
Pages wired down:                   138988.
"Translation faults":              1221453.
Pages copy-on-write:                 63521.
Pages zero filled:                  709169.
Pages reactivated:                       1.
Pageins:                             59295.
Pageouts:                                0.
Object cache: 28 hits of 34093 lookups (0% hit rate)
Parsing the above in a spreadsheet app would require some serious text manipulation. A brief glance at the man page for vm_stat showed another usage option: vm_stat interval, where interval is how often (in seconds) vm_stat should measure memory usage. When used in this way, the output is much nicer for capture:
$ vm_stat 2
Mach Virtual Memory Statistics: (page size of 4096 bytes, cache hits 0%)
  free active   spec inactive   wire   faults     copy    0fill reactive  pageins  pageout
486024 139635 191315    92771 138964  1224506    63581   711642        1    59301        0 
486194 139900 191372    92799 138445      392        0      366        0       10        0 
485056 139830 191381    93499 138966      100        0       36        0       42        0 
That was nearly perfect, though I only needed the first 36 characters (the end of the wire column). But I had a problem: I needed to include a timestamp on each row of the output, so I could tie memory usage back to some activities I was starting and stopping at specific times. Read on to see how this is done...
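One way to do it (a sketch, and not necessarily the method from the full hint) is to pipe vm_stat's output through a small read loop that prefixes each sample with the current time:

```shell
# Sample every 2 seconds; keep the first 36 characters of each line
# (through the "wire" column) and prefix a timestamp
vm_stat 2 | while IFS= read -r line; do
  printf '%s  %.36s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$line"
done >> memlog.txt
```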
Use Tab-completion for SSH 'host' aliases UNIX
Do you just use 'Host' aliases defined in ~/.ssh/config for all of your SSH'ing? If so, this hint lets you easily tab-complete those aliases in Bash without needing bash-completion.

It's quick. It's dirty. But it works.

Put the following in your user's .profile (or Bash-specific initialization file):
complete -o default -o nospace -W "$(/usr/bin/env ruby -ne 'puts $_.split(/[,\s]+/)[1..-1].reject{|host| host.match(/\*|\?/)} if $_.match(/^\s*Host\s+/);' < $HOME/.ssh/config)" scp sftp ssh
Now any time you're using scp, sftp, or ssh, you can just type part of the alias. For instance, on my computer, I type ssh se[TAB], and it completes to ssh server.
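For context, the Ruby one-liner harvests the words after each Host keyword in ~/.ssh/config, skipping patterns that contain * or ?. So entries like these (hostnames made up) are what end up in the completion list:

```
Host server
    HostName server.example.com
    User me

# Patterns with wildcards are filtered out of the completions
Host *.example.com
    ForwardAgent yes
```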

[robg adds: I haven't tested this one.]
Geolocate a number of IP addresses via shell script UNIX
From time to time I like to know (when reviewing log files) where an IP address is geographically located. The following shell script will take a list of IP addresses in a file named list, and look up their geographic location. Here's the code:
#!/bin/bash
# geo: look up the location of each IP listed in the file "list"
# Set url to an IP-lookup service's query URL; the address is appended to it
echo "Put the IPs you want to lookup into a file named list."

for i in `cat list`; do
  lynx -dump "$url$i" > tmp
  sed -n '/Host Name/,/Postal code/p' tmp
  rm tmp
done
jamesk@HOME~/Desktop $ echo > list
jamesk@HOME~/Desktop $ geo
Put the IPs you want to lookup into a file named list.
      Host Name:
     IP Address:
        Country: [14]United States united states
   Country code: US (USA)
         Region: [15]Oregon
           City: Eugene
    Postal code:
[robg adds: Note that this hint requires the lynx text-only web browser, which you can install via Fink or MacPorts or probably many other methods. I'm sure there's probably a way to do this without lynx, but I'll leave that to those who actually know what they're doing...]