As described in this hint, the split utility is a highly useful tool for splitting a large file into pieces. Recently, however, I discovered that attempting to split a file into pieces 2GB in size or larger fails with an error:
This is caused by Apple shipping an older copy of split that uses signed long values for the byte count, limiting it to 2047MB. One workaround is to compile a new split utility that uses 64-bit integers, allowing a much larger byte count. Newer source code may be found on the NetBSD web site.
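If compiling a new split isn't appealing, dd is another workaround, since it handles large offsets; a minimal sketch (file names are placeholders, and the demo uses tiny sizes -- for real 2000MB pieces you'd use something like bs=1048576 count=2000):

```shell
# Split a file at a fixed boundary with dd, then verify the pieces rejoin.
dd if=/dev/zero of=bigfile bs=1024 count=3 2>/dev/null   # 3 KB demo file
dd if=bigfile of=piece.0 bs=1024 count=2 2>/dev/null     # first 2 KB
dd if=bigfile of=piece.1 bs=1024 skip=2  2>/dev/null     # the rest
cat piece.0 piece.1 > rejoined
cmp bigfile rejoined && echo "pieces rejoin cleanly"
```

Reassembly is the same cat command you'd use with split's output.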
This may be kind of obscure, but if you're into Perl and trying to install modules via CPAN, specifically the DBI and DBD ones, things can get hairy. I spent a few hours trying to install these modules from source and from the CPAN perl shell. Neither worked; both gave frustrating errors that were impossible to track down. After a bunch of testing, I found the solution:
In a Terminal window, type fink list dbi or fink list dbd and you will see a list of available packages.
Now type fink install dbi-pm586 (that's the default DBI package for Perl 5.8.6).
Here's the tricky part. Fink installs the modules in /sw/lib/perl5/5.8.6/darwin-thread-multi-2level, which is not in your normal @INC when you run perl scripts. I'm sure you could recompile Perl to include Fink's installation directory, or move the files to the right places, but I found a different way. In your perl script, simply put:
use lib '/sw/lib/perl5/5.8.6/darwin-thread-multi-2level';
and Perl will look in the correct directory.
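If you'd rather not edit every script, Perl's standard PERL5LIB environment variable (or the -I switch) prepends directories to @INC for you; for example:

```shell
# Point Perl at Fink's module directory without touching the scripts.
export PERL5LIB=/sw/lib/perl5/5.8.6/darwin-thread-multi-2level
# Or per-invocation:
#   perl -I/sw/lib/perl5/5.8.6/darwin-thread-multi-2level myscript.pl
echo "$PERL5LIB"
```

Putting the export line in your shell profile makes it apply to every script you run.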
This solution eliminates the annoying CPAN/compiler/make errors for me.
[robg adds: A much older hint refers to some compiler issues with gcc3 that prevent the DBI and DBD modules from compiling. The hint claims that simply changing the compiler to gcc2 solves those issues. I'm not sure if that's still relevant to the issue in this hint, but it might be something to look at. In any event, using the Fink versions appears to be a valid workaround.]
The tar utility in 10.4 is great in that it now supports copying of resource forks. I've used hfstar for a long while, and thought I'd switch this weekend to using the 10.4 version on a newly-acquired Intel iMac. The machine has its internal disk partitioned into three pieces: OS, Users and Media. On the OS and Media partitions, /usr/bin/tar worked fine and preserved resource forks on the backups.
On the Users partition, however, it successfully created tar archives, but without resource forks being preserved. It also generated errors of the form:
$ tar -cvf test.tar Test
tar: /tmp/tar.md.GPzLI9: Cannot stat: No such file or directory
tar: /tmp/tar.md.ayDXd5: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
I did some googling on the issue, and could find very little other than some mention that it might be due to disk errors. So I ran a repair with Disk Utility, and it couldn't find anything wrong with the disk.
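For anyone comparing partitions the same way, a quick check of whether a given archive actually captured the forks: Tiger's tar stores each resource fork as an AppleDouble "._" entry, so listing the archive shows them. A small sketch (the test folder and file are placeholders):

```shell
# Archive a throwaway folder and list the contents; on 10.4, a file
# with a resource fork appears twice, as "file" and "._file".
mkdir -p Test && echo data > Test/file
tar -cf test.tar Test
tar -tf test.tar        # look for "._" entries alongside each file
```

If the "._" entries are missing for files you know have forks, that archive didn't preserve them.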
NOTE: This hint was developed under Xcode 2.4 and Mac OS X 10.4.8; it probably won't work on pre-Tiger systems.
We just got some MacBook Pros, and our lab is now mixed PPC and i386. We rely on a number of command line programs developed over the years and ported from various BSD-flavored systems. We decided to convert all these programs and libraries to universal binaries so that all the systems can copy them from a single installed prototype. However, the process of converting bunches of makefiles to universal binaries seemed to be kind of a pain, so I wrote a simple program to convert cc(1) to produce them by default. The source can be downloaded here.
This tiny program works by inserting three critical arguments into a standard cc command and then calling gcc with the revised argument vector. It is installed as a separate command called ccub, and optionally it can replace cc so that universal binaries are the default. There's more explanation in the tarball.
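For the curious, the wrapper boils down to something like the following sketch. The three inserted arguments are my assumption about what a Tiger-era universal build needs (two -arch flags plus the universal SDK); the REAL_CC hook is hypothetical, added here so the rewritten command line can be inspected:

```shell
# ccub sketch: call the real compiler with universal-binary flags prepended.
ccub() {
  ${REAL_CC:-gcc} -arch ppc -arch i386 \
    -isysroot /Developer/SDKs/MacOSX10.4u.sdk "$@"
}

# Dry run: substitute echo for gcc to see what would be executed.
REAL_CC=echo ccub -O2 -o hello hello.c
```

The actual ccub does this argv rewriting in C before exec'ing gcc, as described in the tarball.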
Mac OS X users who attempt to access a Subversion repository hosted on an SMB file server (such as a Windows server) quickly run into difficulty. This is due to how file operations are handled by Mac OS X when dealing with an SMB filesystem. The details can be found in this blog entry: Subversion, Mac OS X, and SMB. A patch for obtaining usable support for repositories hosted on an SMB file server can be found here. Finally, here are instructions on how to install Subversion with the patch applied.
While this will obviously not be as reliable as accessing a repository hosted on a real Subversion server, it does suffice for situations that involve a small number of users.
There are a lot of ways to author a DVD, like iDVD or SmallDVD. They are quite powerful and require a lot of interaction from the user. But how could we author a DVD without using a GUI? The objective of this hint is to create a simple DVD to view some DivX (or other format) movies in a DVD player. I don't need fancy things like menus. I just want a DVD that works like a VHS tape, with one movie after the other, but more comfortable to use because we can access the movies using the Chapter button on the DVD player's remote.
The best part of not using a GUI is that you spend just one minute preparing everything, and then there is no more interaction until the end of the process. I prepare my DVDs during the night, which is great because the Mac may need several hours to prepare the DVD. To use this hint, you must be comfortable using Terminal (only a little; you don't need to be an expert -- I am not) and installing applications using Fink.
First, you should install DVDauthor and mkisofs, both available in Fink, as well as MPlayer.
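To give an idea of the no-menu setup, here's a sketch of a dvdauthor control file; the movie names are placeholders, and the DivX sources must first be converted to DVD-compliant MPEG-2:

```shell
# Write a minimal dvdauthor control file: one title holding the movies
# in sequence, each starting at a chapter mark so the remote's Chapter
# button jumps between them.
cat > dvd.xml <<'EOF'
<dvdauthor dest="DVD">
  <vmgm />
  <titleset>
    <titles>
      <pgc>
        <vob file="movie1.mpg" chapters="0" />
        <vob file="movie2.mpg" chapters="0" />
      </pgc>
    </titles>
  </titleset>
</dvdauthor>
EOF
# Then, with the Fink tools installed:
#   dvdauthor -x dvd.xml               # builds DVD/VIDEO_TS
#   mkisofs -dvd-video -o mydvd.iso DVD
```

Once the .iso exists, it can be burned with Disk Utility or hdiutil, and the whole pipeline runs unattended.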
I was initially using SSH shortcuts as described in this hint, but after using the hint for a while, I noticed some problems:
There doesn't seem to be a way to create .inetloc files other than drag and drop. You can create a file that has the correct data fork and the Finder will handle it, but without the resource fork portion that the Finder itself creates upon drop, other programs (like Path Finder) won't open the file properly. I keep a list of hosts I regularly connect to in ~/.hosts, and I wanted to be able to use a script to create the .inetloc files based on .hosts, rather than have to drag and drop the location to the Finder for every single one. (That would be fine if it were one time only, but the list changes.)
For whatever reason (I suspect a bug in Terminal.app), if you open a new Terminal window via one of these ssh://host shortcuts, .term files will not open until you relaunch Terminal. They'll make Terminal the active application, but nothing happens.
For whatever reason (another bug?), windows opened via the ssh://host .inetloc files don't show the "Command key" (⌘1, ⌘2, etc) in the window's title bar. (Although the command keys still work if you're a good guesser.)
The ssh:// addresses don't allow for many SSH options. From what I can tell, username is the only thing you can change, and even that can be problematic as robg pointed out.
I had to tell QuickSilver where to look for the .inetloc files, but I noticed that it already indexes ~/Library/Application Support/Terminal/ by default.
The solution was to use .term files for everything instead of ssh:// shortcuts.
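Generating the .term files from the host list can then be scripted; a minimal sketch, assuming a template.term containing a HOSTNAME placeholder where the ssh command belongs (the template, file names, and directories here are all hypothetical stand-ins):

```shell
# Stamp out one .term file per host by substituting the hostname
# into a template. Sample data stands in for ~/.hosts and a real
# Terminal settings file.
HOSTS_FILE=hosts.txt
DEST=terms
mkdir -p "$DEST"
printf 'alpha\nbeta\n' > "$HOSTS_FILE"                    # sample host list
printf '<string>ssh HOSTNAME</string>\n' > template.term  # sample template
while read -r host; do
  sed "s/HOSTNAME/$host/" template.term > "$DEST/$host.term"
done < "$HOSTS_FILE"
```

In practice you'd save one working .term from Terminal, replace its hostname with the placeholder, and point the script at ~/.hosts.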
I often rename files immediately after downloading and stick them in a folder somewhere for later reference. But I also often forget what I've already downloaded. So I wrote this bash shell script to use Spotlight to find files that match a file's name or its size and kind.
Usage, in Terminal:
For example, for a file called 0.pdf, output might look like this (line breaks added for a narrower display):
Possible matches based on filename:
Possible matches based on size and kind:
So it turns out I just downloaded a file that I already have four copies of under different names and locations.
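The core of such a script is just two Spotlight queries, one on the name and one on the size-plus-kind metadata. Since mdfind and mdls only exist on Mac OS X, this sketch just assembles and prints the queries, with placeholder values standing in for the stat and mdls output:

```shell
f="0.pdf"                # the downloaded file (placeholder)
size=12345               # real script: size=$(stat -f %z "$f")
kind="PDF document"      # real script: parsed from mdls -name kMDItemKind "$f"

name_query="kMDItemFSName == '$f'"
meta_query="kMDItemFSSize == $size && kMDItemKind == '$kind'"

echo "mdfind \"$name_query\""
echo "mdfind \"$meta_query\""
```

Running the two mdfind commands produces the two "Possible matches" lists shown above.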
I've set this up as a command in OnMyCommand. For this to work, you must put the shell script in a folder that's included in your $PATH. Here's the OnMyCommand command (assuming you are using OMCEdit):
A previous hint suggested a way to add your own artwork to iTunes' art database, so it's not stored inside each individual track. However, it didn't seem to work all of the time.
So I wrote some Python scripts to do this. They intercept iTunes' requests for cover art, and allow you to send your own images. They're sent back to iTunes in the same format that the iTMS sends them, so they work just as well. The script includes a basic browser-based interface for selecting images for each album. You can enter your own URLs, or search Amazon for covers.
You'll need to have Python 2.4.3 installed, and not be afraid to edit your hosts file. If you want to search Amazon, you'll also need to sign up for a free Amazon Web Services account. Download the scripts (40KB) (MacOSXHints mirror), and you'll find the instructions in the included README.txt file.
[robg adds: I haven't tested this one yet, and note that you'll have to disable your local webserver in order for this to work, as well as modify the hosts file.]
This is all on one line, of course (but with the backslash, copy and paste should work). Name the file curlsms, change the permission to allow execution (chmod a+x curlsms), and place it in a folder of your liking. To run it, navigate to its directory and the usage is:
./curlsms sender 1234567890 Message text
Replace sender with your name, 1234567890 with the recipient's phone number, and Message text with the message you'd like to send. I'm sure some of you super-creative users could integrate this into AppleScripts and do some really neat Mail rules and all that business with such a script.
Keep in mind there is a character limit on many of the text fields of Cingular's web form. This script does not make the necessary checks, and those additions would be pretty nice. Also, you can check out the form and add scheduling features and priority features as well. I am a bash shell idiot, and it would take me years to make this robust.
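The missing length check is easy to bolt on; a sketch, assuming the classic 160-character SMS limit (Cingular's form fields may allow less):

```shell
# Refuse over-long messages before ever calling curl.
check_len() {
  if [ "${#1}" -gt 160 ]; then
    echo "Message too long: ${#1} characters (max 160)" >&2
    return 1
  fi
}

check_len "Running late, be there at 8" && echo "length OK"
```

Dropping a call to this at the top of curlsms (exiting when it fails) would stop the form from silently truncating messages.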
[robg adds: This worked; the only change I made was to add > /dev/null to the end, after the URL. I did this to route the output of the curl command into never-never land -- otherwise you'll see the URL's source code go scrolling past. With the change, you'll only see a summary of the transmission time at the end.]