May 17, '11 07:30:00AM • Contributed by: robleach
Whenever I open a very large file in TextEdit, my system slows to a crawl as TextEdit's virtual memory size balloons and the swap space goes crazy with page swaps. It can take minutes to save my open files before I can force-quit TextEdit. Here's a way I've found to avoid these headache-inducing periods of no work getting done.
This hint, unfortunately, only works if you happen (like me) to use the C shell as your shell of preference, because there's no way to reference arguments in a bash alias. Perhaps someone can post a bash equivalent in the comments. Also, there are many valid reasons to avoid aliasing actual commands (though I do so all the time without any problem), so you may want to change the alias name to something other than 'open.'
Finally, I used perl because I'm most familiar with it, though I'll admit there may be a simpler and more efficient way of doing it. Again, I'll defer to improvements in the comments.
Basically, all I did was create an alias for the open command which checks the file sizes before running the real command. If one or more of the files supplied to open (with -e as the first argument) is larger than 200 MB (200,000,000 bytes), it prints a warning instead of opening the file(s). To get around the warning, you would have to escape the alias (e.g. \open -e myfiles). It's a nice check to make sure I'm not opening a very huge file by accident.
Here it is:
alias open 'perl -e "@x=split(/\s+/,qq(\!*));(@x)[0] eq qq(-e) ? (scalar(grep{@y=(split(/\n/));(stat((@x)[(@y)[0]]))[7]>200000000}(1..(scalar(@x)-1)))?print STDERR (qq(WARNING: One or more of your files is awfully big to open in TextEdit\n)):exec(qq(open \!*))) : exec(qq(open \!*))"'
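For bash users, since a bash alias can't reference its arguments, the equivalent would have to be a shell function. Here's a rough, untested sketch of one way it might look (the 200,000,000-byte limit matches the alias above; `command open` bypasses the function and runs the real open, and `wc -c` is used for the size check since it's portable, though on Mac OS X `stat -f%z` would also work):

```shell
# Hypothetical bash equivalent of the csh alias: a function, not an alias,
# because bash aliases cannot see their arguments.
open() {
    _limit=200000000    # same 200 MB threshold as the csh alias
    if [ "$1" = "-e" ]; then
        _first=1
        for _f in "$@"; do
            # Skip the -e flag itself; check each remaining filename.
            if [ "$_first" = 1 ]; then _first=0; continue; fi
            # wc -c gives the size in bytes; stat -f%z "$_f" also works on OS X.
            if [ -f "$_f" ] && [ "$(wc -c < "$_f")" -gt "$_limit" ]; then
                echo "WARNING: $_f is awfully big to open in TextEdit" >&2
                return 1
            fi
        done
    fi
    # Run the real open, bypassing this function.
    command open "$@"
}
```

With a function instead of an alias, the escape syntax changes too: instead of \open, you would bypass the check with `command open -e myfiles`.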
[crarko adds: I haven't tested this one.]
