Wicked Cool Shell Scripts: 101 Scripts for Linux, Mac OS X, and UNIX Systems
author: Dave Taylor
pages: 368
publisher: No Starch Press
rating: 10
reviewer: Mary Norbury-Glaser
ISBN: 1593270127
summary: 101 Scripts for Linux, Mac OS X, and UNIX Systems
Chapters are divided into an array of topics sure to catch the attention of any UNIX-based system user: The Missing Code Library, Improving on User Commands, Creating Utilities, Tweaking Unix, System Administration: Managing Users, System Administration: System Maintenance, Web and Internet Users, Webmaster Hacks, Web and Internet Administration, Internet Server Administration, Mac OS X Scripts, and Shell Script Fun and Games.
In true "cookbook" fashion, each hack is numbered and divided into The Code, How It Works, Running the Script, The Results, and Hacking the Script. Throughout, the author clearly describes the syntax and functionality of each script, often with additional notes in How It Works detailing the syntax process and interesting asides. But Hacking the Script is what gives Wicked Cool Shell Scripts true value; where applicable, the author uses this section to describe script modifications that achieve a variety of alternative real-world, practical results. This additional section alone easily triples the total number of scripts the reader is exposed to.
This book enables the reader to get "up close and personal" with their UNIX-based system and explore the possibilities afforded by becoming intimate with the command-line interface. The reader will find themselves easily propelled into the world of scripting, thanks entirely to Dave Taylor's ability to take what some might describe as a fairly dry topic and translate it into a logical and user-friendly construct. Just reading through the table of contents is inspiring and intriguing: did you know you could write a script to retrieve movie info from IMDb? Or track the value of your stock portfolio? Or that you can use a very simple script to check spelling on your web pages?
Sysadmins and webmasters will find this book fundamentally critical to day-to-day operations; there are dozens of invaluable, customizable scripts highlighted in this book to enable professionals to save time and add simple, elegant solutions to annoying issues in their work environment. User account management, rotating log files, cron scripts, web page tweaks, Apache passwords, synchronizing via FTP, etc. are all eminently useful and tweakable.
Geeky home users will discover they can use these scripts to work with files and directories, create spell-checking utilities, calculate loan payments, create summary listings of their iTunes libraries, and of course, play games. Many of the sysadmin scripts would also be of interest to the power user: analyzing disk usage, killing processes by name and backing up directories, to name a few. Both types of users will find this book inspiring and truly fun!
One of the secret pleasures of a technical book reviewer is finding those wonky bits of code that suffer from misplaced or missing punctuation, misspelled words and other basic typographic errors inherent in the book publishing process. I randomly selected many of these scripts to try out in the process of doing this review and...dang, haven't found any errata yet. But be sure to check out the errata page on Dave Taylor's web site for any that more astute readers may find (there were none, as of this writing).
Also be sure to take a closer look at Dave's shell script library, which lists additional scripts that didn't make the cut for the book. As convenient as it is to download the entire script library, I would like to stress the value of buying the book, which provides invaluable instruction and guidance in understanding the syntax of the scripts and illustrates how small but significant tweaks can modify the output to match your specific needs.
(A special nod of appreciation to Dave Taylor's Tintin references!)
You can purchase Wicked Cool Shell Scripts - 101 Scripts for Linux, Mac OS X, and UNIX Systems from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Re:What about us Windows users?! (Score:5, Informative)
Quick Hacks (Score:5, Informative)
If the script is not working as you want, put a "set -x" on the first line and a "set +x" on the last line. You will see the exact execution path and variable expansion, very neat for debugging.
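A minimal sketch of the trick this comment describes (the script contents here are made up for illustration):

```shell
#!/bin/sh
# set -x makes the shell print each command (after variable expansion)
# to stderr before running it.
set -x                                     # start tracing
name="world"
echo "hello, $name" > /tmp/trace_demo.out  # this command appears in the trace
set +x                                     # stop tracing
```

Running it shows lines like `+ echo hello, world` on stderr, which is exactly the "execution path and variable expansion" the comment mentions.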
Re:Sounds useful, but what languages are used? (Score:1, Informative)
http://imdb.com/robots.txt (Score:5, Informative)
Re:Quick Hacks (Score:1, Informative)
echo works well for the first line too
Re:Quick Hacks (Score:1, Informative)
: THIS LINE DOES FOO
the above line does nothing and is only printed if set -x is on...
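Since ":" is the shell's no-op builtin, its arguments are expanded but nothing else happens, so such lines work as trace markers that only show up under set -x. A small sketch (the file names are made up):

```shell
#!/bin/sh
set -x
: CREATING WORK FILE            # no-op "trace marker"; printed only under set -x
echo data > /tmp/marker_demo.txt
: DONE WITH WORK FILE
set +x
```

Without set -x, the ":" lines cost essentially nothing; with it, they annotate the trace output.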
Re:RTFT (Score:3, Informative)
I RTFT. Shell scripts can be bash, csh, tcsh, sh, ksh -- the list goes on. That's if you're assuming by "shell" they don't mean all interpreted languages -- I've seen a number of Perl/Python/whatever scripts put in the general category of shell scripts before.
I am also curious what tools are assumed to be available to the user. There are a lot of programs available that are common (e.g., wget), but is the author assuming a standard Unix distribution, or does he say "if you have ___"?
Re:Quick Hacks (Score:5, Informative)
sh -x scriptname
Re:shell scripts vs. programming languages... (Score:2, Informative)
http://www.python.org
Re:Why shell? (Score:5, Informative)
(i) Many people, like myself, don't know perl, and don't see the point in learning when shell scripts are perfectly adequate for their purposes.
(ii) Sometimes it's just easier. viz. this quote [bash.org] from bash.org:
Re:101 Prompts? (Score:5, Informative)
I don't know if such a book (or chapter in a book) exists, but here are some links:
Have fun...
Re:Sounds useful, but what languages are used? (Score:2, Informative)
Re:Webmasters?? (Score:2, Informative)
Re:101 Prompts? (Score:3, Informative)
Amazing how much free information you can get when you look for it.
Not ubiquitous (Score:1, Informative)
For instance, FreeBSD recently took perl out of the base package. And perl wouldn't be my candidate for inclusion on a boot floppy.
Re:What about us Windows users?! (Score:5, Informative)
Cygwin [cygwin.com] is your friend. For just one example, you can write a script that uses sed to extract information from the filenames of your mp3z and feed the results into id3ed to tack on an ID3 tag. Try doing that with a batch file.
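A sketch of the kind of pipeline this comment means. The "Artist - Title.mp3" naming convention is an assumption, and since I haven't verified id3ed's actual flags, the script only prints the command it would run rather than invoking it:

```shell
#!/bin/sh
# Demo setup: a fake mp3 named with the assumed "Artist - Title.mp3" convention.
mkdir -p /tmp/mp3demo
touch "/tmp/mp3demo/Pink Floyd - Time.mp3"

for f in /tmp/mp3demo/*.mp3; do
    base=$(basename "$f" .mp3)
    artist=$(echo "$base" | sed 's/ - .*//')   # text before " - "
    title=$(echo "$base" | sed 's/.* - //')    # text after " - "
    # Hypothetical id3ed invocation -- check id3ed(1) for the real options:
    echo "id3ed -n '$artist' -s '$title' '$f'" >> /tmp/mp3demo/commands.txt
done
```

The sed extraction is the interesting part; swapping the echo for the real tagger call is left to someone with id3ed installed.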
Re:Why shell? (Score:5, Informative)
Why not dynamically compiled? Well, if your glibc barfs, all dynamically compiled binaries barf with it -- including perl, and including any shells that you use to start up your system. With a statically compiled shell to handle all of those startup scripts you can boot Linux without glibc working, and you probably have enough of a system still running to get things fixed. With a dynamically linked startup-file interpreter, when glibc or something glibc depends on goes, your whole system goes, single-user mode and all.
Besides, while perl can execute system commands and make decisions based on input, I think the shell is a better tool for things like this. sh and bash were designed to do startup scripts (among other things) and they do them well. Why fix what isn't broken? Shell scripts work, and they can do anything you'd need them to do during startup.
Re:Quick Hacks (Score:1, Informative)
Re:Hmm. (Score:3, Informative)
Easier for people to read, but if you were dumping the results into another program/script, the raw HTML might be easier to parse. (Then again, you can just change -dump to -source to have Lynx dump raw HTML instead of formatted text, in case you don't have wget.)
Re:101 Prompts? (Score:2, Informative)
Re:Hmm. (Score:3, Informative)
wget -c http://www.intuitive.com/wicked/scripts/AllFiles.
Very cool.
pushd and popd (and other tricks) (Score:5, Informative)
Here's other stuff I have grouped by sections in my .cshrc
First, I have my shell variables. The comments say what they do. The most important one is autolist.
Second, bindkeys are pretty neat. I rebind the up and down arrow keys. By default they scroll up and down one at a time through the history. You can bind them to search the history based on what you've typed so far.
Third, completes allow for customizing tab completion. When I change directories, tab only completes directory names. This also works for aliases, sets, setenvs, etc.
Fourth, I have all my aliases. I had to cut a bunch because of the lameness filter.
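A minimal sketch of what a .cshrc laid out in those four sections might look like (tcsh syntax; the specific bindings and aliases are illustrative, not the poster's actual file):

```shell
# --- 1. shell variables ---
set autolist                              # list choices on ambiguous completion

# --- 2. key bindings ---
bindkey -k up history-search-backward     # arrows search history by prefix
bindkey -k down history-search-forward

# --- 3. programmable completion ---
complete cd 'p/1/d/'                      # after cd, complete only directories

# --- 4. aliases ---
alias ll 'ls -l'
```

The history-search bindings are the ones the comment singles out: type a few characters, hit up-arrow, and tcsh recalls only commands starting with that prefix.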
set -eu (Score:3, Informative)
-e: bomb out immediately when a command exits w/nonzero status
-u: bomb out when de-referencing uninitialized variables ("var=" counts as initialization).
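A small sketch of both flags in action, run in throwaway subshells so the deliberate failures don't kill the enclosing shell:

```shell
#!/bin/sh
# -u: expanding an unset variable is fatal.
if sh -uc ': "$never_set"' 2>/dev/null; then u=ran; else u=aborted; fi

# -e: the script dies at the first command with nonzero status,
# so the echo after false is never reached.
if sh -ec 'false; echo unreachable' >/dev/null 2>&1; then e=ran; else e=aborted; fi

echo "u=$u e=$e" > /tmp/set_eu_demo.txt
```

Both subshells abort, which is the point: with set -eu, silent failures become loud ones.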
John.
Re:pushd and popd (and other tricks) (Score:3, Informative)
One of the things that a lot of people don't realize is that you can still set the colors and such in tcsh just like in bash - only the syntax is different. Here's how I have mine set on one of the machines I log into:
set prompt = "%{^[[032;1m%}`whoami`%{^[[0m%} %c3 %B%#%b "
Good stuff.
Book Recommendation (Score:5, Informative)
Re:because perl is a pig that runs out of memory (Score:3, Informative)
No, your pathetically puny little machine thrashes and crashes trying to crunch seriously large amounts of data with that script you threw together in 15 minutes while ignoring all the comments in the Camel Book about performance implications of things like "@array = <FILE>". Perl is written in C, and runs just fine when used appropriately.
Re:What about us Windows users?! (Score:2, Informative)
http://www.robvanderwoude.com/ [robvanderwoude.com]
Cool scripts (Score:4, Informative)
Tab completion (Score:3, Informative)
Re:Why shell? (Score:3, Informative)
Re:What about us Windows users?! (Score:1, Informative)
My favourite shell script... (Score:3, Informative)
Yes (Score:3, Informative)
Kjella
/usr and / (Score:3, Informative)
Actually, we decided here (a fairly large installation) long ago to merge / and /usr.
Our main reason was simplification: this gives us the benefit of not worrying about what is in /usr/bin and what is in /bin. (Actually, on Solaris, /bin is just a link to /usr/bin.) Everything in /usr and / should not be touched anyhow except through the normal pkg management tools. We do of course maintain a separate /usr/local.
The main disadvantage is that an fsck would take longer because / is now a large filesystem. With the journaled filesystems of today, we don't see the concern. The other benefit is that we don't need to worry about sizing / and /usr independently and running out of space in /usr when / still has lots (or vice versa).
Interestingly enough, /usr/openwin used to be a separate FS on SunOS long ago. The main reasoning was that disks back then were small and you simply didn't have room to place /usr/openwin in the /usr filesystem.
I would guess that others have merged /usr and / too, but I understand why it's a bit of a controversial topic.
I do hold disregard for some of the defaults that some Linux distros use, where /var and / are merged as well. In fact, the whole darn OS is in /. The need for a separate /var should never go away, IMO.
Re:Why shell? (Score:2, Informative)
Useless Use of Cat [wikipedia.org]. sort quote*.txt | uniq
Heh, also a useless use of uniq. Try sort -u quote*.txt
I use sort|uniq, sed -e, and find|xargs multiple times every day. But you will eventually hit the limits given enough experience, and then a mini-program is required. For example, if you want to retain the original order of lines as they are seen, with perl it's just:
(Based on "4.6. Extracting Unique Elements from a List" from the Perl Cookbook.) This can be modified to filter unique lengths, or substrings, or patterns, just by changing $seen{$_} to $seen{lc $_} or whatever you need. If you want the sorting that sort -u does, change "print @uniq" to "print sort @uniq". And hey, it's still on the command line!
Re:because perl is a pig that runs out of memory (Score:2, Informative)
Yep, that's why it did this [bioperl.org].
Re:pushd and popd (and other tricks) (Score:1, Informative)
alias po popd
cd
pd
cp *.mp3 =1 # =1 is the first entry on the dirstack
po # returns you back to first place
cd
cd
cp *.mp3 ~-
cd ~-
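In bash and tcsh, ~- expands to the previous working directory (what the shell tracks in $OLDPWD), so the copy above needs no dirstack at all. A sketch using the portable spellings, with made-up demo directories:

```shell
#!/bin/sh
mkdir -p /tmp/cdprev/src /tmp/cdprev/dst
touch /tmp/cdprev/src/song.mp3

cd /tmp/cdprev/src
cd /tmp/cdprev/dst
cp "$OLDPWD"/*.mp3 .     # $OLDPWD is what ~- expands to in bash/tcsh
cd - > /dev/null         # hop back to the previous directory
echo "$PWD" > /tmp/cdprev/where.txt
```

`cd -` is the POSIX cousin of `cd ~-`: both bounce you to wherever you were last.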
Re:pushd and popd (and other tricks) (Score:2, Informative)
alias mcd 'md \!*; cd \!*'
alias rcd 'setenv OLD_DIR `pwd`;cd
Usage:
~/> mcd junkDir
~/junk> -- do commands, unzip files, et cetera --
~/junk> rcd
~/> -- back where you were and the dir is gone --
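A rough sh/bash translation of the same idea. Since the rcd alias above is cut off, the cleanup behavior is my guess at the intent (it assumes the scratch directory is empty when you leave):

```shell
#!/bin/sh
# mcd: make a directory and cd into it.
mcd() { mkdir -p "$1" && cd "$1"; }

# rcd: go back up and remove the (assumed empty) scratch directory.
rcd() { d=$PWD; cd .. && rmdir "$d"; }

cd /tmp
mcd junkDir
echo "$PWD" > /tmp/mcd_where.txt
rcd
echo "$PWD" > /tmp/rcd_where.txt
```

The usage matches the transcript above: mcd drops you into the new directory, rcd puts you back and the directory is gone.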
Re:What about us Windows users?! (Score:1, Informative)
Aw, c'mon, what about all those wicked cool games you can play on it? Windows sucks, but it's not totally useless...
Re:My personal favorite; (Score:3, Informative)
But yes, it is a shell fork bomb.
Re:What about us Windows users?! (Score:1, Informative)
Now run it. (Score:2, Informative)
SunOS 5.8 Generic_108528-27 sun4m sparc SUNW,SPARCstation-20
$ which killall
Good for you!
Now type:
$ killall
and post the results after you're done logging back in!
But Asterix is evil (Score:3, Informative)
Re:Why shell? (Score:3, Informative)
#!/usr/bin/perl
system("command 1");                  # run a command, ignore its output
system("command 2");
$some_returned_value = `command 3`;   # backticks capture the command's stdout
system("command 4");
Of course, you still have to bother with Perl variables. But it should still be possible to do it in four lines.
And if he thinks that's a hack, well, Perl is one huge freaking hack. To quote Larry Wall:
"The Amulet isn't exactly beautiful though--in fact, up close it still looks like a bunch of beads melted together. Well, all right, I admit it. It's downright ugly. But never mind that. It's the Magic that counts."
cd with history (Score:5, Informative)
The shell helper that I am totally lost without is one that adds directory history to bash and ksh. You can find it here: _cd [rr.com]
I guess I never really got the idea of a stack of dirs being useful, since I seem to bounce around more at random than anything else. I prefer to have a cache of places I've recently been.
Bonus puzzle for slashdot readers: using the cd with history function, what directory is this command likely to take me to?
Re:What about us Windows users?! (Score:5, Informative)
That's true, but the fallacy is that you're assuming Windows should be scripted in the same fashion as *nix. That's simply not the case. Batch/command scripting is nice for small bits, and can actually be fairly powerful in an obtuse sort of way, but the real power in automating Windows comes from using the Windows Scripting Host, JScript or VBScript, and all of the ActiveX/COM interfaces into the functionality of the OS and other applications.
A classic example is iterating through users. In *nix, you write a shell script to parse /etc/passwd. In Windows, you write a JScript to instantiate the objects that deal with Active Directory and iterate through user objects (each of which you can perform actions upon, where in *nix you'd have to invoke other applications). One approach is not necessarily "better" than the other, but you can't assume that your *nix administration experience will directly translate into Windows administration. You'd laugh if a Windows admin felt the reverse was true.
What really gets me is when people complain about Windows not being automation-friendly because they're used to *nix scripting. Yes, you cannot pipe notepad.exe into winword.exe, for example, but Word has a very rich automation interface that you can hook into and use from a simple JScript.
What? Try running "help del" from a cmd.exe window some time. Also, look at "help rd". If you want to remove a directory tree, you use the "remove directory" command. "del" deletes files. "rd" deletes directories (and can delete files within directories if you tell it to).
Consider cmd.exe to be the functional equivalent of csh. It's a decent interactive shell, and has some good functionality (especially later versions of cmd.exe in win2k and xp), but you'd have to be nuts to do any extensive scripting with it. Just as you'd pull out bash or perl to do more complex tasks in *nix rather than using csh, you should use WSH in Windows for more complicated tasks.
The *free* guide to Bash shell scripting (Score:5, Informative)
From the site:
This tutorial assumes no previous knowledge of scripting or programming, but progresses rapidly toward an intermediate/advanced level of instruction (...all the while sneaking in little snippets of UNIX wisdom and lore). It serves as a textbook, a manual for self-study, and a reference and source of knowledge on shell scripting techniques. The exercises and heavily-commented examples invite active reader participation, under the premise that the only way to really learn scripting is to write scripts.
Re:Stupid question - sed on binary files (Score:1, Informative)
If you have traditional Mac end of lines:
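The command itself got eaten; the usual fix for classic Mac (CR-only) line endings is tr, something along these lines (a sketch, not necessarily what the poster wrote):

```shell
#!/bin/sh
# Classic Mac OS used a bare carriage return (\r) as the line terminator.
# Translate every CR to LF so Unix tools like sed see proper lines.
printf 'line1\rline2\rline3\r' > /tmp/macfile.txt
tr '\r' '\n' < /tmp/macfile.txt > /tmp/unixfile.txt
```

After the conversion, sed, grep, and friends treat the file as three ordinary lines.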
You can convert awk to C (Score:2, Informative)
awka [freshmeat.net] does that. The performance improvement was about 6-7X on my project.
On my project it took less time to convert the awk to C, gcc the C, and run the binary than it did to run the perl version.
This is not a perl flame. I am old. I use awk.
Re:shell scripts vs. programming languages... (Score:5, Informative)
Well, I think Silent Bob (garcia) is directly attacking me as the author of these fine tools: Linux tools for geocaching [rkkda.com]
I do know Perl as well as all of the original Unix tools: awk, sed, and of course my favorite: shell.
The short and main explanation is that shell/awk/sed lend themselves very well to what is known as the "Unix tools" approach. It's a way of thinking using a small set of core tools that pays big rewards in productivity.
In the case of my geocaching tools, two things were plainly obvious to me at the start: 1) I would be scraping the pages with curl, because there is no better and easier tool for that job, and 2) gpsbabel would be a main part of almost every tool because it knows how to work with a bajillion waypoint formats. So the only question after that was which language to use to glue those commands together. Shell, awk, and perl can all do that. I used shell to tie it all together because that is what shell is best at.
In a few of the tools, geo-map in particular, I did make a mistake in the glue choice. My excuse there is that it evolved far beyond its original design goals. So it ended up requiring a lot of floating point calculations and therefore I had to run several mini-awk scripts within it. If I were to rewrite it today, I would make it a pure AWK script. Why not perl? Because, IMHO, awk has the cleanest syntax of any of the scripting languages.
So, then, when *would* I use perl? In general, I select perl when 1) there is a pre-written module that does a job that would be hard to do with shell/awk/sed, and 2) the use of that module is truly necessary. That second point is very important to me. The mere existence of a Perl module does not necessarily mean it's the best or fastest way to solve the problem.
Perl was the language of choice for "Belle", which is a 4000 line IRC robot I coauthored for use in my daytrading activities. The IRC module was what tipped the scale for Perl in that case.
Another problem with perl modules is that using them guarantees that you will lose some percentage of potential users of your program. Having to find and install additional packages puts many people off. I try to make my scripts completely self-contained (including usage docs) so that people don't have to go through these hassles.
Anyway, you can argue with any of my points, but what you can't argue with is that I have the largest set of command line tools for geocaching that work, regardless of what my language choice was.
-Rick
Re:Is that a recipe for bloat? (Score:2, Informative)
What do you do if you need more control than piping will allow? The difference is between working with data and working with objects. In *nix, you're piping data across processes that act upon that data. If you want to change the password for all of the users on your system, you're iterating through /etc/passwd and passing the username to passwd to make changes. In Windows, you're iterating over a collection of user objects, which have methods you can call to change passwords. You're not reinventing the wheel, because you're not implementing how that password is changed (I've seen the wheel reinvented in Unix by trying to encrypt passwords and write those values directly to /etc/passwd or /etc/shadow, bypassing passwd completely). In fact, if you're not using automation objects in your Windows scripting, you're not using "the full potential of what others have developed."
If you're writing a Unix application, you should make sure it's scriptable via piping. Similarly, if you're writing a Windows application, you should make sure it's scriptable via automation interfaces (IDispatch). One is the standard for *nix and one is the standard for Windows. Nobody is saying that *nix should ditch piping in favor of a COM-like architecture, so why should Windows ditch COM automation in favor of piping?
I'm curious why. Was it too difficult or obscure? Did you have architectural or security objections? Or were you so entrenched in your *nix ways that you couldn't grasp why you should use a different approach in Windows?
MinGW (Score:4, Informative)
The latest version of sh.exe is 465k bytes, it sounds like you have an old version. You should upgrade it. :)
Re:Is that a recipe for bloat? (Score:5, Informative)
The trick with Windows is that you can do many of these same things, but this power comes from doing it in WSH or VB (or C/C++ or an ASP or whatever language you're comfortable with. I've even done it in Perl.) You use the COM interfaces of the shell object to enumerate through directory trees and files. You can stream each of those files into the COM interface of another program that accepts streams. You can search, you can pipe stuff all over, and you're not limited to a single instance of stdin, stdout and/or stderr.
It's not unlike shell scripting, it's just a different language. Each application is able to expose whatever it feels is most important in whatever fashion it thinks is best. DevStudio, for example, lets the scripting host user get to the workspace, the project, and any of the tools.
The biggest problem I have with it is that stdio is not "guaranteed" to be supported by every application under Windows. stdio is the glue that binds all the UNIX utilities together. That's the beauty of stdio -- as the sole mechanism for I/O for most tools, it became the de facto application interaction interface. Windows doesn't have that: most Windows apps don't offer any automated I/O at all. And some of the ones that do seem to have interfaces pasted on after the fact. But the ones that do expose properties and methods via COM are easy to access, and easy to control from anywhere. And using the interfaces tends to remove the ambiguities: in UNIX, if you're using 'cut' to parse a phone list but the name field sometimes contains commas, you end up hacking around solutions to make them work. A COM-based solution would provide an interface containing a Name field.
Windows is not alone in this limitation, either. UNIX suffers from a similar problem: how do you meaningfully pipe data to and from an X window, or even to a curses app? Is it consistent between apps? Most apps I am familiar with that offer such features had to have code actively added to support a meaningful command-line interface, through the use of dozens of command line switches. Without this sort of code, using stdio to parse the output of a curses-based application becomes a tedium of screen scraping.
Don't get me wrong: I have a bevy of UNIX-like command line utilities for Windows, I use Cygwin and bash when I need to (although the file system mapping is worse than I could have imagined), and I will fire up a CMD script long before I think to write it as a VB or C++ program. I'm far more comfortable with the sh-style tools -- I grew up with them.
I'm not saying stdio is better or worse than using the COM interfaces of Windows; I'm just saying it's "different." And you certainly shouldn't be reinventing the wheel to script up utilities in Windows.
Re:My personal favorite; (Score:4, Informative)
It reads (for the classic ":(){ :|:& };:"): define function ':' as follows: pipe the output of function ':' into function ':', and do that in the background (i.e., fork). Then call the function ':'.
I had no idea how it worked, either, but I looked it up
Technical support for WinBatch? AutoIt (Score:3, Informative)
I've had problems getting technical support for WinBatch. That was a long time ago; maybe things have changed now. There were so many small and big problems that I stopped using WinBatch.
I haven't checked out AutoIt [hiddensoft.com], a free alternative, apparently. From the home page:
"AutoIt is a simple tool that can simulate key presses, mouse movements and window commands (maximize, minimize, wait for, etc.) in order to automate any windows based task (or even windowed DOS tasks)."