
Wicked Cool Shell Scripts

norburym writes with a review of Wicked Cool Shell Scripts - 101 Scripts for Linux, Mac OS X, and UNIX Systems. "This incredibly fun book (really!), written by Dave Taylor, a veteran UNIX, Solaris and Mac OS X author, is chock full of 101 scripts to customize the UNIX (Bourne) shell." Read on for the rest.
Wicked Cool Shell Scripts - 101 Scripts for Linux, Mac OS X, and UNIX Systems
Author: Dave Taylor
Pages: 368
Publisher: No Starch Press
Rating: 10
Reviewer: Mary Norbury-Glaser
ISBN: 1593270127
Summary: 101 Scripts for Linux, Mac OS X, and UNIX Systems

Chapters are divided into an array of topics sure to catch the attention of any UNIX-based system user: The Missing Code Library, Improving on User Commands, Creating Utilities, Tweaking Unix, System Administration: Managing Users, System Administration: System Maintenance, Web and Internet Users, Webmaster Hacks, Web and Internet Administration, Internet Server Administration, Mac OS X Scripts, and Shell Script Fun and Games.

In true "cookbook" fashion, each hack is numbered and divided into The Code, How It Works, Running the Script, The Results and Hacking the Script. Throughout, the author clearly describes the syntax and functionality of each script, often with additional notes in How It Works detailing the syntax process and interesting asides. But Hacking the Script is what gives Wicked Cool Shell Scripts true value; where applicable, the author uses this section to describe script modifications to achieve a variety of alternative real world, practical results. This additional section alone easily triples the total number of scripts the reader is exposed to.

This book enables the reader to get "up close and personal" with their UNIX-based system and explore the possibilities afforded by becoming intimate with the command line interface. The reader will find themselves easily propelled into the world of scripting, thanks entirely to Dave Taylor's ability to take what some might describe as a fairly dry topic and translate it into a logical and user-friendly construct. Just reading through the table of contents is inspiring and intriguing: did you know you could write a script to retrieve movie info from IMDb? Or track the value of your stock portfolio? Or that you can use a very simple script to check spelling on your web pages?

Sysadmins and webmasters will find this book fundamentally critical to day-to-day operations; there are dozens of invaluable, customizable scripts highlighted in this book to enable professionals to save time and add simple, elegant solutions to annoying issues in their work environment. User account management, rotating log files, cron scripts, web page tweaks, Apache passwords, synchronizing via FTP, etc. are all eminently useful and tweakable.
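None of the book's code is reproduced in this review, but the flavor of these sysadmin scripts is easy to suggest; a minimal log-rotation sketch along these lines (the filename and retention count are invented for illustration, and a real script would live in /var/log) takes only a few lines of Bourne shell:

```shell
#!/bin/sh
# Minimal log-rotation sketch (illustrative, not from the book):
# keep the last three generations of a logfile.
LOG=${TMPDIR:-/tmp}/myapp_demo.log    # a real script would use /var/log
[ -f "$LOG.2" ] && mv "$LOG.2" "$LOG.3"
[ -f "$LOG.1" ] && mv "$LOG.1" "$LOG.2"
[ -f "$LOG" ]   && mv "$LOG"   "$LOG.1"
: > "$LOG"   # start a fresh, empty log
```

A production version, run nightly from cron, would typically also compress the older generations and signal the daemon to reopen its logfile.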

Geeky home users will discover they can use these scripts to work with files and directories, create spell-checking utilities, calculate loan payments, create summary listings of their iTunes libraries, and of course, play games. Many of the sysadmin scripts would also be of interest to the power user: analyzing disk usage, killing processes by name and backing up directories, to name a few. Both types of users will find this book inspiring and truly fun!

One of the secret pleasures of a technical book reviewer is finding those wonky bits of code that suffer from misplaced or missing punctuation, misspelled words and other basic typographic errors inherent in the book publishing process. I randomly selected many of these scripts to try out in the process of doing this review and...dang, haven't found any errata yet. But be sure to check out the errata page on Dave Taylor's web site for any that more astute readers may find (there were none, as of this writing).

Also be sure to take a closer look at Dave's shell script library, which lists additional scripts that didn't make the cut for the book. As convenient as it is to download the entire script library, I would like to stress the value of buying the book: it provides invaluable instruction and guidance in understanding the syntax of the scripts, and it illustrates how making small but significant tweaks can modify the output to match your specific needs.

(A special nod of appreciation to Dave Taylor's Tintin references!)


You can purchase Wicked Cool Shell Scripts - 101 Scripts for Linux, Mac OS X, and UNIX Systems from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

This discussion has been archived. No new comments can be posted.

Wicked Cool Shell Scripts

  • by prgrmr ( 568806 ) on Wednesday March 10, 2004 @03:05PM (#8523594) Journal
    Get a copy of Windows Admin Scripting Little Black Book [amazon.com], or something similar. I got a copy of the first edition at Borders for $5, you may find similar on ebay or half.com.
  • Quick Hacks (Score:5, Informative)

    by frodo from middle ea ( 602941 ) on Wednesday March 10, 2004 @03:06PM (#8523604) Homepage
    My two-cent tips for budding shell script authors.

    If the script is not working as you want, put a

    set -x
    on the first line and
    set +x
    on the last line.

    You will see the exact execution path and variable expansion; very neat for debugging.
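    A minimal sketch of what that looks like (the variable and strings here are made up): the trace goes to stderr, with each command printed after expansion and prefixed by "+".

```shell
#!/bin/sh
set -x                  # start tracing: commands echo to stderr as "+ cmd"
greeting="hello"
echo "$greeting world"  # trace shows the expanded form: + echo hello world
set +x                  # stop tracing
echo "done"
```

    Running this prints "hello world" and "done" on stdout, while the "+ greeting=hello" and "+ echo hello world" trace lines go to stderr.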

  • by Anonymous Coward on Wednesday March 10, 2004 @03:08PM (#8523627)
    Hmmm... I dunno... From the name of the book I'd have to say BASH (with some possible TCSH or ZSH thrown in for good measure)....
  • by Anonymous Coward on Wednesday March 10, 2004 @03:11PM (#8523662)
    You might want to read IMDb's robots.txt [imdb.com] before using wget.
  • Re:Quick Hacks (Score:1, Informative)

    by Anonymous Coward on Wednesday March 10, 2004 @03:12PM (#8523680)
    #!/bin/bash -x
    echo works well for the first line too
  • Re:Quick Hacks (Score:1, Informative)

    by Anonymous Coward on Wednesday March 10, 2004 @03:13PM (#8523692)
    And if your script has way too much output this way, you can add comments with the no-op operator, : , as follows:

    : THIS LINE DOES FOO

    the above line does nothing and is only printed if set -x is on...

  • Re:RTFT (Score:3, Informative)

    by KingOfBLASH ( 620432 ) on Wednesday March 10, 2004 @03:13PM (#8523695) Journal

    I RTFT. Shell scripts can be BASH, CSH, TSH, SH, KSH, the list goes on. That's if you're assuming by "shell" they don't mean all interpreted languages -- I've seen a number of Perl / Python / Whatever scripts put in the general category of shell scripts before.

    I am also curious what tools are assumed to be available to the user. There are a lot of programs available that are standard (i.e. wget), but is the author assuming a standard unix distribution, or does he say "if you have ___"?

  • Re:Quick Hacks (Score:5, Informative)

    by Stinky Cheese Man ( 548499 ) on Wednesday March 10, 2004 @03:15PM (#8523713)
    In bash, at least, you can do this even more simply with...

    sh -x scriptname

  • by brunson ( 91995 ) * on Wednesday March 10, 2004 @03:17PM (#8523730) Homepage
    Because Perl is hideously ugly. It's the vice grips of Unix, and just like vice grips in the real world, there's always a better tool for the job.

    http://www.python.org
    /me waits to be modded 'troll'
  • Re:Why shell? (Score:5, Informative)

    by dewie ( 685736 ) <dbscully AT gmail DOT com> on Wednesday March 10, 2004 @03:17PM (#8523734)
    Because:

    (i) Many people, like myself, don't know perl, and don't see the point in learning when shell scripts are perfectly adequate for their purposes.

    (ii) Sometimes it's just easier. viz. this quote [bash.org] from bash.org:
    <Jon^D> I had to cat 8-9 seperate quote files, compare each line in each of them to make sure there weren't any duplicates then sort
    <Jon^D> I wrote a nasty perl script to get it donw
    <Jon^D> and it didn't work very well
    <skank> cat quote*.txt |sort |uniq
  • Re:101 Prompts? (Score:5, Informative)

    by Kaimelar ( 121741 ) on Wednesday March 10, 2004 @03:18PM (#8523753) Homepage
    There needs to be a chapter on bash prompts. I have seen some slick prompts, displaying uptime, current directory size, time, battery power, etc. I'm pretty satisfied with a user@host:~, but I do like to put color in mine.

    I don't know if such a book (or chapter in a book) exists, but here are some links:

    Have fun...

  • by TMOLI 42 ( 202038 ) on Wednesday March 10, 2004 @03:18PM (#8523758) Journal
    I already know too many computer languages to take the time to learn a new one just to use a script
    Don't you really mean "I already know too many computer languages so learning another is not a problem"? I know maybe 10 or so, and I can't profess to know everything, but I think after you understand the basic concepts it doesn't matter what language it is in. Subprograms, objects, conditionals, and looping are the same concepts regardless of the language used; only the syntax is different. (Of course, there are always exceptions).
    since they varied the use of scripting languages, not everything resulted in something I could use.
    The point is the opposite of what you said. Some things resulted in things you could use, as opposed to nothing if you did not know the language used.
  • Re:Webmasters?? (Score:2, Informative)

    by itsabouttime ( 738917 ) on Wednesday March 10, 2004 @03:21PM (#8523783)
    Shell scripts are used by webmasters for exactly the purposes you have mentioned: mostly for managing files (particularly logs), starting and stopping servers and JVMs, creating backups, etc. Not for web users... I can't imagine anyone is suggesting that shell scripts be granted execute permission on a web server... I hope.
  • Re:101 Prompts? (Score:3, Informative)

    by deadlinegrunt ( 520160 ) on Wednesday March 10, 2004 @03:22PM (#8523791) Homepage Journal
    Try this [tldp.org].

    Amazing how much free information you can get when you look for it.

  • Not ubiquitous (Score:1, Informative)

    by Anonymous Coward on Wednesday March 10, 2004 @03:23PM (#8523808)
    sh and vi are ubiquitous, perl is not.

    For instance, FreeBSD recently took perl out of the base package. And perl wouldn't be my candidate for inclusion on a boot floppy.
  • by ncc74656 ( 45571 ) * <scott@alfter.us> on Wednesday March 10, 2004 @03:27PM (#8523854) Homepage Journal
    I could use some wicked cool batch files.

    Cygwin [cygwin.com] is your friend. For just one example, you can write a script that uses sed to extract information from the filenames of your mp3z and feed the results into id3ed to tack on an ID3 tag. Try doing that with a batch file.
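    The idea, roughly (the "Artist - Title.mp3" filename convention is assumed here, and the actual tagger invocation is left as a comment since id3ed's options should be checked in its own documentation):

```shell
#!/bin/sh
# Sketch: split "Artist - Title.mp3" filenames with sed and hand the
# pieces to an ID3 tagger. The tagger call itself is hypothetical.
for f in *.mp3; do
    [ -e "$f" ] || continue            # no mp3s in this directory
    artist=$(echo "$f" | sed 's/ - .*//')
    title=$(echo "$f" | sed 's/^.* - //; s/\.mp3$//')
    echo "tagging '$f': artist='$artist' title='$title'"
    # id3ed ... "$f"                   # consult id3ed(1) for real flags
done
```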

  • Re:Why shell? (Score:5, Informative)

    by Naikrovek ( 667 ) <jjohnson.psg@com> on Wednesday March 10, 2004 @03:28PM (#8523869)
    Usually the really important scripts are run by /bin/sh - a STATICALLY compiled binary of a shell that is pretty much everywhere.

    Why not dynamically compiled? Well, if your glibc barfs, all dynamically compiled binaries barf with it - including perl, and including any shells that you use to start up your system. With a statically compiled shell to handle all of those startup scripts you can boot Linux without glibc working, and you probably have enough of a system still running to get things fixed. With a dynamically linked startup file interpreter, when glibc or something glibc depends on goes, your whole system goes, single-user mode and all.

    Besides, while perl can execute system commands and make decisions based on input, I think the shell is a better tool for things like this. sh and bash were designed to do startup scripts (among other things) and they do them well. Why fix what isn't broken? Shell scripts work, and they can do anything you'd need them to do during startup.
  • Re:Quick Hacks (Score:1, Informative)

    by Anonymous Coward on Wednesday March 10, 2004 @03:34PM (#8523926)
    That, and when the exec bit is set on the shell script and the script is directly run as # ./shellscript.
  • Re:Hmm. (Score:3, Informative)

    by ncc74656 ( 45571 ) * <scott@alfter.us> on Wednesday March 10, 2004 @03:36PM (#8523942) Homepage Journal
    You mean:

    #!/bin/sh
    lynx -dump 'http://imdb.com/title/tt0151804/'

    :-) Much easier to read.

    Easier for people to read, but if you were dumping the results into another program/script, the raw HTML might be easier to parse. (Then again, you can just change -dump to -source to have Lynx dump raw HTML instead of formatted text, in case you don't have wget.)

  • Re:101 Prompts? (Score:2, Informative)

    by Otter ( 3800 ) on Wednesday March 10, 2004 @03:38PM (#8523964) Journal
    This was also an Ask Slashdot a bunch of years ago...OK, fine, I'll find it [slashdot.org]...
  • Re:Hmm. (Score:3, Informative)

    by bfg9000 ( 726447 ) on Wednesday March 10, 2004 @03:40PM (#8523993) Homepage Journal
    Just snagged all the scripts in one file with a wget script too.

    wget -c http://www.intuitive.com/wicked/scripts/AllFiles.tgz

    Very cool.
  • by Komi ( 89040 ) on Wednesday March 10, 2004 @03:41PM (#8524004) Homepage
    I've read through the tcsh man pages and stolen from other people, and probably the least-known, most useful trick I've found is pushd and popd (which I re-alias to pd and po), and of course directory stack substitution. Here's a snippet of code that's really useful:
    alias pd pushd
    alias po popd
    cd /incredi/bly/long/path/name
    pd /some/other/incredi/bly/long/path/name
    cp *.mp3 =1 # =1 is the first entry on the dirstack
    po # returns you back to first place
    The other major time savers I use are sed and awk, each for a specific purpose: sed works great for substitution, and awk I use to grab columns of data. Here's a sample of how I'd use both together. This will list the home directories of the users on a machine. It's simple, but there's a ton you can do with this technique.
    who | awk '{print $1}' | sort | uniq | sed 's@^@/home/@g'

    Here's other stuff I have grouped by sections in my .cshrc

    First, I have my shell variables. The comments say what they do. The most important one is autolist.

    set autolist # automatically lists possibilities after ambiguous completion
    set dunique # removes duplicate entries in the dirstack
    set fignore=(\~) # files ending in ~ will be ignored by completion
    set histdup=prev # do not allow consecutive duplicate history entries
    set noclobber # output redirection will not overwrite an existing file
    set notify # notifies when a job completes
    set symlinks=ignore # treats symbolic directories like real directories
    set time=5 # processes that run longer than $time seconds will be timed.

    Second, bindkeys are pretty neat. I rebind the up and down arrow keys. By default they scroll up and down one at a time through the history. You can bind them to search the history based on what you've typed so far.

    bindkey -k up history-search-backward # up arrow key
    bindkey -k down history-search-forward # down arrow key

    Third, completes allow for customizing tab completion. When I change directories, tab only completes directory names. This also works for aliases, sets, setenvs, etc.

    complete cd 'p/1/d/'
    complete alias 'p/1/a/'
    complete setenv 'p/1/e/'
    complete set 'p/1/s/'

    Fourth, I have all my aliases. I had to cut a bunch because of the lameness filter.

    alias cwdcmd 'ls'
    alias precmd 'echo -n "\033]0;$USER@`hostname` : $PWD\007"'
    alias pd 'pushd'
    alias po 'popd'
    alias dirs 'dirs -v'
    alias path 'printf "${PATH:as/:/\n/}\n"'
    alias ff 'find . -name '\''\!:1'\'' -print \!:2*'
    alias aw 'awk '\''{print $'\!:1'}'\'''
    alias sub 'sed "s@"\!:1"@"\!:2"@g"'
  • set -eu (Score:3, Informative)

    by jlusk4 ( 2831 ) on Wednesday March 10, 2004 @03:47PM (#8524058)
    I like this one, too.

    -e: bomb out immediately when a command exits w/nonzero status

    -u: bomb out when de-referencing uninitialized variables ("var=" counts as initialization).
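    A small illustration of both flags in action (the script names are throwaway files written to /tmp):

```shell
#!/bin/sh
# Demonstrate -e and -u with throwaway scripts (illustrative).
cat > /tmp/demo_eu.sh <<'EOF'
#!/bin/sh
set -eu
var=                                  # "var=" counts as initialization
echo "var is initialized: '$var'"
false                                 # -e: nonzero status stops the script
echo "never reached"
EOF

cat > /tmp/demo_u.sh <<'EOF'
#!/bin/sh
set -u
echo "$no_such_var"                   # -u: expanding an unset variable aborts
EOF

sh /tmp/demo_eu.sh || echo "aborted with status $?"
sh /tmp/demo_u.sh 2>/dev/null || echo "aborted: unbound variable"
```

    The first run prints "var is initialized: ''" and then "aborted with status 1"; the second aborts before its echo ever produces output.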

    John.
  • by nitehorse ( 58425 ) <clee@c133.org> on Wednesday March 10, 2004 @03:59PM (#8524167)
    I love tcsh.

    One of the things that a lot of people don't realize is that you can still set the colors and such in tcsh just like in bash - only the syntax is different. Here's how I have mine set on one of the machines I log into:

    set prompt = "%{^[[032;1m%}`whoami`%{^[[0m%} %c3 %B%#%b "

    Good stuff.
  • Book Recommendation (Score:5, Informative)

    by Neil Watson ( 60859 ) on Wednesday March 10, 2004 @03:59PM (#8524169) Homepage
    If you like scripting and all things Unix I highly recommend Unix Power Tools [oreilly.com]. I bought a copy last month. All the things about Unix that could not necessarily fill an entire book on their own, nicely packaged together.
  • by elmegil ( 12001 ) on Wednesday March 10, 2004 @04:01PM (#8524186) Homepage Journal
    perl thrashes and crashes trying to crunch seriously large amounts of data.

    No, your pathetically puny little machine thrashes and crashes trying to crunch seriously large amounts of data with that script you threw together in 15 minutes while ignoring all the comments in the Camel Book about performance implications of things like "@array = <FILE>". Perl is written in C, and runs just fine when used appropriately.

  • by denis-The-menace ( 471988 ) on Wednesday March 10, 2004 @04:02PM (#8524191)
    Save your dime and go here.
    http://www.robvanderwoude.com/ [robvanderwoude.com]
  • Cool scripts (Score:4, Informative)

    by MarkSfromAR ( 307115 ) on Wednesday March 10, 2004 @04:03PM (#8524204)
    Another good book is The Unix and X Command Compendium [amazon.com]. It shows shell commands and explains what they do. A very good Unix reference book.
  • Tab completion (Score:3, Informative)

    by Spy Hunter ( 317220 ) on Wednesday March 10, 2004 @04:03PM (#8524210) Journal
    In Debian, the Bash package comes with a totally awesome collection of customized tab completions. For some reason, they are not turned on by default. To turn them on in a single account, you can put the line "source /etc/bash_completion" in your ~/.bashrc file, or you can turn them on globally by editing the /etc/bash.bashrc file and uncommenting the relevant lines. You'll get magic smart tab completion for cd, apt-get, ssh, mplayer, and bajillions of other programs, and you'll wonder how you ever did without it. apt-get tab completion in particular rocks like nothing else. For example, if you type "apt-get remove x[TAB]" you'll get a complete list of installed packages starting with x. When installing, you'll get a list of available but not yet installed packages. I can't stand using apt-get without tab completion anymore.
  • Re:Why shell? (Score:3, Informative)

    by mst76 ( 629405 ) on Wednesday March 10, 2004 @04:03PM (#8524211)
    No. You do not have to settle for less. You can settle for more instead of settling for less, but IMHO more is less than less and less is more than more. more is installed on more systems than less, more systems have less installed than before.
    Why settle for less, if you can settle on most [jedsoft.org]?
  • by zonker ( 1158 ) on Wednesday March 10, 2004 @04:05PM (#8524225) Homepage Journal
    I know you're joking, but try WilsonWindowWare's (remember them from the old days?!) WinBatch [winbatch.com]. If you can get around the pricetag for the compiler ($99, not too bad), you'll find a really cool utility...
  • by cperciva ( 102828 ) on Wednesday March 10, 2004 @04:09PM (#8524266) Homepage
    ... is FreeBSD Update [daemonology.net]. 700 lines of shell code to fetch, install, and rollback security updates to an entire operating system.
  • Yes (Score:3, Informative)

    by Kjella ( 173770 ) on Wednesday March 10, 2004 @04:12PM (#8524301) Homepage
    Yes - but only the day after you buy it.

    Kjella
  • /usr and / (Score:3, Informative)

    by dr-suess-fan ( 210327 ) on Wednesday March 10, 2004 @04:13PM (#8524310)

    Actually, we decided here (a fairly large installation) long ago to merge / and /usr.

    Our main reason was simplification and this allows us the benefit of not worrying what is in /usr/bin and what is in /bin. (Actually, on Solaris, /bin is just a link to /usr/bin ). Everything in /usr and / should not be touched anyhow except through the normal pkg management tools. We do of course maintain a separate /usr/local.

    The main disadvantage is that an fsck would take longer because / is now a large filesystem. With the journaled filesystems of today, we don't see the concern. The other benefit is that we don't need to worry about sizing / and /usr independently and running out of space in /usr when / still has lots (or vice versa).

    Interestingly enough, /usr/openwin used to be a separate FS on SunOS long ago. The main reasoning was that disks back then were small and you simply didn't have room to place /usr/openwin in the /usr/ filesystem.

    I would guess that others have merged /usr and / too but I understand why it's a bit of a controversial topic.

    I do take issue with some of the defaults that some Linux distros use, where /var and / are merged as well. In fact, the whole darn OS is in /. The need for a separate /var should never go away IMO.

  • Re:Why shell? (Score:2, Informative)

    by chad_r ( 79875 ) on Wednesday March 10, 2004 @04:21PM (#8524422)

    Useless Use of Cat [wikipedia.org]:
    sort quote*.txt | uniq

    Heh, also a useless use of uniq. Try sort -u quote*.txt

    I use sort|uniq, sed -e, and find|xargs multiple times every day. But you will eventually hit the limits given enough experience, and then a mini-program is required. For example, if you want to retain the original order of lines as they are seen, with perl it's just:

    perl -e '@uniq = grep { ! $seen{$_} ++ } <>; print @uniq' quote*.txt
    (based on "4.6. Extracting Unique Elements from a List" from Perl Cookbook)

    This can be modified to filter unique lengths, or substrings, or patterns, just by changing $seen{$_} to $seen{lc $_} or whatever you need. If you want sorting the sort -u does, change "print @uniq" to "print sort @uniq". And hey, it's still on the command line!
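    For what it's worth, the order-preserving filter can also stay within the classic toolbox: the well-known awk idiom below does the same thing (shown here on inline sample data rather than quote files).

```shell
# Print each line only the first time it appears, preserving order:
# seen[$0]++ is 0 (false) on first sight, so !seen[$0]++ prints it once.
printf 'b\na\nb\nc\na\n' | awk '!seen[$0]++'
```

    This prints b, a, c: duplicates are dropped, but the first-seen order is kept, exactly like the grep-on-%seen Perl version.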

  • by iCat ( 690740 ) on Wednesday March 10, 2004 @04:22PM (#8524446)
    perl thrashes and crashes trying to crunch seriously large amounts of data

    Yep, that's why it did this [bioperl.org].
  • by Anonymous Coward on Wednesday March 10, 2004 @04:25PM (#8524497)
    alias pd pushd
    alias po popd
    cd /incredi/bly/long/path/name
    pd /some/other/incredi/bly/long/path/name
    cp *.mp3 =1 # =1 is the first entry on the dirstack
    po # returns you back to first place


    cd /some/directory/
    cd /another/directory/
    cp *.mp3 ~-
    cd ~-
  • by MasterLock ( 581630 ) on Wednesday March 10, 2004 @04:33PM (#8524577)
    Two of my most handy aliases (tcsh and 4DOS/4NT) are:

    alias mcd 'md \!*; cd \!*'
    alias rcd 'setenv OLD_DIR `pwd`;cd ..;echo $OLD_DIR;rd "$OLD_DIR"; unsetenv OLD_DIR'

    Usage:
    ~/> mcd junkDir
    ~/junk> -- do commands, unzip files, et cetera --
    ~/junk> rcd
    ~/> -- back where you were and the dir is gone --
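    Rough Bourne-shell counterparts, for the bash crowd (a sketch; the function names just mirror the aliases, and error handling is kept minimal):

```shell
# mcd: make a directory and cd into it.
mcd() { mkdir -p "$1" && cd "$1"; }

# rcd: leave the current directory and remove it (rmdir, like rd,
# only succeeds if the directory is empty).
rcd() { _old=$(pwd); cd .. && echo "$_old" && rmdir "$_old"; }
```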

  • by Anonymous Coward on Wednesday March 10, 2004 @04:54PM (#8524822)
    "Wicked cool" and "Windows" just don't go together....

    Aw, c'mon, what about all those wicked cool games you can play on it? Windows sucks, but it's not totally useless...

  • by pclminion ( 145572 ) on Wednesday March 10, 2004 @04:55PM (#8524830)
    Those interested in testing this should probably set 'ulimit -u 10' or some such thing...

    But yes, it is a shell fork bomb.

  • by Anonymous Coward on Wednesday March 10, 2004 @04:56PM (#8524840)
    A great book for Windows is Que's "Windows XP Under the Hood", by Brian Knittel. It's divided into two parts: (1) all about making the most of the CMD.EXE batch language (which has come a looong way since DOS or Win95 and is seriously much more powerful than you probably suspect), and (2) using various scripting languages (mostly VBScript, but also enough to get you started with JScript, ActivePerl, ActivePython, and REXX) with the Windows Script Host (WSH) and making it do all kinds of cool stuff with the many COM/OLE automation objects that come with WinXP. It's well-written and covers the material from the beginning, suitable for readers who preferably have some programming experience but are totally new to the whole Windows scripting and COM/OLE thing.
  • Now run it. (Score:2, Informative)

    by Chris Burke ( 6130 ) on Wednesday March 10, 2004 @05:01PM (#8524910) Homepage
    $ uname -a
    SunOS 5.8 Generic_108528-27 sun4m sparc SUNW,SPARCstation-20
    $ which killall
    /usr/sbin/killall


    Good for you!

    Now type:
    $ /usr/sbin/killall

    and post the results after you're done logging back in! :)

  • But Asterix is evil (Score:3, Informative)

    by Götz ( 18854 ) <`ten.xmg' `ta' `khcsaw'> on Wednesday March 10, 2004 @05:04PM (#8524942) Homepage
    Don't you remember the Mobilix case? This web site about Unix on mobile computers was sued by the publishers of Asterix, as their name was too similar to that comic character. They've lost. They had to move the site to Tuxmobil.org [tuxmobil.org].
  • Re:Why shell? (Score:3, Informative)

    by saforrest ( 184929 ) on Wednesday March 10, 2004 @05:11PM (#8525048) Journal
    So then you go with

    #!/usr/bin/perl
    system("command 1");
    system("command 2");
    $some_returned_value = `command 3`;
    system("command 4");

    Of course, you still have to bother with Perl variables. But it should still be possible to do it in 4 lines.

    And if he thinks that's a hack, well, Perl is one huge freaking hack. To quote Larry Wall:

    "The Amulet isn't exactly beautiful though--in fact, up close it still looks like a bunch of beads melted together. Well, all right, I admit it. It's downright ugly. But never mind that. It's the Magic that counts."
  • cd with history (Score:5, Informative)

    by Rick Richardson ( 87058 ) on Wednesday March 10, 2004 @05:14PM (#8525087) Homepage

    The shell helper that I am totally lost without is one that adds directory history to bash and ksh. You can find it here: _cd [rr.com]

    # Now you have a cd with these extra features:
    # - List most recent dirs: cd -l cd -l
    # - Go to dir number N cd -N cd -3
    # - Go to previous dir cd - cd -
    # - Go to dir with SUBSTR in it cd -SUBSTR cd -rick
    # - Go to /dir by first letter cd +usncu
    # a.k.a. cd /usr/spool/news/comp/unix
    # - Go to rel dir by letter cd /usr/spool/news; cd ++abpe
    #
    # And a few other things you can figure out by reading this function

    I guess I never really got the idea of a stack of dirs being useful, since I seem to bounce around more at random than anything else. I prefer to have a cache of places I've recently been.

    Bonus puzzle for slashdot readers: using the cd with history function, what directory is this command likely to take me to?

    cd +usnabpe
  • by Osty ( 16825 ) on Wednesday March 10, 2004 @05:27PM (#8525236)

    Exactly. What makes shell scripts so much better on Unix isn't really the shell, it's the flexibility of the programs that come with a Unix.

    That's true, but the fallacy is that you're assuming Windows should be scripted in the same fashion as *nix. That's simply not the case. Batch/Command scripting is nice for small bits, and can actually be fairly powerful in an obtuse sort of way, but the real power in automating Windows comes by using the Windows Scripting Host, JScript or VBScript, and all of the ActiveX/COM interfaces into the functionality of the OS and other applications. A classic example is iterating through users. In *nix, you write a shell script to parse through /etc/passwd. In Windows, you write a jscript to instantiate the objects that deal with Active Directory, and iterate through user objects (each of which you can perform actions upon, whereas in *nix you'd have to invoke other applications). One approach is not necessarily "better" than the other, but you can't assume that your *nix administration experience will directly translate into Windows administration. You'd laugh if a Windows admin felt the reverse was true. What really gets me is when people complain about Windows not being automation-friendly because they're used to *nix scripting. Yes, you cannot pipe notepad.exe into winword.exe, for example, but Word has a very rich automation interface that you can hook into and use from a simple JScript.


    Take the DELETE command. It has trouble deleting multiple files at a time. It can't delete directories. Then look at Unix rm. It's easy to see why batch files are a joke.

    What? Try running "help del" from a cmd.exe window some time. Also, look at "help rd". If you want to remove a directory tree, you use the "remove directory" command. "del" deletes files. "rd" deletes directories (and can delete files within directories if you tell it to).


    The shell itself is definitely more flexible overall, though. Definitely more scriptable. The Bourne way of doing conditions, loops, pipes and whatnot are definitely more intuitive, more flexible, and carry less baggage than command.com or cmd.exe.

    Consider cmd.exe to be the functional equivalent of csh. It's a decent interactive shell, and has some good functionality (especially later versions of cmd.exe in win2k and xp), but you'd have to be nuts to do any extensive scripting with it. Just as you'd pull out bash or perl to do more complex tasks in *nix rather than using csh, you should use WSH in Windows for more complicated tasks.

  • by lysium ( 644252 ) on Wednesday March 10, 2004 @05:43PM (#8525400)
    Why buy a book on shell scripting? Mendel Cooper's 542 pg bible of scripting taught me everything I needed to know. It is a free download, found here [ilug-bom.org.in]. You can find it in an easy-to-print PDF as well.

    From the site:
    This tutorial assumes no previous knowledge of scripting or programming, but progresses rapidly toward an intermediate/advanced level of instruction (...all the while sneaking in little snippets of UNIX wisdom and lore). It serves as a textbook, a manual for self-study, and a reference and source of knowledge on shell scripting techniques. The exercises and heavily-commented examples invite active reader participation, under the premise that the only way to really learn scripting is to write scripts.

  • by Anonymous Coward on Wednesday March 10, 2004 @05:58PM (#8525614)
    Perhaps pipe the sed output into:
    tr -d '\012'

    If you have traditional Mac end of lines:

    tr -d '\015'
  • by freelunch ( 258011 ) on Wednesday March 10, 2004 @06:02PM (#8525663)
    On the subject of 'cool shell scripts', converting your awk to C and compiling it is pretty damn cool.

    The performance improvement was about 6-7X on my project.

    awka [freshmeat.net] does that.

    On my project it took less time to convert the awk to C, gcc the C and run the binary than it did to run the perl version.

    This is not a perl flame. I am old. I use awk.
  • by Rick Richardson ( 87058 ) on Wednesday March 10, 2004 @06:08PM (#8525730) Homepage

    Well, I think Silent Bob (garcia) is directly attacking me as the author of these fine tools: Linux tools for geocaching [rkkda.com]

    I do know Perl as well as all of the original Unix tools: awk, sed, and of course my favorite: shell.

    The short and main explanation is that shell/awk/sed lend themselves very well to what is known as the "Unix tools" approach. It's a way of thinking using a small set of core tools that pays big rewards in productivity.

    In the case of my geocaching tools, two things were plainly obvious to me at the start: 1) I would be scraping the pages with curl, because there is no better or easier tool for that job, and 2) gpsbabel would be a main part of almost every tool because it knows how to work with a bajillion waypoint formats. So the only question after that was which language to use to glue those commands together. Shell, awk, and perl can all do that. I used shell to tie it all together because that is what shell is best at.

    In a few of the tools, geo-map in particular, I did make a mistake in the glue choice. My excuse there is that it evolved far beyond its original design goals. So it ended up requiring a lot of floating point calculations and therefore I had to run several mini-awk scripts within it. If I were to rewrite it today, I would make it a pure AWK script. Why not perl? Because, IMHO, awk has the cleanest syntax of any of the scripting languages.
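
    As an illustration of the floating-point territory awk covers well (the coordinates and Earth radius here are my own example, not code from geo-map): a great-circle distance computed entirely in a BEGIN block.

```shell
# Haversine distance in pure awk. awk has no asin(), so derive it
# from atan2(). Points: (0,0) and (0,1) -- one degree of longitude
# on the equator, which should come out near 111 km.
awk -v lat1=0 -v lon1=0 -v lat2=0 -v lon2=1 'BEGIN {
    pi   = atan2(0, -1)          # pi, the awk way
    rad  = pi / 180
    dlat = (lat2 - lat1) * rad / 2
    dlon = (lon2 - lon1) * rad / 2
    a = sin(dlat)^2 + cos(lat1 * rad) * cos(lat2 * rad) * sin(dlon)^2
    # asin(x) == atan2(x, sqrt(1 - x^2)); 6371 km is the mean Earth radius
    printf "%.1f km\n", 2 * 6371 * atan2(sqrt(a), sqrt(1 - a))
}'   # → 111.2 km
```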

    So, then, when *would* I use perl? In general, I select perl when 1) there is a pre-written module that does a job that would be hard to do with shell/awk/sed, and 2) the use of that module is truly necessary. That second point is very important to me. The mere existence of a Perl module does not necessarily mean it's the best or fastest way to solve the problem.

    Perl was the language of choice for "Belle", which is a 4000 line IRC robot I coauthored for use in my daytrading activities. The IRC module was what tipped the scale for Perl in that case.

    Another problem with perl modules is that using them guarantees that you will lose some percentage of potential users of your program. Having to find and install additional packages puts many people off. I try to make my scripts completely self-contained (including usage documentation) so that people don't have to go through these hassles.

    Anyway, you can argue with any of my points, but what you can't argue with is that I have the largest set of working command-line tools for geocaching, regardless of what my language choice was.

    -Rick

  • by Osty ( 16825 ) on Wednesday March 10, 2004 @06:46PM (#8526161)

    One of the things that make Unix truly great is the possibility of piping one program's output into another. Use the full potential of what others have developed, don't reinvent the wheel.

    What do you do if you need more control than piping will allow? The difference is between working with data and working with objects. In *nix, you're piping data across processes that act upon that data. If you want to change the password for all of the users on your system, you're iterating through /etc/passwd and passing each username to passwd to make changes. In Windows, you're iterating over a collection of user objects, which have methods you can call to change passwords.

    You're not reinventing the wheel, because you're not implementing how that password is changed. (I've seen the wheel reinvented in Unix by people trying to encrypt passwords and write those values directly to /etc/passwd or /etc/shadow, bypassing passwd completely.) In fact, if you're not using automation objects in your Windows scripting, you're not using "the full potential of what others have developed."

    If you're writing a Unix application, you should make sure it's scriptable via piping. Similarly, if you're writing a Windows application, you should make sure it's scriptable via automation interfaces (IDispatch). One is the standard for *nix and one is the standard for Windows. Nobody is saying that *nix should ditch piping in favor of a COM-like architecture, so why should Windows ditch COM automation in favor of piping?
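
    The *nix side of that comparison can be sketched in a few lines (the uid cutoff and the file override are my own illustration, not a standard):

```shell
# Iterate the colon-delimited records of a passwd-format file and act on
# each user -- here we only list accounts, but the loop body could just
# as easily call passwd(1). PASSWD_FILE is overridable so the sketch can
# be tried on a sample file without touching the real /etc/passwd.
PASSWD_FILE=${PASSWD_FILE:-/etc/passwd}
while IFS=: read -r user pass uid gid gecos home shell; do
    # uid >= 1000 is a common (but not universal) cutoff for human users
    [ "$uid" -ge 1000 ] 2>/dev/null && echo "$user"
done < "$PASSWD_FILE"
```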


    It was after reading a book on ActiveX that I definitely quit on Windows programming.

    I'm curious why. Was it too difficult or obscure? Did you have architectural or security objections? Or were you so entrenched in your *nix ways that you couldn't grasp why you should use a different approach in Windows?

  • MinGW (Score:4, Informative)

    by mark_space2001 ( 570644 ) on Wednesday March 10, 2004 @06:57PM (#8526269)
    The good folks at MinGW.org [mingw.org] make a package called MSYS that includes most of the GNU utilities, running as native apps under Windows. I use it a lot; it's really handy when I want to download and open a bzipped tar file on Windows.

    The latest version of sh.exe is 465 KB; it sounds like you have an old version. You should upgrade it. :)

  • by plover ( 150551 ) * on Wednesday March 10, 2004 @07:38PM (#8526682) Homepage Journal
    While I agree with you that piping the output of one program into another to stack utility upon utility is a great feature of [c|k|ba]sh, I don't think you were paying attention to the parent post.

    The trick with Windows is that you can do many of these same things, but the power comes from doing it in WSH or VB (or C/C++, or ASP, or whatever language you're comfortable with; I've even done it in Perl). You use the COM interfaces of the shell object to enumerate through directory trees and files. You can stream each of those files into the COM interface of another program that accepts streams. You can search, you can pipe stuff all over, and you're not limited to a single instance of stdin, stdout, and/or stderr.

    It's not unlike shell scripting, it's just a different language. Each application is able to expose whatever it feels is most important in whatever fashion it thinks is best. DevStudio, for example, lets the scripting host user get to the workspace, the project, and any of the tools.

    The biggest problem I have with it is that stdio is not "guaranteed" to be supported by every application under Windows. stdio is the glue that binds all the UNIX utilities together. That's the beauty of stdio -- as the sole I/O mechanism for most tools, it became the de facto application interaction interface. Windows doesn't have that: most Windows apps don't offer any automated I/O at all, and some of the ones that do seem to have interfaces pasted on after the fact. But the ones that do expose properties and methods via COM are easy to access, and easy to control from anywhere. And using the interfaces tends to remove the ambiguities: in UNIX, if you're using 'cut' to parse a phone list but the name field sometimes contains commas, you end up hacking around the edge cases to make it work. A COM-based solution would provide an interface containing a Name field.
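
    That 'cut' ambiguity is easy to demonstrate (the phone-list lines are made up for illustration):

```shell
# A comma-delimited phone list parses cleanly -- until a name itself
# contains a comma, and the field numbers silently shift:
printf 'Jones,555-5678\n'      | cut -d, -f2    # → 555-5678, as intended
printf 'Smith, Jr.,555-1234\n' | cut -d, -f2    # → " Jr." -- not a phone number
```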

    Windows is not alone in this limitation, either. UNIX suffers from a similar problem: how do you meaningfully pipe data to and from an X window, or even to a curses app? Is it consistent between apps? Most apps I'm familiar with that offer such features had to add code to actively support a meaningful command-line interface, through the use of dozens of command-line switches. Without that code, using stdio to parse the output of a curses-based application becomes a tedium of screen scraping.

    Don't get me wrong: I have a bevy of UNIX-like command line utilities for Windows, I use Cygwin and bash when I need to (although the file system mapping is worse than I could have imagined), and I will fire up a CMD script long before I think to write it as a VB or C++ program. I'm far more comfortable with the sh-style tools -- I grew up with them.

    I'm not saying stdio is better or worse than using the COM interfaces of Windows; I'm just saying it's "different." And you certainly shouldn't be reinventing the wheel to script up utilities in Windows.

  • by 26199 ( 577806 ) * on Wednesday March 10, 2004 @09:54PM (#8527893) Homepage
    :(){ :|:&};:

    It reads: define a function named ':' that pipes the output of one call to ':' into another call to ':', running the pair in the background (i.e., forking); then invoke ':'. Each invocation spawns two more, so processes multiply until the system runs out of resources -- the classic fork bomb.

    I had no idea how it worked, either, but I looked it up :-)
  • by Futurepower(R) ( 558542 ) on Wednesday March 10, 2004 @10:37PM (#8528177) Homepage

    I've had problems getting technical support for WinBatch. That was a long time ago, maybe things have changed now. There were so many small and big problems that I stopped using WinBatch.

    I haven't checked out AutoIt [hiddensoft.com], which is apparently a free alternative. From the home page:

    "AutoIt is a simple tool that can simulate key presses, mouse movements and window commands (maximize, minimize, wait for, etc.) in order to automate any windows based task (or even windowed DOS tasks)."
