
Adding Some Spice To *nix Shell Scripts 411

An anonymous reader writes "Developing GUI script-based applications is time-consuming and expensive. Most Unix-based scripts run in CLI mode or over a secure ssh session. The Unix shells are quite sophisticated programming languages in their own right: scripts are easy to design and quick to build, but they are no more user-friendly than the Unix commands themselves (see the Unix-Haters books). Still, bash and the standard Unix toolset provide everything needed to build powerful, interactive, user-friendly scripts that run under the bash shell on Linux or Unix. What tools do you use to spice up your scripts on the Linux or Unix platforms?"
This discussion has been archived. No new comments can be posted.

  • Pashua on OS X (Score:3, Informative)

    by iliketrash ( 624051 ) on Monday April 19, 2010 @07:19PM (#31903940)

    On OS X, I use Pashua, http://www.bluem.net/en/mac/pashua/ [bluem.net]. This is a brilliantly simple thing to use. I also use it for other (non-script) languages for making a quick-and-dirty GUI that still looks nice and is a real Cocoa program.

  • by hkz ( 1266066 ) on Monday April 19, 2010 @07:37PM (#31904130)

    I work as a Linux netadmin and system developer, so I do a lot of shell programming in (ba)sh. Here are some of the niftier things you can do in a shell script:

    - Make colored output with shell escape sequences. Works especially well with red type for error messages, makes them really stand out.
    - Use inline functions a lot. The great thing about them is that you can pipe to them and use them in all kinds of places. For instance, to do mysql queries:

    mysql_query() {
        /usr/bin/mysql --user=root --pass=topsecret database
    }

    echo 'SELECT * FROM accounts' | mysql_query

    - "Here documents". For long MySQL sequences, something like the following (reusing the mysql_query function from above):

    cat <<- EOF | mysql_query
          SELECT bar
          FROM foo
          WHERE baz ...
    EOF

    This lets you easily format stdin for scripts and other programs. Also really useful for outputting HTML and stuff like that. Best thing is that variables are expanded inside the block.

    - The || and && operators. Check if a file exists, remove if so, else complain:
    [ -f /tmp/somefile.txt ] && rm /tmp/somefile.txt || echo "Does not exist!"

    Also common in this form:
    [ -x /usr/bin/necessaryprogram ] || { echo "aaargh"; exit 1; }
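    One caveat worth flagging (not in the comment above): in `A && B || C`, the `||` branch also fires when B itself fails, not only when the test A is false. A sketch of the explicit if/else form, which states the intent unambiguously (the path is purely illustrative):

```shell
# If rm failed in the && || chain, the "Does not exist!" branch would
# run misleadingly; if/else keeps the two outcomes separate.
f="/tmp/somefile.txt"            # illustrative path
if [ -f "$f" ]; then
    rm "$f"
    result="removed"
else
    result="does not exist"
fi
```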

    - Making a "multithreaded" shellscript is also one of my favourites. Say, you want to start five virtual machines at the same time. Write a function that starts a vm, and call it a few times in a loop, backgrounding each instance with &, and saving their PIDs. Then have a "wait loop" that waits for the PIDs to exit the system (or for a timeout to occur).
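    A sketch of that pattern; `start_vm` here is a hypothetical stand-in for whatever actually boots a virtual machine:

```shell
# Fan out N background jobs, remember their PIDs, then wait for each.
workdir=$(mktemp -d)

start_vm() {
    sleep 1                          # pretend this is the slow boot step
    touch "$workdir/vm$1.done"       # marker so we can tell it finished
}

pids=""
for i in 1 2 3 4 5; do
    start_vm "$i" &                  # background each instance...
    pids="$pids $!"                  # ...and save its PID
done

for pid in $pids; do
    wait "$pid"                      # block until that instance exits
done
```

    A timeout variant would replace the plain `wait` with a loop that polls `kill -0 "$pid"` and gives up after a deadline.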

    - Proper argument handling with getopt. Have your script take "real" arguments in any order, just like real binaries.
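    A minimal sketch using the `getopts` builtin (the external getopt(1) additionally handles long options); the option letters and filenames are invented for illustration:

```shell
set -- -v -o result.txt input.txt    # sample command line for the sketch

verbose=0
outfile=""
while getopts "vo:" opt; do
    case "$opt" in
        v) verbose=1 ;;              # simple flag
        o) outfile="$OPTARG" ;;      # option taking an argument
        *) echo "usage: $0 [-v] [-o file] args..." >&2; exit 1 ;;
    esac
done
shift $((OPTIND - 1))                # what's left are positional args
```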

    This just scrapes the surface of the surface, of course. I learn new stuff every day.

  • by Hatta ( 162192 ) on Monday April 19, 2010 @07:43PM (#31904210) Journal

    Does `find . -print0 | xargs -0` really qualify as "serious hack magic"?
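    Probably not, but for reference, the pattern in question run against throwaway files; NUL delimiters survive spaces (and even newlines) in names:

```shell
workdir=$(mktemp -d)
touch "$workdir/plain.tmp" "$workdir/name with spaces.tmp"

# No filename can contain a NUL byte, so -print0/-0 is always safe.
find "$workdir" -name '*.tmp' -print0 | xargs -0 rm -f
```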

  • tools I found useful (Score:5, Informative)

    by tpwch ( 748980 ) <slashdot@tpwch.com> on Monday April 19, 2010 @07:48PM (#31904264) Homepage

    Here are some random things I find useful, related to user interaction (mostly because they notify the user):

    Oven timer:
    sleep $((20*60)); xmessage "Dinner is done"

    Quick macro for automating some repetitive task in a program:
    xdotool type "something"; xdotool key Return; xdotool mousemove $x $y; xdotool click 1; (and so on)

    Copying a file to/from the clipboard (can also copy from a pipe, i.e. the output of any command). Faster than opening a text editor:
    xclip -in file

    Notifying me when some specific thing changed on a website:
    while true; do
        CHECKLINE="$(curl -s http://somewebsite.org/somepage.html | grep "currently undergoing maintenance")"
        [ -z "$CHECKLINE" ] && xmessage "somewebsite is open again" && exit
        sleep 120
    done

    Or just checking for changes in general (I use this for notifying me when something changed when tracking something I ordered, so I know the minute the package is ready to get picked up at the post office):
    while true; do
        CONTENT=$(elinks -dump 1 -dump-charset iso-8859-1 "http://someurl.com/track?id=someid")
        MD5=$(echo -n "$CONTENT" | md5sum -)

        [ "${MD5}" != "${OLD_MD5}" ] && {
            xmessage "$(printf "New action:\n\n${CONTENT}")"
        }
        OLD_MD5="${MD5}"
        sleep 120
    done

    If you don't want to interrupt what you're doing with a pop-up you can pipe it to osd_cat instead to have the text appear over whatever program you're currently working with. Adding a few beep; beep; beep; beep; is also a good way to get your attention if you're not paying 100% attention to your computer all the time.

  • by CFD339 ( 795926 ) <andrewp AT thenorth DOT com> on Monday April 19, 2010 @07:56PM (#31904350) Homepage Journal

    Linux Shell Scripting with Bash
    by Ken O. Burtch
    Sams Publishing

    One of only two "computer" books I've ever been able to just sit down and read rather than just using as reference (the other being Kathy Sierra's "Head First Java" -- which is amazing).

    Ken does a fantastic job of putting "just the right" level of background, detail, context, and depth for someone new to shell scripting to get started, then to use the book as a reference for all the traditional tools (sed, awk, etc.).

    I've bought two copies, one for me and one I gave to someone else who wanted to learn how to do this stuff.

  • by DaleGlass ( 1068434 ) on Monday April 19, 2010 @08:04PM (#31904426) Homepage


    tar tf file.tar | xargs -d "\n" rm

    That will work unless the filenames contain newlines in them.

  • Re:Nice example. (Score:4, Informative)

    by hkz ( 1266066 ) on Monday April 19, 2010 @08:06PM (#31904450)

    Thanks, it's nothing I couldn't show a fella. Learnt a lot from my colleagues and from the O'Reilly 'Unix Power Tools' book. The Advanced Bash Shell-scripting Guide is pretty good (but chaotic) too.

    The syntax filter here munged some of the examples, though. The here document example will not work as-is, because there should be two 'less-than' signs in front of the minus sign. The mysql_query function probably also won't work (can't be bothered to run a test), because the newline after the first bracket mysteriously disappeared. So best to look up the concepts in some kind of reference manual.

  • "Interaction" (Score:5, Informative)

    by betterunixthanunix ( 980855 ) on Monday April 19, 2010 @08:27PM (#31904612)
    The way I look at it is this: the "interaction" may actually be with another script. The whole abstraction that Unix-like OSes enforce, at least with file based IO, is that it is irrelevant what is on the other side of a file descriptor -- a disk, a pipe, a user, a socket, or something else entirely.

    Of course, this all starts to break down with GUIs.
  • ksh (Score:2, Informative)

    by Fatal Darkness ( 18549 ) on Monday April 19, 2010 @08:27PM (#31904616)
    For Unix shell scripting purposes (and I know the Slashdot crowd may scoff at this, but) nothing compares to ksh. It has many features not found in Bash and most other shells, such as coprocesses, associative arrays, compound variables, floating-point arithmetic, discipline functions, etc. It's also fully extensible and POSIX-compliant. For GUI scripts, almost all commercial Unixes include dtksh [linuxjournal.com], which provides access to much of the Xt and Motif APIs. A Tk version of ksh also exists.

    KSH just gets a bad rep because Unix vendors insist on only supplying an ancient version (ksh88), or its clone (pdksh) that lacks all of the functionality and behavior of the original. As a result most people have never used a modern version of the shell.

    Of course, there's a right tool for the right job. Depending on the nature of the task, one might also want to consider Perl, Python, or some other scripting technology.
  • by actionbastard ( 1206160 ) on Monday April 19, 2010 @08:52PM (#31904792)
    The only bash scripting guide you will ever need:

    http://tldp.org/guides.html [tldp.org]

    free as in beer.
  • by Darinbob ( 1142669 ) on Monday April 19, 2010 @09:02PM (#31904906)
    Because people want to use illogical syntax like "copy this file that directory" and have the shell figure out if this is referring to 4 different file names, or 3, or 2, or 1. Well actually they want to do slightly more sane things like "copy $file $dir" from a script, while expecting the shell to be smart enough to realize that the spaces in a command line are delimiters while at the same time the spaces in a variable's contents are not delimiters.

    Some command languages handle this OK (e.g., DCL, DOS) because they have each individual command parse its own command line or expand variables. But sh expands all arguments before passing them to the commands, which is a logical and consistent way of processing arguments, and there's nothing broken about it. Making all commands uniform makes for easier shell scripting. Sure, it means that sometimes you have to do extra work when you've got spaces in file names, but that isn't the fault of 'sh' but the fault of whoever shoved spaces into the file names in the first place.

    Like any programming language so far, you're always going to bump into "do what I mean not what I wrote" problems.
  • by relinked ( 1613665 ) on Monday April 19, 2010 @09:10PM (#31904978)

    Here's a neat trick to access the output of commands as file handles:

    diff <( echo 'hello') <( echo 'world')

    Now that I've got your attention ;) I'll take this opportunity to plug my open source bash libraries:

    bash-script-lib [sourceforge.net], a collection of scripts that let you augment your own scripts with advanced capabilities:

    1. "script-input", which lets you create "cat"-like input handling that can accept both forms "my-script filename" and "cat filename | my-script".
    2. "script-targets", a framework for creating scripts that accept single or multiple "build-like" targets. You program just the targets; the framework takes care of the rest.
    3. "filesystem", a collection of functions for normalizing paths, checking the existence of directories, etc.
    4. "backups", a collection of functions for finding files, paths, and latest versions of files from amongst multiple tar files.
    5. "display", a collection of functions for tabulating output, converting end-of-line-delimited output into arrays, etc.

    bash-sys-manage [sourceforge.net], a collection of scripts that lets you manage VPS instances by installing components and backing up and restoring discrete aspects of a server. E.g.:

    install.sh system.apt system.locale system.users system.nginx nginx.config packages.utils.base packages.utils.build php.package php-fm.build apc.package memcache.package

    backup.sh system.users system.config mysql.database

  • by vtcodger ( 957785 ) on Monday April 19, 2010 @09:19PM (#31905046)

    ***The shell is a poor clone of 1950's algol.***

    "poor clone" seems entirely too generous.

    ***Today, scripting in Ruby or Python ...***

    Yes. I use Python myself for any script longer than a couple of lines. But interfacing Python to the unix OS is messier than it looks on the surface. For example, if your script hangs around for days and launches a lot of processes you will find that there are armies of "defunct" Zombie processes cluttering up the system. They don't go away until the script is killed. That's fixable, but figuring out how to fix it is an adventure for those of us who are not OS experts.

  • by vgaphil ( 449000 ) on Monday April 19, 2010 @10:53PM (#31905790)

    You can manipulate the IFS variable if you are having problems with files with spaces.


    IFS=$'\n'
    for i in $(find /some/dir -type f); do
        echo "$i"
    done

    unset IFS

  • by Bruce Perens ( 3872 ) * <bruce@perens.com> on Monday April 19, 2010 @11:49PM (#31906150) Homepage Journal
    This is actually the fault of the underlying operating system, not Python. The zombies are hanging around so that you can get their exit status. The problem is that this uses more than the few bytes necessary, and crowds up the process tables.
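    The same mechanism is visible from the shell: a finished background child lingers as a zombie entry until the parent collects its exit status. A tiny sketch:

```shell
true &               # child exits almost immediately
pid=$!
sleep 1              # meanwhile the dead child sits in the process table
wait "$pid"          # collect the status; the kernel can now reap it
status=$?
```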

    I haven't looked at how Ruby 1.9 garbage-collects a thread, but it doesn't look as if you have to join it to make it go away. 1.8 did not use OS threads, but just switched tasks when I/O blocked.

  • Re:Handling spaces (Score:3, Informative)

    by Qzukk ( 229616 ) on Tuesday April 20, 2010 @12:17AM (#31906318) Journal

    something like "for file in *" in bash will ignore special characters and run the loop once for each actual file.

    bash's for loop understands * as a special case. if you need something like "for file in $(find ...);" you'll get one loop per word again. Also, even when you get one loop per file, you still have to quote $file when you use it because bash parses arguments to the command after variable substitution, so something like touch $file when $file is foo bar becomes touch foo bar where foo and bar are separate arguments, rather than what most people would expect (that the value of $file would be passed to the command as a single argument)
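    The splitting described above, made concrete in a throwaway directory (the filename is invented):

```shell
workdir=$(mktemp -d)
cd "$workdir" || exit 1

file="foo bar"       # one logical name containing a space
touch $file          # unquoted: splits into two arguments -> files foo and bar
touch "$file"        # quoted: one argument -> a single file named "foo bar"
```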

    Removing space from $IFS (the "standard" way is to make it tab and newline: IFS=$(echo -en "\n\b") ) fixes many of these quirks.

  • by Bruce Perens ( 3872 ) * <bruce@perens.com> on Tuesday April 20, 2010 @01:59AM (#31906854) Homepage Journal
    It might be that Matz doesn't have the right hardware? Unfortunately, running Electric Fence on the Ruby interpreter won't work for any significant program: at one live page and one dead page per allocation it fills up swap, and it thrashes the cache because all allocations are aligned, so it's too slow. But valgrind would probably work. If the bug still exists in the 1.9.2 snapshot, it's worth doing that.
  • Re:None! (Score:1, Informative)

    by Anonymous Coward on Tuesday April 20, 2010 @03:56AM (#31907298)
    This is too awful. If the cd command fails, the rm will remove the current directory. Argh! This style of writing gives me shivers down my back.

    At my work I struggled to explain that doing tar and gzip in two steps is very bad practice for producing tar.gz files.
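    A sketch of the single-step forms (paths invented): the archive is compressed as it is written, so no uncompressed .tar ever lands on disk.

```shell
workdir=$(mktemp -d)
mkdir "$workdir/data"
echo hello > "$workdir/data/file.txt"
cd "$workdir" || exit 1

tar cf - data | gzip > data.tar.gz   # portable pipeline form
tar czf data2.tar.gz data            # GNU tar's -z does the same in one command
```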
  • by Anonymous Coward on Tuesday April 20, 2010 @05:11AM (#31907602)
    You can also use "sleep 30m".
  • by Anonymous Coward on Tuesday April 20, 2010 @07:34AM (#31908162)

    Your problem is: thinking find for simple cases :)

    rm -f *.{temp,tmp,junk}

  • Use:

    tar -t --null -f file.tar | xargs -0 rm

    That will work unless the filenames contain nulls in them. They won't.
