Adding Some Spice To *nix Shell Scripts

An anonymous reader writes "Developing GUI script-based applications is time-consuming and expensive. Most Unix-based scripts run in CLI mode or over a secure ssh session. The Unix shells are quite sophisticated programming languages in their own right: scripts are easy to design and quick to build, but, like the Unix commands themselves, they are not especially user-friendly (see the Unix-Haters books). Unix and bash do, however, provide tools for building powerful, interactive, user-friendly scripts that run under the bash shell on Linux or Unix. What tools do you use to spice up your scripts on the Linux or Unix platforms?"
This discussion has been archived. No new comments can be posted.

  • Pashua on OS X (Score:3, Informative)

    by iliketrash ( 624051 ) on Monday April 19, 2010 @06:19PM (#31903940)

    On OS X, I use Pashua, http://www.bluem.net/en/mac/pashua/. This is a brilliantly simple thing to use. I also use it for other (non-script) languages for making a quick-and-dirty GUI that still looks nice and is a real Cocoa program.

  • None! (Score:5, Insightful)

    by Anrego ( 830717 ) * on Monday April 19, 2010 @06:19PM (#31903944)

    I know this is troll-ish, but the way I view it, a script is just that... a script. A series of commands to be executed in a specific order, designed to automate a repetitive task. Basic logic, control, and input are generally OK, but interaction is, in my opinion, an indicator that your task is out of scope for a "script" and should become a full-fledged application.

    (you may now freely argue amongst yourselves on the difference between a script and an application)

    There are a metric ass-tonne of dialog-type apps out there... just google for your favorite toolkit's prefix and "dialog" and you'll probably find something - a quick sketch of the idea follows the list below.

    gdialog
    kdialog
    xdialog
    etc..
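
    For instance, a minimal sketch (not from the original comment) using the classic ncurses dialog(1) tool - kdialog, Xdialog and friends take broadly similar arguments:

    tmp=$(mktemp)
    dialog --inputbox "Hostname to ping:" 8 40 2>"$tmp"   # dialog prints the answer on stderr
    host=$(cat "$tmp"); rm -f "$tmp"
    clear
    if [ -n "$host" ] && dialog --yesno "Ping $host now?" 7 40; then
        clear
        ping -c 3 "$host"
    fi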

    • I don't know if I agree or disagree. I look inside my init.d and see complicated yet thoroughly worked out logic. I see interaction coming from the other moving parts with conditional statements. I consider these startup applications even if they are really just scripts. Gotta love /bin/bash though.
    • Re: (Score:3, Interesting)

      by CAIMLAS ( 41445 )

      Often, I find myself writing scripts dealing with tasks which are semi-automated: I need a couple variables of input to deal with variance, but for the most part it's a repetitive task.

      What you're referring to is a batch script; that's good and fine, and I need those too. But that doesn't mean that interaction delineates a "script" from an "application".

      Though, I agree on one thing: a script is a script. There's no need to throw a dialog on there unless it in some way cleans up your input/output code and/or

    • Re: (Score:3, Interesting)

      by grcumb ( 781340 )

      I know this is troll-ish, but the way I view it, a script is just that... a script. A series of commands to be executed in a specific order, designed to automate a repetitive task. Basic logic, control, and input are generally OK, but interaction is, in my opinion, an indicator that your task is out of scope for a "script" and should become a full-fledged application.

      Well, there's interaction and then there's interaction. The grey area between script and application might be larger than your instinct tells you.

    • "Interaction" (Score:5, Informative)

      by betterunixthanunix ( 980855 ) on Monday April 19, 2010 @07:27PM (#31904612)
      The way I look at it is this: the "interaction" may actually be with another script. The whole abstraction that Unix-like OSes enforce, at least with file based IO, is that it is irrelevant what is on the other side of a file descriptor -- a disk, a pipe, a user, a socket, or something else entirely.

      Of course, this all starts to break down with GUIs.
    • Blur! (Score:3, Interesting)

      by flajann ( 658201 )
      The line between what is a "script" and what is an "application" has been blurred. Ruby, Python, and PHP are all "scripting languages" and yet many, many killer applications are written in them (like Blender, for example!).

      Bash scripting definitely has everything you'd need to write an "application" in, but many data constructs would be awkward to implement in Bash, so you'd use Python, Perl, or Ruby.

      But what I can do in Bash I could also do in Ruby or Python very easily. What I could do in Ruby or Pyt

  • by DarkKnightRadick ( 268025 ) <the_spoon.geo@yahoo.com> on Monday April 19, 2010 @06:19PM (#31903946) Homepage Journal

    Limiting yourself much? Also *nix != bash and bash != *nix though I imagine all shells share a host of similar commands.

      Just as long as you're not scripting in a C shell you should be relatively OK. The scripts I see tend to be in either Bash or Bourne, but I'm sure there are people who insist on using the Korn shell and others. The main problem is that when you get too far from the standard shells, you run an increased risk that the shell has to be installed. Bash, for instance, isn't installed by default on FreeBSD, but Bourne is.

      It's not just the built-ins, it's how they're used; you don't want to be caught scripting in
  • sed, awk, grep, strings, lots of pipes, and randomly useful scripts made from them stuffed into my ~/bin folder...
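
    For example (a hedged illustration, not part of the original comment - the auth.log path is a Debian-style assumption), the sort of one-off pipeline that tends to end up in ~/bin:

    # count failed ssh logins by source address
    grep -h 'Failed password' /var/log/auth.log* | awk '{print $(NF-3)}' | sort | uniq -c | sort -rn | head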

  • None, I have given up bash scripting. The syntax and semantics are simply too weird. And it can't handle filenames with spaces in them without some serious hack magic.

    Maybe it's time someone (re)invented a totally new shell, with a sane scripting language, commands with consistent names for the same arguments, and in general something which doesn't feel like I live in 1980.

    And I am a full-time Linux user, and a software developer, and I do use the shell as an interactive interface, but I never script it, and I always

    • Re: (Score:2, Funny)

      by Anonymous Coward

      it can't handle filenames with spaces in them without some serious hack magic.

      Use quotes

      • Re: (Score:3, Funny)

        by Anonymous Coward

        That's what novice "experts" usually say to do. Then you end up getting filenames that contain quotes. So what started as just escaping spaces turns into escaping spaces and two types of quotes. Depending on the approach you use here, you may need to escape some other characters, just in order to escape quotes, just so you can escape spaces, just so you can deal with filenames containing spaces.

        Any sensible person would say "fuck it" and just use a real scripting language like Python, Ruby or Perl.

          Bash's strength lies not in its awkward syntax and idiosyncrasies; it lies in the ease with which bash passes arguments to other programs and grabs the return values. In a few lines bash can call precisely the tools needed and string together some very powerful tools. Sure, similar things exist in Python and can be accessed easily, but you can't eke out the same flexibility or succinctness as you can with bash for many tasks.

          Unsurprisingly, the place where the *nix collection of tools shines is text manipu

      • Re: (Score:3, Insightful)

        by Dadoo ( 899435 )

        Use quotes

        Ummm... yeah. Try "tar tf file.tar | xargs rm", when some of the files in the archive contain spaces (or other shell special characters).

        • by DaleGlass ( 1068434 ) on Monday April 19, 2010 @07:04PM (#31904426) Homepage

          Use:

          tar tf file.tar | xargs -d "\n" rm

          That will work unless the filenames contain newlines in them.

          • by Dadoo ( 899435 )

            Doggone it! You'd think, after 25 years of Unix, I'd know about that option. That's a good one. Thanks.

            I guess I'll have to read the man pages a little more often.

            I think that just illustrates the point that Bash has nice features but horrible syntax and magic.

          • Re: (Score:3, Informative)

            by Just Some Guy ( 3352 )

            Use:

            tar -t --null -f file.tar | xargs -0 rm

            That will work unless the filenames contain nulls in them. They won't.

        • by hkz ( 1266066 )

          tar tf file.tar | while read file; do rm "$file"; done

          This works because the entire line is read into $file, spaces and all.

        • Use quotes

          Ummm... yeah. Try "tar tf file.tar | xargs rm", when some of the files in the archive contain spaces (or other shell special characters).

          In your example, I'd use a read loop to treat each line of output as a single element. Handle each line using double quotes.

          In most cases, an array works fine for preserving special characters in file-names.

          mkdir /tmp/test
          cd /tmp/test
          touch 'foo'
          touch 'bar baz'
          declare -a FOO
          FOO=(*)

          FOO now contains an array. Each element is a single file-name that preserves shell special characters.

          FOO has 2 elements:

          $ echo ${#FOO[@]}
          2

          Those elements are:

          $ for ELEMENT in "${FOO[@]}"; do echo "$ELEMENT"; done
          foo
          bar baz

          Elements

          • by Dadoo ( 899435 )

            As far as I know, this approach to handling files has been supported for a long time

            I knew you could handle files that way, but I didn't know it preserved special characters.

            One thing I will say: bash is generally better about this than ksh, on real Unix systems. If I remember correctly, something like "for file in *" in bash will ignore special characters and run the loop once, for each actual file. Ksh, on the other hand, will pay attention to special characters.

            • Re: (Score:3, Informative)

              by Qzukk ( 229616 )

              something like "for file in *" in bash will ignore special characters and run the loop once, for each actual file.

              Bash's for loop understands * as a special case. If you need something like "for file in $(find ...);" you'll get one loop per word again. Also, even when you get one loop per file, you still have to quote $file when you use it, because bash parses arguments to the command after variable substitution, so something like touch $file when $file is foo bar becomes touch foo bar, where foo and bar ar

    • by Hatta ( 162192 ) on Monday April 19, 2010 @06:43PM (#31904210) Journal

      Does `find . -print0 | xargs -0` really qualify as "serious hack magic"?

    • What's the problem with File\ Name.txt?

        The problems occur when you start writing scripts, and start passing filenames around in pipes and whatnot. Generally speaking there are ways to avoid doing that, but it's not uncommon to realize that you've accidentally put yourself in that sort of a situation.

      • Re: (Score:3, Informative)

        by Darinbob ( 1142669 )
        Because people want to use illogical syntax like "copy this file that directory" and have the shell figure out if this is referring to 4 different file names, or 3, or 2, or 1. Well, actually they want to do slightly more sane things like "copy $file $dir" from a script, while expecting the shell to be smart enough to realize that the spaces on a command line are delimiters while at the same time the spaces in a variable's contents are not delimiters.

        Some command languages handle this OK (i.e., DCL, DOS) because t
    • Re: (Score:3, Funny)

      by ZeBam.com ( 1790466 )
      Use Perl
    • by CAIMLAS ( 41445 ) on Monday April 19, 2010 @07:16PM (#31904534)

      You mean something like perl? Or maybe python?

      My vote is for perl. It's more common in a "base install" than any other shell (in the BSDs and most Linux distros) and has a non-trivial amount of power. It's good at dealing with path and input permutations and you can interface it with pretty much anything. Hell, pcre came from perl, and that's used almost everywhere these days: it's got a lot of things right for the little that's wrong, at least in terms of being a good scripting language.

      I avoid "shell" scripting (csh, sh, bash) if at all possible, too. The contortions necessary to do the frequently-necessary evaluations takes quite a bit longer, even with a chain of awk/sed/grep and the like. Unlike those languages, perl is entirely self-contained and does not have any system-specific oddities (eg. with a shell script, many system binaries are different and an option/parameter pair on one system might do something entirely different on another - or not work at all).

      I realize perl can often (usually) be difficult to read. But for my purposes, it's good enough, because I'm a bit of a prolific comment writer as a matter of process.

    • I've felt that sh (bash or ksh) is one of the more sane scripting languages out there, for what it does. Granted, it's not a full programming language in many respects. Most of the oddities come from it being specifically a scripting language for "commands" as well as the interactive command processor, in one. I.e., Perl may be more powerful, but it would be inconvenient to use Perl as your CLI. There's a big difference between a programming language that lets you run a command as an afterthought and a command
  • I really can't praise Gtk2-Perl enough. Using Glade to quickly build your GUI, and Perl to quickly build your logic, it's a knock-out combination. The end result looks just like a Gnome application (using Gtk2), and in 95% of cases runs just as fast. I liked it so much, I wrote some database classes, Gtk2::Ex::DBI and Gtk2::Ex::Datasheet::DBI ... see: http://entropy.homelinux.org/axis/.

    • by grcumb ( 781340 )

      I really can't praise Gtk2-Perl enough. Using Glade to quickly build your GUI, and Perl to quickly build your logic, it's a knock-out combination.

      Heh, I'll see your GUI and raise you transparency. I've got a little dashboard applet that uses X11::Aosd to display a translucent status display for all my key servers. Yes, I know I just re-invented Conky, but because it's Perl I can use SSH::RPC in the back end to securely talk to my servers in order to get quick and dirty performance metrics.

      One of the things scripting does well is to chop tasks into small, manageable steps. While system monitoring is a complex and demanding process, all I really need o

  • by Fluffeh ( 1273756 ) on Monday April 19, 2010 @06:30PM (#31904066)
    I always request the budget for a small unit of scantily clad maidens from management in addition to the team beer budget.

    One day.... one day....
  • by hkz ( 1266066 ) on Monday April 19, 2010 @06:37PM (#31904130)

    I work as a Linux netadmin and system developer, so I do a lot of shell programming in (ba)sh. Here are some of the niftier things you can do in a shell script:

    - Make colored output with shell escape sequences. Works especially well with red type for error messages, makes them really stand out.
    - Use inline functions a lot. The great thing about them is that you can pipe to them and use them in all kinds of places. For instance, to do mysql queries:

    mysql_query() {
        /usr/bin/mysql --user=root --pass=topsecret database
    }

    echo 'SELECT * FROM accounts' | mysql_query

    - "Here documents". For long MySQL sequences, something like the following (reusing the mysql_query function from above):

    cat <<- EOF | mysql_query
          SELECT bar
          FROM foo
          WHERE baz ...
    EOF

    This lets you easily format stdin for scripts and other programs. Also really useful for outputting HTML and stuff like that. Best thing is that variables are expanded inside the block.

    - The || and && operators. Check if a file exists, remove if so, else complain:
    [ -f /tmp/somefile.txt ] && rm /tmp/somefile.txt || echo "Does not exist!"

    Also common in this form:
    [ -x /usr/bin/necessaryprogram ] || { echo "aaargh"; exit 1; }

    - Making a "multithreaded" shellscript is also one of my favourites. Say, you want to start five virtual machines at the same time. Write a function that starts a vm, and call it a few times in a loop, backgrounding each instance with &, and saving their PIDs. Then have a "wait loop" that waits for the PIDs to exit the system (or for a timeout to occur).

    - Proper argument handling with getopt. Have your script take "real" arguments in any order, just like real binaries.
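
    A minimal sketch of that, assuming the enhanced getopt(1) from util-linux (the option names are only illustrative):

    ARGS=$(getopt -o vf: --long verbose,file: -n "$0" -- "$@") || exit 1
    eval set -- "$ARGS"
    verbose=0; file=""
    while true; do
        case "$1" in
            -v|--verbose) verbose=1; shift ;;
            -f|--file)    file=$2; shift 2 ;;
            --)           shift; break ;;
        esac
    done
    echo "verbose=$verbose, file=$file, remaining: $*"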

    This just scrapes the surface of the surface, of course. I learn new stuff every day.

    • I do a fair bit of shell scripting, but your script-fu is better than mine. I've just printed your sample to go over again later with my copy of Ken Burtch's book in hand, to make sure I understand every nuance. I don't think I've EVER printed anything from /. before.

      • Re:Nice example. (Score:4, Informative)

        by hkz ( 1266066 ) on Monday April 19, 2010 @07:06PM (#31904450)

        Thanks, it's nothing I couldn't show a fella. Learnt a lot from my colleagues and from the O'Reilly 'Unix Power Tools' book. The Advanced Bash Shell-scripting Guide is pretty good (but chaotic) too.

        One caveat: the syntax filter here likes to munge examples like these - the here document needs two 'less-than' signs in front of the minus sign (cat <<- EOF), and the mysql_query function needs a newline after the opening brace. So best to look the concepts up in some kind of reference manual rather than copy-pasting.

        • by hduff ( 570443 )

          Thanks, it's nothing I couldn't show a fella. Learnt a lot from my colleagues and from the O'Reilly 'Unix Power Tools' book. The Advanced Bash Shell-scripting Guide is pretty good (but chaotic) too.

          Wow. Great attitude. I hope Mr. Just-Error-Out reads your posts.

  • by the_humeister ( 922869 ) on Monday April 19, 2010 @06:37PM (#31904134)

    I use the usual: sed, [, wget, etc to automate downloads of pr0n.

  • is not a GUI?

  • I dissent (Score:2, Insightful)

    by ZeBam.com ( 1790466 )
    I disagree with the premise that GUI interfaces are needed, desirable, or constitute "spicing up." Command line scripts are fine and dandy in my book.
    • by Dadoo ( 899435 )

      Command line scripts are fine and dandy in my book

      I'm inclined to agree, given that I've written several dynamic web pages in shell.
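
      A dynamic page in shell can be as small as a CGI script - a hedged sketch, not from the original post:

      #!/bin/sh
      # trivial CGI: headers, a blank line, then the page body
      echo "Content-Type: text/html"
      echo
      echo "<html><body><h1>Server load</h1><pre>"
      uptime
      echo "</pre></body></html>"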

    • Re:I dissent (Score:4, Insightful)

      by martin-boundary ( 547041 ) on Monday April 19, 2010 @07:02PM (#31904412)
      It's also plain short-sighted to program a script for a GUI.

      It implies that the script will only be run by human users (and probably, human users who happen to run a particular flavour of GUI). Traditional shell scripts are written for all users, not just human users.

      Why should developers care about non-human users? It's what makes automation possible. Every time a script delegates work to another script, that's a non-human user scenario.

      If you build enough scripts that can be used by all users, then you have a critical mass and your system becomes really powerful. If you build enough scripts that can only be used by human users, then your system stays weak, for it is limited by the actions of a single human operator.

  • Stop using the Shell (Score:5, Interesting)

    by Bruce Perens ( 3872 ) * <bruce@perens.com> on Monday April 19, 2010 @06:38PM (#31904158) Homepage Journal
    The shell is a poor clone of 1950s Algol. Today, scripting in Ruby or Python yields scripts that can handle errors with advanced facilities like exceptions, are more maintainable, and can connect to a number of different GUIs or the web.
    • by sribe ( 304414 )

      The shell is a poor clone of 1950's algol. Today, scripting in Ruby or Python...

      It was a great day when I decided no more bash scripts, ever. Ruby is so much nicer...

    • Re: (Score:2, Interesting)

      by walkoff ( 1562019 )
      There are times when bash/ash/dash... are all that is available or can be made available. Ruby and Python and the myriad other scripting languages are all very good, but on memory-, CPU- and diskspace-starved devices the shell is the way to go. Even the mini-versions of Perl and Ruby etc. are a PITA to get working on embedded devices, especially if you are using uClibc.
    • by martin-boundary ( 547041 ) on Monday April 19, 2010 @08:38PM (#31905220)
      Sorry, those advanced facilities are overhyped. How do you (eg) pipe commands together in Ruby, Python or Perl?

      AFAIK(?), none of these languages come close to the simple expressivity of cmd1 | cmd2 | cmd3 > file1.

      The shell's purpose has always been to serve the user. From the perspective of a user, advanced programming facilities like exceptions are not just useless, but can seriously get in the way.

      Programmers write pages and pages of code, and they appreciate class hierarchies, vector operations, etc. Users write throwaway scripts that are run once or twice.

      Programmers like powerful languages that make maintenance easy. Users like powerful shells that make simple interactions really easy.

      • by Bruce Perens ( 3872 ) * <bruce@perens.com> on Monday April 19, 2010 @10:54PM (#31906178) Homepage Journal

        none of these languages come close to the simple expressivity of cmd1 | cmd2 | cmd3 > file1.

        This is sort of like saying that no language has anything that lets you align text like FILLER PICTURE in old Fortran. Sure, but you don't need to do it. I don't ever have to pipe to sed, because I can do File.read("foo.txt").gsub(/^Foo.*Bar$/, 'Hello!') and get the same result.

        • by martin-boundary ( 547041 ) on Monday April 19, 2010 @11:52PM (#31906498)
          You're comparing end results, not interfaces.

          Sure you can get the same result, but the syntactic sugar in your example is much more verbose, and conceptually more complex.

          For each of the three components, there's a mental context switch (the File class on the left, the read call in the middle, and the substitution method on the right).

          The shell language does the right thing by handling components more uniformly (i.e. they all have STDIN/STDOUT regardless of the nature of the command). The user needs to know what each command will do, but he does not need to know if the result is an array object, or a stream object, or a file object etc.

          The shell also has less redundancy. Compare cat foo.txt with File.read("foo.txt"): there should be no need for both parentheses and quotes. Now, in the wider scheme of Ruby this redundancy makes perfect sense, but users don't need all this, only programmers do.

          Users need the bare minimum to communicate with the machine in a language that takes 30 seconds or less to type (or speak in a microphone...), but still lets them do as much as possible.

          It's an interface issue, it's got little to do with the range of things that can be done in the language. Ruby is much more powerful than bash, but bash is still better at starting and stopping programs (and rc is better than bash...).

    • by bzipitidoo ( 647217 ) <bzipitidoo@yahoo.com> on Monday April 19, 2010 @09:21PM (#31905548) Journal

      I use sh and relatives (and vi) because they're ubiquitous, stable, small, light, and reasonably fast, consistent, capable, and fairly understandable. Every program in /etc is a shell script, and by default system utilities such as cron call on sh. Everything entered at a command line is interpreted by sh. sh is as much a part of UNIX systems as C. You might as well suggest GNU/Linux be rewritten in a better language than C.

      And if you're going to suggest that, why not also reexamine the basic architecture of UNIX? If anyone produces an open, formally verified microkernel OS in Haskell that actually works, isn't dog slow, and has sufficient functionality and apps to be useful, I'll surely check it out. I'd love to see more consistency between how applications accept parameters from the command line and how programming languages handle parameters. The former tends to be named and unordered, while the latter is anonymous and ordered. Then there's the de facto standard for libraries, worked out in the days when memory and disk space were extremely limited. It doesn't support enough meta information, making it necessary for a compiler to read header files. It's made libraries many little worlds of their own. As long as a programmer sticks to C/C++, it is relatively easy to call C library functions, but step outside that and it becomes a huge pain. Therefore we have these monstrous collections of duplicate functionality and wrapper code such as CPAN, abominations such as SWIG, attempts to bridge things by providing some commonality and standardization such as CORBA, and separate worlds such as the gigantic collection of Java libraries.

      Something like Perl or Java is heavy enough to be impractical on a slow computer with little RAM. Can take over 5 seconds just to load the language. I'm not familiar enough with Python or Ruby to know if they're as heavy. You can't always be sure they're there, whereas whatever was used in /etc/rc.d, and is run in a terminal, is guaranteed to be present. Don't know about a "pysh", but there is a "perlsh", for use in a terminal. Never seen perlsh used though, and it seems to demand a nasty hackish sort of interaction. Press Enter twice to execute commands, as one press of Enter is apparently used as a statement or formatting break. Maybe that's because those languages actually aren't too suitable for an interactive environment? As to connecting to the web, there's wget, wput, and curl.

      It could be a lot worse. Bash is pretty nice compared to MS DOS batch language.

  • 3D goggles! (Score:3, Funny)

    by Anonymous Coward on Monday April 19, 2010 @06:39PM (#31904170)

    Combined with red and blue text the goggles make my facepalm ASCII art really pop!

    You see, I use ASCII art in lieu of the dialog boxes for user feedback. It's more intuitive to show facepalm guy when I ask the user for a digit & they give me a letter. They understand right away that they're an idiot.

  • Why (Score:5, Insightful)

    by silas_moeckel ( 234313 ) <silas@ds[ ]c-corp.com ['min' in gap]> on Monday April 19, 2010 @06:48PM (#31904260) Homepage

    The CLI is powerful because it's a CLI; you do not need or want pretty dialog boxes. Help is what's available from man, --help, useful error messages, and the contents of /var/log. It works over a 9600 baud serial line, and it works well enough that you can ssh in from your smartphone with 1 bar and fix something at 3am before the GUI would have had time to come up to a login screen. A good CLI expects things to be piped into and out of it and can get any required information via the command line. The power of the CLI is that you can chain bits together to do things, or wrap scripts around other scripts, and do useful work.

    You point to a 20-year-old book that mostly bitches about how slow/ugly X is. Guess what: things have come a long way. I run one laptop with native X, it looks good and is responsive, and I export X all the time over ssh to my primary desktops. Take a step back and think about why you're trying to shoehorn GUI functions onto a CLI. If you really need to do it, look at some of the toolkits that can detect whether an X server is present and use it, fall back to a text GUI otherwise, and still run entirely headless from the pure command line - but think long and hard about why you would want to do this.

  • tools I found useful (Score:5, Informative)

    by tpwch ( 748980 ) <slashdot@tpwch.com> on Monday April 19, 2010 @06:48PM (#31904264) Homepage

    Here are some random things I find useful, related to user interaction (mostly because they notify the user):

    Oven timer:
    sleep $((20*60)); xmessage "Dinner is done"

    Quick macro for automating some repetitive task in a program:
    xdotool type "something"; xdotool key Return; xdotool mousemove $x $y; xdotool click 1; (and so on)

    Copying a file to/from the clipboard (it can also copy from a pipe, so the output of any command). Faster than opening a text editor:
    xclip -in file

    Notifying me when some specific thing changes on a website:
    while true; do
        sleep 120
        CHECKLINE="$(curl -s http://somewebsite.org/somepage.html | grep "currently undergoing maintenance")"
        [ -z "$CHECKLINE" ] && xmessage "somewebsite is open again" && exit
    done

    Or just checking for changes in general (I use this for notifying me when something changed when tracking something I ordered, so I know the minute the package is ready to get picked up at the post office):
    while true; do
        OLD_MD5=${MD5}
        CONTENT=$(elinks -dump 1 -dump-charset iso-8859-1 "http://someurl.com/track?id=someid")
        MD5=$(echo -n "$CONTENT" | md5sum -)

        [ "${MD5}" != "${OLD_MD5}" ] && {
            xmessage "$(printf "New action: :\n\n${CONTENT}")"
        }
        sleep 120
    done

    If you don't want to interrupt what you're doing with a pop-up, you can pipe the text to osd_cat instead, to have it appear over whatever program you're currently working with. Adding a few beep; beep; beep; commands is also a good way to get your attention if you're not paying 100% attention to your computer all the time.

  • by CFD339 ( 795926 ) <andrewp&thenorth,com> on Monday April 19, 2010 @06:56PM (#31904350) Homepage Journal

    Linux Shell Scripting with Bash
    by Ken O. Burtch
    Sams Publishing

    One of only two "computer" books I've ever been able to just sit down and read rather than just using as reference (the other being Kathy Sierra's "Head First Java" -- which is amazing).

    Ken does a fantastic job of pitching "just the right" level of background, detail, context, and depth for someone new to shell scripting to get started, and the book then works as a reference for all the traditional tools (sed, awk, etc.).

    I've bought two copies, one for me and one I gave to someone else who wanted to learn how to do this stuff.

  • by Anonymous Coward

    I know this is like cursing in church, but I use VB for most tasks others would use shell scripts for. Why? For one, the syntax is more predictable. With Bash you always have to worry about special characters, and I can't stand that. (Same reason I dislike TeX.) Secondly, if you need user interaction, it has a really easy-to-use GUI builder. When VB4 came out it was like 1995 or something. It is now 2010 and, in my opinion, for building simple dialogues (or even not so simple ones) VB is still among the b

    • Re: (Score:3, Insightful)

      by siride ( 974284 )
      Please, for the love of $DEITY, learn Perl or Python or Ruby or SOMETHING. VB's syntax is not predictable or reasonable if you've programmed with any other language or know how a computer works. And the other languages are actually cross-platform and can do everything VB can do and then some.
  • Enough said...
  • One of my recent discoveries has been using Groovy to add UI effects to scripts, via the Java libraries it has access to. For example, if a script completes, it's really easy to add a notification to the system tray:

    import java.awt.*   // Toolkit, TrayIcon and SystemTray live in java.awt

    def image = Toolkit.defaultToolkit.getImage("some_image.png")
    def trayIcon = new TrayIcon(image, "Script")
    SystemTray.systemTray.add(trayIcon)
    trayIcon.displayMessage("Script Completed", "", TrayIcon.MessageType.INFO)
  • ... not user friendly? It's just picky about who it makes friends with.
  • tcl/tk (Score:3, Interesting)

    by drolli ( 522659 ) on Monday April 19, 2010 @07:19PM (#31904554) Journal

    Honestly, if it's just about adding a button so that it's not necessary to remember the command-line arguments/switches, I prefer tcl/tk. Lightweight, portable (and ported), and stable. And if you need a little more functionality, there are tons of libraries available.

  • I think you should e-mail that guy that wanted to manage his Windows desktops by running Windows images on top of Linux, using some virtualization technology, but also passing through the hardware capabilities of the video cards, so he could run multiple monitors.

    I bet he'd have some great tips about how to spice up your shell scripts. He probably edits them in emacs (running inside an AJAX terminal inside his web browser, connected to a web server on the machine he's editing the file on).

  • ksh (Score:2, Informative)

    For Unix shell scripting purposes (and I know the Slashdot crowd may scoff at this, but) nothing compares to KSH. It has many features not found in Bash and most other shells, such as coprocesses, associative arrays, compound variables, floating-point arithmetic, discipline functions, etc. It's also fully extensible and POSIX-compliant. For GUI scripts, almost all commercial Unixes include dtksh, which provides access to much of the Xt and Motif APIs. A Tk version of ksh also exists.

    KSH just gets a b
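
    Not from the original comment, but a small hedged sketch of two of those features (associative arrays and floating-point arithmetic) as they work in ksh93:

    typeset -A price             # associative array
    price[apple]=0.40
    price[pear]=0.55
    typeset -F2 total=$(( price[apple] * 3 + price[pear] * 2 ))   # real floating point
    print "total: $total"        # prints 2.30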
  • Nice little gem here though.
    http://www.linuxfocus.org/English/May2004/article335.shtml

    Be warned though - unix *likes* being ugly. Non-ANSI terminals quickly fill with garbage when ANSI escape codes are printed to them. The same problem holds for the purty X dialog shells: if you don't have the terminal support, or an X session, or the X libraries installed, your script becomes useless in a hurry.

  • > What tools do you use that spice up your scripts on the Linux or Unix platforms?

    sh -x

  • by bhepple ( 949572 ) on Monday April 19, 2010 @07:38PM (#31904686)
    As said previously, scripts are scripts and don't often need a GUI. But for grep's sake, make them consistent!!! The only spicing up _really_ needed is some standards:

    o output errors to STDERR; normal output to STDOUT
    o include (-h, --help) processing - and send it to STDOUT so the help can be piped to 'less'
    o use getopt(1) or process-getopt(1) so that options on the CLI parse in a predictable and standard way
    o keep it terse except for errors so that the user can easily see if it worked or not without scanning vast output
    o provide a --verbose option to help with tracking down those errors

    ... and the most annoying thing of all - make sure --help _always_ works, even if the script body itself can't - at least the user can then be told what the prerequisites are. (A bare-bones skeleton follows below.)
    Head over to http://mywiki.wooledge.org/BashFAQ for much wisdom on how to write better bash scripts.
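
    A bare-bones skeleton of those conventions might look like this (names and options are purely illustrative):

    #!/bin/sh
    usage() { echo "usage: ${0##*/} [--verbose] FILE..."; }

    case "$1" in
        -h|--help) usage; exit 0 ;;     # --help always works, and goes to STDOUT
    esac

    verbose=
    [ "$1" = "--verbose" ] && { verbose=1; shift; }

    if [ $# -eq 0 ]; then
        echo "${0##*/}: no input files" >&2     # errors go to STDERR
        usage >&2
        exit 2
    fi

    if [ -n "$verbose" ]; then
        echo "processing $# file(s)" >&2        # keep normal output terse
    fi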
  • I'm not really sure what the point of this item is, but I'll be more than happy to blather on about console stuff I like :P

    Midnight Commander (mc) - Mostly I write scripts to make things easier for other people, but there isn't always a good GUI to allow the user to see what scripts are available. So I'll get all my scripts together in a directory and open the directory in Midnight Commander. Then they can see a sorted list of all the available commands. Gnome panel buttons work OK for this as well, bu

  • by steveha ( 103154 ) on Monday April 19, 2010 @08:21PM (#31905078) Homepage

    I will quickly write a shell script any time I have some simple task I want to automate. You cannot beat the convenience:

    cd /some/directory/$1
    some_program --foo $2 --bar $3
    rm -f *.temp

    Wow, three lines, and it runs the program, then cleans up the temp files that program always litters in my directory. And I don't have to memorize the --foo and --bar options! Shell scripts rock!

    The problem comes when you start to do nontrivial things. When you start processing lists of files, and the files can contain spaces, the amount of quoting drives me insane. At that point I rewrite in Python.

    The spaces-in-file-names problem can bite even this trivial shell script! If any of the three arguments ($1, $2, $3) is specified as a string containing spaces, this script won't work, because the shell interpreter needs quotes at every step where it evaluates something. If you pass "my file.txt" as the second argument, the $2 won't evaluate to "my file.txt" in quotes, it just evaluates to the bare string. So to be fully safe, the above program needs to be:

    cd /some/directory/"$1"
    some_program --foo "$2" --bar "$3"
    rm -f *.temp

    And woe is you if you forget the quotes.

    Python loses in convenience for running a program... here's a Python equivalent of the above:

    import os
    import subprocess as sp
    import sys

    os.chdir("/some/directory/%s" % sys.argv[1])

    lst_args = ["some_program", "--foo", sys.argv[2], "--bar", sys.argv[3]]
    sp.check_call(lst_args)

    lst_args = ["rm", "-f", "*.temp"]
    sp.check_call(lst_args, shell=True) # run in a shell to get wildcard expansion

    At first glance this looks horrible. It's much more than the three terse lines of the original. But it's easier to get right, and this is safer to run. If the user specifies something silly for the first arg, or doesn't provide it, this program will immediately stop after trying to change directories. The original would change to "/some/directory" and blindly run on, trying to run "some_program" there, and who knows what would happen? Likewise, if "some_program" fails, this script will stop immediately, and the deleting of the *.temp files will not occur (making it easier to debug what's going on). Finally, in this code we don't have to worry about quoting the arguments; we can just use the arguments and it just works. It is much harder to write a fail-safe shell script: you would have to explicitly test that $1 is provided, and you would have to check the result of running "some_program" to see if it failed or not.

    The nontrivial scripts I write tend to have a lot of logic in the scripts themselves, and Python is much much more pleasant and effective for evaluating the logic. If I want to write a script that sweeps through a bunch of directories and deletes files that match certain criteria, it is so much easier to write the tests on the file in Python. If I write ten lines of "if" statements to look at a filename, that is ten lines where I didn't need to fuss with the double quotes. In Python, you can do things like
    junk_extension = (".temp", ".tmp", ".junk")
    if filename.endswith(junk_extension):
            os.remove(filename)

    Shell scripting cannot match this convenience. And note that if I use the native Python os.remove() I don't need to worry about quoting the filename; it can have spaces in it and os.remove() doesn't care.

    Other people might prefer to use Perl or Ruby. Either of those, or Python, are much better than shell scripts for anything nontrivial.

    steveha

    • by OA ( 65410 ) on Monday April 19, 2010 @09:49PM (#31905756) Homepage

      I agree Python is lots of fun... but I would not call the following script nontrivial.

      > In Python, you can do things like
      > junk_extension = (".temp", ".tmp", ".junk")
      > if filename.endswith(junk_extension):
      > os.remove(filename)

      Your problem is thinking that you need ten lines of "if" statements to look at a filename.

      This kind of thing is done with a single one-liner shell command. It is too simple to bother with Python.

      Please read about the "find" command, especially with -exec rm '{}' \;
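
      For instance, a hedged one-liner matching the extensions from the Python example above:

      find . -type f \( -name '*.temp' -o -name '*.tmp' -o -name '*.junk' \) -exec rm -f '{}' \;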

      Osamu

      • Re: (Score:3, Insightful)

        by steveha ( 103154 )

        I would not call the following script nontrivial.

        I don't think I said it was nontrivial; I just said that Python was more convenient. If you wanted to test a single file and see whether it ended with one of three extensions in a shell script, what would you do?

        You could do it this way, but it's painful and ugly:

        # shell variable "filename" holds the filename
        if [ "${filename#*.}" = "temp" ] || [ "${filename#*.}" = "tmp" ] || [ "${filename#*.}" = "junk" ]; then
        echo "$filename"
        fi

        Don
