Bash Shell

Command History
One of the most useful features of the Bash shell is the ability to recall past commands with a couple of keystrokes. Use this feature when you need to switch repeatedly between several long commands. There are several ways to do this in the Bash shell.

Arrow Key Method:
The simplest way to go through your command history is to use the up and down arrow keys on your keyboard. At the command prompt, simply press the up key to go to the last command you gave the Bash shell. If you press up again, it will go to the second most-recent command, and so on. If you press the down key (after having already used the up key), you scroll forward in your history to more recent commands.
When the desired command is displayed, press Enter to execute it.

Auto-complete Method:
Another way to use the Bash history is reverse incremental search: press the control (Ctrl) and r keys at the same time, then begin to type part of a command. As soon as the text you have typed uniquely identifies a command in your history, Bash displays the rest of that command for you. For example, suppose your history list contains these three commands: cd directory1/directory2, chmod 755 -R public_html, and cd directory3/directory. Press Ctrl + r and type ch; the Bash shell displays chmod 755 -R public_html. If you wanted to reissue the command cd directory3/directory, you must type enough of the command for the shell to uniquely identify it; thus you would type cd directory3. This method is most useful when you want to reissue a command from your history list that is not recent, so scrolling with the arrow keys would take some time.
When the desired command is displayed, press Enter to execute it.

Some Knowledge

^error^correction Method
One problem with a command line interface is that if you misspell, mistype, or use the wrong case when you enter a command, the command won’t work. Re-typing can be tedious if you are typing long commands. Use the ^error^correction method by putting the string of characters you wish to correct after the first “^” and what you want to change it to after the second “^”. For example, you type the command

   vi directory1/directory2/directory3/directory4/directory5/filename.txt
but what you want is
   vi Directory1/directory2/directory3/directory4/directory5/filename.txt

Instead of retyping the whole command to change “d” to “D”, simply type
   ^directory1^Directory1
Note that Bash replaces only the first instance of the string you give, so the text between the first and second “^” must identify exactly the characters you want changed. Using the example above, enter ^directory1^Directory1 rather than ^dir^Dir; the command contains five instances of the character string dir, and a short string makes it easy to change the wrong one.
When the desired command is displayed, press Enter to execute it.

Keyboard Shortcuts and Aliases
Knowing and using keyboard shortcuts when using a command line interface makes your work easier and faster. Use aliases for frequently used commands that are either long or have tags that you always add to them. Aliases are user-defined and are not available by default in Bash, but can easily be added at any time.

Bash Keyboard Shortcuts
Keys	Action, Use
Ctrl + c	Kill the current process or command.
Ctrl + w	Erase the word or string of characters (defined by a preceding and following space) immediately preceding the cursor.
Ctrl + u	Erase everything from the beginning of the command you are currently typing up to the cursor.
Tab	Complete file/directory names. For example, you have a directory named “sample_directory1” and you want to move to that directory. Type cd s and then press the Tab key, which completes the command as cd sample_directory1. Note that if you also have a directory named “sample_folder1”, you must type enough characters for the shell to know which name beginning with “s” you wish to open. In this case, you must type cd sample_d + Tab.
Ctrl + a	Move to the start of the line.
Ctrl + e	Move to the end of the line.
Alt + f	Move forward a word.
Alt + b	Move backward a word.
Ctrl + l	Clear the screen, reprinting the current line at the top.
Ctrl + k	Cut (kill) the text from the current cursor position to the end of the line.
Alt + d	Cut (kill) from the cursor to the end of the current word, or, if between words, to the end of the next word.
Alt + DEL	Cut (kill) from the cursor to the start of the current word, or, if between words, to the start of the previous word.
Ctrl + w	Cut (kill) from the cursor to the previous whitespace.
Ctrl + y	Paste (yank) the most recently cut (killed) text back into the buffer at the cursor.
Alt + y	Rotate the kill-ring, and paste (yank) the new top. You can only do this immediately after Ctrl + y or Alt + y.

Aliases are a way of customizing your Bash shell environment. An alias takes an existing command or set of commands and makes them execute with a new command word that you create. For example, you often use this command to sort your files showing you which files and directories are the largest:
   du -a | sort -k 1n,1
(This command is useful for keeping track of used disk space.) So, create an alias for this command. Using the editor of choice (see “Opening Files” below) open the .bashrc file, which is the configuration file for your Bash shell environment. Add the following line to shorten the above command to sortfiles:
   alias sortfiles='du -a | sort -k 1n,1'
Save the changes and exit the editor. The next time you start your Bash shell, your new alias will be in effect.

The format for adding an alias is:
   alias new_name='command'
where new_name is your name for the alias and command is the command you wish to replace. There must be no spaces around the equals sign, and the entire command must be surrounded by apostrophes (single quotes). Note that you cannot use spaces or the name of an existing command in new_name.

Writing UNIX Scripts

Comments and Commands

In the Bourne shell, any line beginning with a hash '#' character in the first column is taken to be a comment and is ignored. The only exception is the first line in the file, where the comment is used to indicate which shell should be used (/bin/sh is the Bourne shell). It is always good practice to use comments to indicate what a script does; both in a header section at the top, and line by line if the code is at all complicated. There aren't many comments in the examples in this document, to reduce the amount of paper used in printing it. Your scripts should be far more liberally commented.

Lines not preceded by a hash '#' character are taken to be Unix commands and are executed. Any Unix command can be used. For example, the script below displays the current directory name using the pwd command, and then lists the directory contents using the ls command.

#!/bin/sh
# purpose: print out current directory name and contents
pwd
ls

Shell Variables

Like every programming language, shells support variables. Shell variables may be assigned values, manipulated, and used. Some variables are automatically assigned for use by the shell.

The script below shows how to assign and use a variable. In general, shell variables are all treated as strings (i.e. bits of text). Shells are extremely fussy about getting the syntax exactly right; in the assignment there must be no space between the variable name and the equals sign, or the equals sign and the value. To use the variable, it is prefixed by a dollar '$' character.

#!/bin/sh
# name is a variable
name=fred
echo "The name is $name"

The special variables $1-$9 correspond to the arguments passed to the script when it is invoked. For example, if we rewrite the script above as shown below, calling the script name, and then invoke the command name Dave Smith, the message "Your name is Dave Smith" will be printed out:

#!/bin/sh
echo "Your name is $1 $2"

Built-In Variables

The shell automatically sets a number of variables for every script. The most commonly used are: $0 (the name of the script), $1-$9 (the arguments passed to the script), $# (the number of arguments), $? (the exit status of the last command), and $$ (the process ID of the shell).

Pattern Matching

Wildcards can be used to match multiple filenames; for example, command prefix.* will execute command on all files whose names begin with prefix. Wildcards can be placed anywhere within a filename match string, as a prefix or a suffix. The * character matches any number of characters and the ? character matches a single character. Note that matching in Unix is case sensitive. Exact characters can be matched, and ranges of characters can be matched as in [0-9]. A match can also be negated, as in command [!<value>]*.
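As a sketch of these matching rules, the following creates a few files in a scratch directory (the filenames are illustrative) and counts what each wildcard matches:

```shell
#!/bin/sh
# Create a scratch directory with some illustrative filenames
dir=`mktemp -d`
touch "$dir/report.txt" "$dir/report.csv" "$dir/data1.txt" "$dir/data2.txt"

# * matches any run of characters: both report.* files
reports=`ls "$dir"/report.* | wc -l | tr -d ' '`
echo "report.* matched $reports files"

# ? matches exactly one character: data1.txt and data2.txt
datafiles=`ls "$dir"/data?.txt | wc -l | tr -d ' '`
echo "data?.txt matched $datafiles files"

rm -r "$dir"
```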

Substitution Based on Variable State

Command Substitution

Command substitution substitutes the output (STDOUT) of a command into the command line of another command, typically to place the result in a variable or to pass it on to another process. It is performed by placing the command between back-quotes, i.e. `command`. For instance, DATE=`date` will set the current system date into the variable DATE.
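A couple of sketched examples of command substitution (the particular commands chosen here are arbitrary):

```shell
#!/bin/sh
# Capture the output of a command into a variable
DATE=`date`
echo "The current date is: $DATE"

# Use the result of one command inside another expression
count=`ls /tmp | wc -l | tr -d ' '`
echo "/tmp contains $count entries"

# The $(command) form is equivalent to back-quotes and nests more cleanly
host=$(uname -n)
echo "Running on $host"
```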

Arithmetic Substitution

Arithmetic substitution allows the evaluation of an arithmetic expression from the command line prompt. Note that arithmetic substitution only works on K-Shell (/bin/ksh) and Bash (/bin/bash) using the syntax $((expression)). Allowed operators are /, *, -, + and () for precedence.
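A short sketch of arithmetic substitution (the values are arbitrary):

```shell
#!/bin/sh
# Evaluate integer arithmetic with $((expression))
a=$((5 + 3))
b=$((a * 2))
c=$(( (a + b) / 4 ))    # parentheses control precedence
echo "a=$a b=$b c=$c"   # a=8 b=16 c=6
```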

Conditionals in Shell Scripts (if and case)
Conditionals are used where an action is appropriate only under certain circumstances. The most frequently used conditional operator is the if-statement. For example, the script below displays the contents of a file on the screen using cat, but lists the contents of a directory using ls.

#!/bin/sh
# show script
if [ -f "$1" ]
then
    cat "$1"
elif [ -d "$1" ]
then
    ls "$1"
fi


There are a number of conditions supported by shell scripts; for a complete list, use the on-line manual on the test command (man test). Some examples are: -d (is it a directory?), -f (is it a file?), = (are two strings the same?), -r (is the file readable?), -eq (are two numbers equal?), -gt (is the first number greater than the second?). You can also test whether a variable is set to anything, simply by enclosing it in quotes in the condition part of the if-statement. The script below gives an example:

#!/bin/sh
# Script to check that the user enters one argument, "fred"
if [ "$1" ]
then
    echo "Found an argument to this script"
    if [ "$1" = "fred" ]
    then
        echo "The argument was fred!"
    else
        echo "The argument was not fred!"
    fi
else
    echo "This script needs one argument"
fi

The if condition is suitable if a single possibility, or at most a small number of possibilities, are to be tested. However, it is often the case that we need to check the value of a variable against a number of possibilities. The case statement is used to handle this situation. The script below reacts differently, depending on which name is given to it as an argument.

case "$1" in
    fred)
        echo "Hi fred. Nice to see you"
        ;;
    joe)
        echo "Oh! It's you, is it, joe?"
        ;;
    harry)
        echo "Clear off!"
        ;;
    *)
        echo "Who are you?"
        ;;
esac

The case-statement compares the string given to it (in this case "$1", the first argument passed to the script) with the various strings, each of which is followed by a closing bracket. Once a match is found, the statements up to the double semi-colon (;;) are executed, and the case-statement ends. The asterisk * character matches anything, so having this as the last case provides a default case handler (that is, what to do if none of the other cases are matched). The keywords are case, in and esac (end of case).


#!/bin/sh
# join command - joins two files together to create a third
# Three parameters must be passed: two to join, the third to create
# If $3 doesn't exist, then the user can't have given all three
if [ "$3" ]
then
    # this cat command will write out $1 and $2; the > operator redirects
    # the output into the file $3 (otherwise it would appear on the screen)
    cat $1 $2 > $3
else
    echo "Need three parameters: two input and one output. Sorry."
fi

#!/bin/sh
# An alternative version of the join command
# This time we check that $# is exactly three. $# is a special
# variable which indicates how many parameters were given to
# the script by the user.
if [ $# -eq 3 ]
then
    cat $1 $2 > $3
else
    echo "Need exactly three parameters, sorry."
fi

#!/bin/sh
# checks whether a named file exists in a special directory (stored in
# the dir variable). If it does, prints out the top of the file using
# the head command.
# N.B. establish your own dir directory if you copy this!
dir=$HOME
if [ -f $dir/$1 ]
then
    head $dir/$1
fi


A common requirement of many programs is to compare values. Strings and numbers may be compared, and files are often checked for their length and/or existence.

All such verifications are done using variants of the test command.

The general usage of test is

test expression

If expression is true, a return code of 0 is supplied.

If expression is false, a non-zero return code is generated.
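The return code can be examined through the special variable $?, as this small sketch shows:

```shell
#!/bin/sh
test 5 -gt 3
status1=$?    # 0: the expression is true

test 2 -gt 3
status2=$?    # non-zero: the expression is false

echo "true test returned $status1, false test returned $status2"
```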

Testing/Comparing Numbers

The primitives available for comparison of numeric values are -eq (equal), -ne (not equal), -gt (greater than), -ge (greater than or equal), -lt (less than) and -le (less than or equal).

For example:

if test $# -le 5
then
    echo "Less than or equal to five parameters."
else
    echo "More than 5 parameters."
fi
exit 0

Verifying File Types

To test file types, a number of primitives are used

-s    checks that the file exists and is not empty.
-f    checks that the file is an ordinary file (not a directory).
-d    checks whether the file is really a directory.
-x    checks that the file is executable.
-w    checks that the file is writeable.
-r    checks that the file is readable.

An example would be where a program needs to output something to a file, but first checks whether the file already exists:
    if test ! -s arg.file
    then
        echo "arg.file is empty or does not exist."
        ls -l > arg.file
    else
        echo "File arg.file already exists."
    fi
    exit 0

Note the exclamation mark within the test sequence. The exclamation mark means "not".
Comparing Strings

String comparisons are done using = and !=:

#!/bin/sh
if test $# -eq 0
then
    echo "Must provide parameters."
    exit 1
fi

while test ! "$1" = "end"
do
    echo "parameter is $1"
    shift
    if test $# -eq 0
    then
        echo "Parameter list MUST contain the \"end\" string."
        exit 1
    fi
done
echo "Done: I've hit the \"end\" string."
exit 0

Note that the above example could have been MUCH shorter if no error checking took place.

The length of strings can also be tested using:
-z : check if the string has zero length.
-n : check if the string has a non-zero length.
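A minimal sketch of the string-length tests (the variable names are illustrative):

```shell
#!/bin/sh
empty=""
full="hello"

if test -z "$empty"
then
    echo "empty has zero length"
fi

if test -n "$full"
then
    echo "full has non-zero length"
fi
```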

Combining test: Expressions
Two or more test expressions may be combined, using the -o (or) and/or the -a (and) operators:

if test $# -eq 0
then
    echo "Must provide parameters."
    exit 1
fi

if test $# -gt 2 -a $# -lt 5
then
    echo "There are 3 or 4 parameters."
fi

if test $# -ge 1 -a $# -lt 3
then
    echo "There are 1 or 2 parameters."
fi
exit 0

Note that -a has precedence over -o.

Quoting in Scripts

Confusingly, in Shell scripts no less than three different types of quotes are used, all of which have special meanings. We have already met two of these, and will now consider all three in detail.

Two types of quotes are basically designed to allow you to construct messages and strings. The simplest type of quotes are single quotes; anything between the two quote marks is treated as a simple string. The shell will not attempt to execute or otherwise interpret any words within the string.

The script below simply prints out the message: "Your name is fred."

echo 'Your name is fred'

What happens if, rather than always using the name "fred," we want to make the name controlled by a variable? We might then try writing a script like this:

echo 'Your name is $name'

However, this will not do what we want! It will actually output the message "Your name is $name", because anything between the quote marks is treated as literal text - and that includes $name.

For this reason, shells also understand double quotes. The text between double quote marks is also treated as literal text, except that any variables in it are interpreted. If we change the above script to use double quotes, then it will do what we want:

echo "Your name is $name"

The above script writes out the message: "Your name is fred." Double quotes are so useful that we normally use them rather than single quotes, which are only really needed on the rare occasions when you actually want to print out a message with variable names in it.

The third type of quotes are called back-quotes. Back-quotes cause the Shell to treat whatever is between the quotes as a command, which is executed, then to substitute the output of the command in its place. This is the main way to get the results of commands into your script for further manipulation. Use of back-quotes is best described by an example:

today=date
echo "Today is $today"

The date command prints out today's date. The above script attempts to use it to print out today's date. However, it does not work! The message printed out is "Today is date". The reason for this is that the assignment today=date simply puts the string "date" into the variable today. What we actually want to do is to execute the date command, and place the output of that command into the today variable. We do this using back-quotes:

today=`date`
echo "Today is $today"

Back-quotes have innumerable uses. Here is another example. This uses the grep command to check whether a file includes the word "and."

# Check for the word "and" in a file
result=`grep and $1`
if [ "$result" ]
then
    echo "The file $1 includes the word and"
fi

The grep command will output any lines in the file which do include the word "and." We assign the results of the grep command to the variable result, by using the back-quotes; so if the file does include any lines with the word "and" in them, result will end up with some text in it, but if the file doesn't include any lines with the word "and" in them, result will end up empty. The if-statement then checks whether result has actually got any text in it.

The test Command

The test command can be used to match filenames and to perform string and numerical comparisons. For instance, if [ -z "$TEST" ]; ... will be true if the variable TEST is null. File test options are as listed in "Verifying File Types" above.

String comparisons are -z for zero length, -n for non-zero length and string1 [!]= string2 for string [in]equality. Numerical comparisons are denoted as [ integer1 operator integer2 ]. Numerical comparison operators are -eq, -ne, -lt, -le, -gt and -ge. Compound expressions can be handled using the conditional operators, and (&&) and or (||). Another option is use of special compound expression built-in operators. These are ! expression, expression1 -a expression2 (&&) and expression1 -o expression2 (||). Thus [ expression1 ] && [ expression2 ] becomes [ expression1 -a expression2 ].
if [ `whoami` != 'oracle' ]; then
    echo "Aborted - user `whoami` is incorrect, must be user oracle"
    exit 1
elif [ -z "$1" ] || [ -z "$2" ] || [ -z "$3" ] || [ -z "$4" ] || [ -z "$5" ] || [ -z "$6" ]; then
    echo "$USAGE"
    exit 1
elif [ -z "$PATH" ] || [ -z "$ORACLE_BASE" ] || [ -z "$ORACLE_HOME" ] || [ -z "$TNS_ADMIN" ] || [ -z "$ORACLE_SID" ] || [ -z "$ORACLE_DBF" ] || [ -z "$ORACLE_SBIN" ] || [ -z "$ORACLE_UTILS" ] || [ -z "$ORACLE_BACKUP" ] || [ -z "$ORACLE_RESTORE" ]; then
    echo "Variable not defined"
    exit 1
fi

Looping Commands

Whereas conditional statements allow programs to make choices about what to do, looping commands support repetition. Many scripts are written precisely because some repetitious processing of many files is required, so looping commands are extremely important.

The simplest looping command is the while command. An example is given below:

#!/bin/sh
# Start at month 1
month=1
while [ $month -le 12 ]
do
    # Print out the month number
    echo "Month no. $month"
    # Add one to the month number
    month=`expr $month + 1`
done
echo "Finished"

The above script repeats the while-loop twelve times; with the month number stepping through from 1 to 12. The body of the loop is enclosed between the do and done commands. Every time the while command is executed, it checks whether the condition in the square brackets is true. If it is, then the body of the while-loop is executed, and the computer "loops back" to the while statement again. If it isn't, then the body of the loop is skipped.

If a while-loop is ever to end, something must occur to make the condition become untrue. The above example is a typical example of how a loop can end. Here, the month variable is initially set to one. Each time through the loop it is incremented (i.e. has one added to it); once it reaches 12, the condition fails and the loop ends. This is the standard technique for repeating something a set number of times.

Occasionally, it can actually be useful to loop unconditionally, but to break out of the loop when something happens. You can do this using a while command with a piece of text as the condition (since the piece of text is always there), and a break command to break out of the loop. The computer will go round and round the loop continuously, until such time as it gets to the break statement; it will then go to the end of the loop. The break statement is issued from within an if-statement, so that it only happens when you want the loop to end. The example below loops continuously until the user guesses the right word. If you get inadvertently stuck in such a loop, you can always press Ctrl-C to break out.

This example also demonstrates how a shell script can get input from the user using the read command. The script loops continuously around the while-loop, asking the user for the password and placing their answer in the answer variable. If the answer variable is the same as the password variable, then the break command breaks out of the loop.

#!/bin/sh
password=secret    # example password; choose your own
# Loop around forever (until the break statement is used)
while [ "forever" ]
do
    # Ask the user for the password
    echo "Guess the password to quit the program> \c"
    # Read in what they type, and put it in $answer
    read answer

    # If the answer is the password, break out of the while loop
    if [ "$answer" = "$password" ]
    then
        break
    fi
done
# If they get to here, they must've guessed the password,
# because otherwise it would just keep looping
echo "Good guess!"

Another form of looping command, which is useful in other circumstances, is the for command. The for command sets a variable to each of the values in a list, and executes the body of the command once for each value. A simple example is given below:

for name in fred joe harry
do
    echo "Hello $name"
done

The script above prints out the messages "Hello fred," "Hello joe," and "Hello harry." The command consists of the keyword for, followed by the name of a variable (in this case, $name, but you don't use the dollar in the for-statement itself), followed by the keyword in, followed by a list of values. The variable is set to each value in turn, and the code between the do and done keywords is executed once for each value.

The for-loop is most successful when combined with the ability to use wildcards to match file names in the current directory. The for-loop below uses the * wildcard to match all files and sub-directories in the current directory. Thus, the loop below is executed once for each file or directory, with $file set to each one's name. This script checks whether each one is a directory, using the -d option, and only writes out the name if it is. The effect is to list all the sub-directories, but not the files, in the current directory.

for file in *
do
    if [ -d "$file" ]
    then
        echo "$file"
    fi
done

Options and Parameters Passed into Scripts

Options are passed into a script with a preceding - (minus) sign. Parameters are passed in as space-separated strings; strings containing spaces must be enclosed in double quotes. Options can be handled using a case statement or the getopts command. In addition to passing options and parameters into scripts, there are a number of specialised variables with special functions. In general, parameters are supplied as variable substitutions to a script, and options change the behaviour of a script.
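A minimal sketch of option handling with getopts; the option letters, variable names and the simulated argument list are all illustrative:

```shell
#!/bin/sh
# Simulate the command-line arguments for this sketch
set -- -v -o result.txt

verbose=0
outfile=""

# Parse -v (a flag) and -o <file> (an option taking an argument)
while getopts "vo:" opt
do
    case $opt in
        v) verbose=1 ;;
        o) outfile=$OPTARG ;;
        \?) echo "usage: $0 [-v] [-o file]"; exit 1 ;;
    esac
done
shift `expr $OPTIND - 1`    # discard the processed options

echo "verbose=$verbose outfile=$outfile"
```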

Input and Output

Two methods of printing to the screen (STDOUT) are the echo command and the printf format arguments command. Strings can use quoting as already explained. Special characters such as \n (newline), \t (tab) and \c (no newline) can be included by escaping. Formatting for the printf command is exactly as in C. The example below shows simple use of the echo command.

#!/bin/sh
for j in *
do
    if [ -d $j ]; then
        echo "Directory $j"
    elif [ -h $j ]; then
        echo "Link $j"
    elif [ -f $j ]; then
        echo "File $j"
    fi
done

Output can be redirected from STDOUT to a file, or read into STDIN from a file. A single > will create or overwrite the file (command > file), a single < reads the file on STDIN (command < file), and two (>>) will append. Output can also be redirected from STDOUT into the STDIN of another command using a pipe (|). For instance, df -k | grep swap will show available swap space capacity.
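The redirection operators can be sketched as follows (the file contents are illustrative):

```shell
#!/bin/sh
file=`mktemp`

echo "first line" > "$file"      # > creates or overwrites the file
echo "second line" >> "$file"    # >> appends to it

lines=`wc -l < "$file" | tr -d ' '`    # < supplies the file on STDIN
echo "the file has $lines lines"

# A pipe sends one command's STDOUT to the next command's STDIN
matches=`grep -c line "$file"`
echo "$matches lines contain the word line"

rm "$file"
```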

User input can be handled using the read command as shown in the example below where a file is read line by line from redirection into the while loop.

while read STRING
do
    ...
done < file

Whenever a command is executed, three file handles are opened for that command execution. These file handles are STDIN, STDOUT and STDERR, corresponding to file descriptors 0, 1 and 2 respectively. These file handles can be accessed by use of their file descriptors. All these file descriptors can be redirected to other files, or from other files in the case of STDIN.

Redirecting to /dev/null discards output; for example, command > /dev/null 2>&1 discards both STDOUT and STDERR.
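A sketch of redirecting the individual file descriptors:

```shell
#!/bin/sh
errfile=`mktemp`

# Redirect STDERR (descriptor 2) to a file; STDOUT is untouched
ls /nosuchdirectory 2> "$errfile"
errsize=`wc -c < "$errfile" | tr -d ' '`
echo "the error message is $errsize bytes long"

# Discard both STDOUT and STDERR; only the exit status remains
ls /nosuchdirectory > /dev/null 2>&1
status=$?
echo "exit status was $status and nothing was printed"

rm "$errfile"
```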

Using Functions

Functions cannot be used in the C-Shell. A function has the format name () { command; ... }. Shell functions can be used to replace binaries or shell built-ins of the same name.

cd () { chdir ${1:-$HOME}; PS1="`pwd`$ "; export PS1; }
list () { ls -la; }

The example below checks that every directory named in the PATH variable exists. Note how the loop variable is unset after completion of the loop, and note that it is named in lowercase; an underscore as the first character is also sometimes used.

# check every directory in $PATH (the function name is illustrative)
checkpath () {
    for dir in `echo $PATH | tr ':' ' '`
    do
        if [ -d "$dir" ]
        then
            echo "$dir ok"
        fi
    done
    unset dir
}

Functions can be placed into libraries. These libraries can be included into script files by sourcing them with the dot (.) command. Note that function library files should contain only function definitions.

#!/bin/sh
#
# This is the function library
#
error () { echo "Error : " $@ >&2; }
warning () { echo "Warning : " $@ >&2; }
email () {
    # $1 = subject, $2 = recipients, $3 = message file
    if [ -z "$3" ]; then
        mailx -s "$1" "$2" < /dev/null
    else
        mailx -s "$1" "$2" < "$3"
    fi
}

#!/bin/sh
#
# This is the script calling functions within the function library
#
. ./funclib.sh    # Include the function library (filename illustrative)
...

Filtering Text

Text filtering can be executed with general Unix utilities, regular expressions, awk and sed.

General Text Filtering Utilities

The utilities head, tail, grep, sort, uniq and tr are all basic text filtering utilities.

awk and sed

sed is a stream editor; awk is a pattern matcher and simple programming language. These are common uses of these two utilities. Both sed and awk are executed as command 'script' files, or with input piped in from STDIN. Both sed and awk can be used to match regular expressions or patterns against the contents of input. Perl pattern matching tends to function in a similar fashion to that of sed and awk. General meta-characters used for pattern matching include ^ (start of line), $ (end of line), . (any single character), * (zero or more of the preceding), [...] (a character class or range) and \ (escape the next character).

The sed Stream Editor

Patterns can be applied to files using sed, where an action is performed on content matching the pattern, in the form s/pattern/change/g; the g causes a global change on each input line. A p or d in place of the g would print or delete matching input lines respectively, without changing the original input. sed can also perform multiple edits in one pass, as in sed -e 'command' -e 'command' ... -e 'command' files. sed can likewise parse STDIN and display partial strings of the input, much as grep and awk can. Personally, I prefer grep and awk, or even Perl.
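A few sketched sed invocations illustrating s///g, d and -e (the input strings are made up):

```shell
#!/bin/sh
# s/pattern/change/g substitutes globally on each line
first=`echo "the cat sat on the cat mat" | sed 's/cat/dog/g'`
echo "$first"

# /pattern/d deletes matching lines; here, strip comment lines
result=`printf '# a comment\nreal data\n' | sed '/^#/d'`
echo "$result"

# -e chains several edits in one invocation
out=`echo "aaa bbb" | sed -e 's/aaa/111/' -e 's/bbb/222/'`
echo "$out"
```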

Pattern Matching with awk

Pattern matching in awk works the same way as in sed. awk simply has more functionality, as a simple programming language; sed is an editor. awk is used for parsing the lines in a text file and taking actions on those lines. awk has a very C-like syntax, allowing if, while and for statements for flow control. awk also allows variable declarations (variable=value) plus passing of shell variables into awk scripts (awk 'script' var=val var=val ... files). There is not really much point in going through the syntax of awk in this document, since awk syntax is very simple. Typically, awk in its simplest form is used to parse files or STDIN and pull specific columns from the output, as shown below.

# df -k | awk '{print $1 " " $5}'
Filesystem capacity
/proc 0%
/dev/dsk/c0t0d0s0 84%
fd 0%
swap 1%

Other Useful Unix Utilities

The eval command can be used to process a command line twice. For instance, with the variable REDIRECT set to > file.out the command echo cat $REDIRECT would not be executed but simply would send the text cat > file.out to STDOUT, ie. the screen. In order to execute the command use the eval command as in eval echo cat $REDIRECT.
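The eval behaviour described above can be sketched like this (a temporary file stands in for file.out):

```shell
#!/bin/sh
out=`mktemp`
REDIRECT="> $out"

# Without eval, the redirection is only text: echo just prints it
echo cat $REDIRECT

# With eval, the command line is scanned twice, so the > takes effect
eval echo hello $REDIRECT
content=`cat "$out"`
echo "the file contains: $content"

rm "$out"
```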

The : command simply does nothing.

The type command gives the full path name of a Unix command, ie. type command1 command2 ... commandn.

The sleep n command pauses processing for n seconds.

The find command can be used to list files recursively through directories where those filenames match specified criteria. For instance, find all core-dump files on a machine using find / -name "core" -print. The format of the find command is find start-directory options actions. Errors produced by non-accessible directories (due to restrictive permissions) can be discarded by redirecting STDERR to /dev/null. The -type f|d|b|c|l|p option allows specification of file types to find, i.e. f, d, b, c, l or p (file, directory, block device, character device, link or named pipe). For instance, find / -type d -print finds directories only. The -size [+|-]n option finds only files of more than (+n), fewer than (-n) or exactly (n) a specified number of blocks. For instance, find / -size +1000 -print finds all files greater than 1000 blocks in size. The find / [-mtime | -atime | -ctime] [+|-]n -print form finds files by when they were last modified (-mtime), last accessed (-atime) or last changed (-ctime); n determines more than, equal to or fewer than that number of days from the current date. The -exec option allows execution of a Unix command on any file found by the find command. For example, find / -name "core" -exec rm -f {} \; will delete all core files recursively, starting from the root directory. Be very careful using the -exec option with the find command, especially when executing something like an rm -f command. The results can be very upsetting.

The xargs command is used to provide a list of words from STDIN as arguments to another command, ie. ps -ef | grep ora_ | grep -v grep | xargs kill -9.
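A safer sketch of xargs than kill -9 (the filenames are illustrative):

```shell
#!/bin/sh
# xargs turns words from STDIN into arguments of another command
dir=`mktemp -d`
touch "$dir/a.tmp" "$dir/b.tmp" "$dir/c.tmp"

# Remove the files by feeding the list from ls into rm via xargs
ls "$dir"/*.tmp | xargs rm

left=`ls "$dir" | wc -l | tr -d ' '`
echo "$left files remain"
rm -r "$dir"
```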

The expr command allows simple integer arithmetic, e.g. expr 5 \* 12 echoes a result of 60. Available operators are +, -, \* (escaped), / and % (modulus). The arguments must be separated by spaces. expr can be used in shell scripts to increment variables, e.g. VAR=`expr $VAR + 1`.
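A short sketch of expr (note the mandatory spaces and the escaped asterisk):

```shell
#!/bin/sh
# Operators must be separated by spaces, and * must be escaped
result=`expr 5 \* 12`
echo "5 * 12 = $result"

# Incrementing a variable, as commonly done in loops
VAR=1
VAR=`expr $VAR + 1`
echo "VAR is now $VAR"
```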

The bc command will perform floating-point arithmetic and is not limited to integers as the expr command is.

The rsh (remote shell) command allows execution of a command on a remote machine, i.e. you can run a command on another machine from the machine you are currently working on. ssh (secure shell) is a similar command, but more secure because traffic is encrypted between the source and target machines.

Command Exit Status
Checking whether a command succeeds or not

Every command in UNIX returns an exit status, in the range 0-255
Only 0 means success; other statuses indicate various types of failure
The status does not print on the screen, but is available through the variable $?
The example below shows how to examine the exit status of a command
# Experiment with command exit status
echo "The next command should fail and return a status greater than zero"
ls /nosuchdirectory
echo "Status is $? from command: ls /nosuchdirectory"
echo "The next command should succeed and return a status equal to zero"
ls /tmp
echo "Status is $? from command: ls /tmp"

The example below uses an if block and the exit status to force an exit on failure
# Use an if block to determine if a command succeeded
echo "This mkdir command fails unless you are root:"
mkdir /no_way
if [ "$?" -ne 0 ]
then
    # Complain and quit
    echo "Could not create directory /no_way...quitting"
    exit 1  # Set script's exit status to 1
fi
echo "Created directory /no_way"

The exit status is available as $status in the C shell

Bourne Shell Summary


Variables are assigned using the equals sign, no spaces. They are used by preceding them with a dollar character.


Arguments are labelled $1, $2, ..., $9. $# indicates how many arguments there are. shift moves all the arguments down, so that $2 becomes $1, $3 becomes $2, etc. This allows more than nine arguments to be accessed, if necessary.


Single quotes ( 'string' ) form string literals. No interpretation is performed on the string.

Double quotes ( "string" ) form string literals with limited substitution: variables are replaced with their value, and back-quoted commands are replaced with the results of their execution. A backslash '\' at the end of a line allows strings to be stretched over a number of lines.

Back quotes ( `string` ) execute the string in a sub-shell, and substitute in the results of the execution.


if [ <condition> ]; then ... elif ...; then ... else ... fi. The elif and else parts are optional.

case <string> in <case1>) ... ;; <case2>) ... ;; esac. The case *) acts as a default for any value not matched by one of the earlier cases.


for <variable> in <list>; do ... done

while [ <condition> ]; do ... done

You can break out of a loop using the break command; you can jump to the beginning of the loop and begin its next execution using the continue command.


The expr command will do calculations. It usually needs to be enclosed in back-quotes, as the result of the calculation will be assigned to some variable. The arguments to the expr command must be separated by spaces. The value of `expr 3 + 1` is "4", whereas the value of `expr 3+1` (no spaces) is the string "3+1".


User-input can be solicited using the read command, which places what the user types into variables. If users are to type several words, read can be given a number of arguments.
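A sketch of read with several variables; the input line is supplied from a here-document rather than typed by a user:

```shell
#!/bin/sh
# read splits the input line across the named variables;
# the last variable receives any leftover words
read first second rest << EOF
alpha beta gamma delta
EOF
echo "first=$first second=$second rest=$rest"
```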
