ks1322

Reputation: 35716

How to interrupt bash pipeline on error?

In the following example, the echo statement gets executed regardless of the exit code of the previous command in the pipeline:

asemenov@cpp-01-ubuntu:~$ 
asemenov@cpp-01-ubuntu:~$ false|echo 123
123
asemenov@cpp-01-ubuntu:~$ true|echo 123
123
asemenov@cpp-01-ubuntu:~$ 

I want the echo command to execute only when the previous command exits with a zero exit code, that is, I want to achieve this behavior:

asemenov@cpp-01-ubuntu:~$ false|echo 123
asemenov@cpp-01-ubuntu:~$ 

Is it possible in bash?

Here is a more practical example:

asemenov@cpp-01-ubuntu:~$ find SomeNotExistingDir|xargs ls -1
find: `SomeNotExistingDir': No such file or directory
..
..
files list from my current directory
..
..
asemenov@cpp-01-ubuntu:~$ 

There is no reason to execute xargs ls -1 if find failed.

Upvotes: 5

Views: 2903

Answers (5)

William Pursell

Reputation: 212248

For the particular example you give, it's sufficient to simply check if there is any data on the pipe. The problem you experience is that xargs is getting no input, so it invokes ls with no arguments, and ls by default prints the contents of the current directory. anishsane's solution is almost sufficient, but it is not quite the same, since it will invoke ls for each line of output, which is not at all what xargs does. However, you can do:

find /bad/path | xargs sh -c 'test $# = 0 || exec ls -1 "$@"' sh   # the trailing "sh" fills $0 so the first file name isn't lost

Now, this pipeline will always succeed, and perhaps that is not desirable (though this is the same behavior you get with just find /bad/path | xargs ls -1). To ensure that the pipeline fails, you can do:

find /bad/path | xargs sh -c 'test $# = 0 && exit 1; exec ls -1 "$@"' sh
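
To check that the failure actually propagates, you can inspect the pipeline's exit status afterwards. With GNU xargs, a failed invocation of the command is reported as exit status 123 (the exact value may differ with other xargs implementations):

find /bad/path | xargs sh -c 'test $# = 0 && exit 1; exec ls -1 "$@"' sh
echo $?   # non-zero (123 with GNU xargs), so the failure can be acted on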

There are some concerns, however. xargs will quite happily invoke its command with many arguments (that is the point of it!), but some shells handle a much smaller number of arguments than xargs, so it is quite possible that the shell will truncate the arguments. That is probably an academic concern, though.
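
If that concern ever matters in practice, one possible mitigation (my own suggestion, not part of the approach above) is to cap the number of arguments per invocation with xargs -n:

find /bad/path | xargs -n 1000 sh -c 'test $# = 0 && exit 1; exec ls -1 "$@"' sh   # 1000 is an arbitrary batch size

Each batch then results in a separate sh (and hence ls) invocation.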

Upvotes: 0

Pitt

Reputation: 974

In terms of command flow, the easiest way to do what you want would be to use the logical OR operator, like this:

[pierrep@DEVELOPMENT8 ~]: false || echo 123
123
[pierrep@DEVELOPMENT8 ~]: true || echo 123
[pierrep@DEVELOPMENT8 ~]:

This works because the || operator is evaluated lazily, meaning that the right-hand command is only run when the left-hand command fails, i.e. returns a non-zero exit status.
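
As a side note, the complementary && operator gives the behavior the question literally asks for (run the second command only when the first one succeeds); a minimal illustration:

false && echo 123    # prints nothing: && short-circuits when the left command fails
true && echo 123     # prints 123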

Note: commands return exit status 0 when they succeed and something other than 0 when they fail. In your example with find:

[pierrep@DEVELOPMENT8 ~]: find somedir || echo 123
find: `somedir': No such file or directory
123
[pierrep@DEVELOPMENT8 ~]: find .profile || echo 123
.profile

Using || won't redirect any output from the command on the left of the ||. If you want to run some command only when another one succeeds, you should do a basic exit code check and temporarily store the output of the first command in a variable, in order to feed it to the next command, like so:

result=( $(find SomeNotExistingDir) )   # no leading $ when assigning
exit_code=$?
if [ "$exit_code" -eq 0 ]; then
    for path in "${result[@]}"; do
        # do some stuff with the find results here...
        echo "$path"
    done
fi

What this does: when find is run, its results are stored in the result array. $? holds the exit code of the last command run, which here is the find command. If find found SomeNotExistingDir, the loop goes through its results (since it might have found multiple matches) and does stuff with those paths; otherwise nothing happens. The "otherwise" branch is taken when an error occurred during the execution of find or when the file/dir could not be found.
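
A more compact variant of the same idea, if you only need the raw output rather than an array (a sketch; it assumes file names without embedded newlines):

if output=$(find SomeNotExistingDir); then
    echo "$output" | xargs ls -1    # runs only when find exited with status 0
fi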

Upvotes: 1

Jonathan Leffler

Reputation: 753970

The components of a pipeline are always run unconditionally and logically in parallel; you cannot make the second (or later) processes in the pipeline run only if the first (or earlier) process completes successfully.
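
You can see this, and still react to the failure after the fact, with bash's PIPESTATUS array, which records the exit status of every component of the last pipeline:

false | echo 123
echo "${PIPESTATUS[@]}"    # prints "1 0": false failed, echo ran anyway and succeeded

This doesn't stop echo from running, but it does let a script detect that an earlier stage failed.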

In the specific case you show with find, you have at least two options:

find SomeNotExistingDir ... -exec ls -1 {} +

Or you can use a very useful feature of GNU xargs (not present in POSIX):

find SomeNotExistingDir ... | xargs -r ls -1

The -r option is equivalent to the --no-run-if-empty option, whose name explains fairly precisely what it does. If you're using GNU find and GNU xargs, you should use the extensions -print0 and -0:

find SomeNotExistingDir ... -print0 | xargs -r -0 ls -1

This handles every character that can appear in a file name correctly.
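
The effect of -r is easy to demonstrate with empty input (assuming GNU xargs):

printf '' | xargs ls -1      # ls runs with no arguments and lists the current directory
printf '' | xargs -r ls -1   # ls is never run, so nothing is printed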

Upvotes: 1

anishsane

Reputation: 20980

The point is that when the first command fails there is no output for the second command, and no reason to execute it; the result of this behavior becomes unexpected.

If there is no output on stdout when the exit code is non-zero, then that fact itself can be used to drive the pipeline; there is no need to check the exit code (except for the optimization aspect, of course).

For example, if you ignore the optimization part and consider only correctness,

find SomeNotExistingDir|xargs ls -1

Can be changed to

find SomeNotExistingDir | while IFS= read -r x; do ls -1 "$x"; done

If find produces no output, the while loop itself still runs, but the commands inside it are never executed. The downside of this approach is that some information (like line numbers) will be lost for commands like awk/sed/head used in place of ls. Also, ls will be executed N times instead of once, as it would be with the xargs approach.
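
If the N invocations are a problem, one alternative sketch (assuming bash 4+ for mapfile, and file names without embedded newlines) collects the output first and runs ls once:

mapfile -t paths < <(find SomeNotExistingDir)
if [ "${#paths[@]}" -gt 0 ]; then
    ls -1 "${paths[@]}"    # single ls invocation over all results
fi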

Upvotes: 0

perreal

Reputation: 97948

You can't do that with pipes, because creating a pipe does not wait for the first command to complete; otherwise, how could cat | tr 'a-z' 'A-Z' work? You can simulate the pipeline with exit-status tests (&&) and temp files:

file1=$(mktemp)
file2=$(mktemp)
false > "$file1" && (echo 123 > "$file2") < "$file1" && (prog3 > "$file1") < "$file2" && #....
rm "$file1" "$file2"
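
Applied to the example from the question, the same pattern looks roughly like this (a sketch; xargs only runs when find succeeded):

file1=$(mktemp)
find SomeNotExistingDir > "$file1" && xargs ls -1 < "$file1"
rm -f "$file1"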

Upvotes: 0
