I have a parent script that executes a child script in the background:
#!/bin/bash
# parent.sh
childScript $param1 $param2&
Child script:
#!/bin/bash
# childScript.sh
param1=$1
param2=$2
someLinuxCommand $param1 $param2
out=$?
echo $out
If I execute childScript.sh directly with correct $param1 and $param2, $? returns 0. If $param1 and $param2 are incorrect, $? returns 1.
But no matter what $param1 and $param2 I send via parent.sh, $? always returns 0. Why does $? in childScript.sh return 0 when I send incorrect $param1 and $param2 from parent.sh?
In your child script you are "returning" the result of echo, which will always be 0. You should be using exit $? instead, or just leave that line out altogether.
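A minimal sketch of the corrected child script, using grep as a stand-in for the question's placeholder someLinuxCommand (the /tmp paths are just there to make the example self-contained):

```shell
#!/bin/bash
# Sketch of the corrected child script. grep stands in for the
# question's placeholder someLinuxCommand; writing it to /tmp
# just keeps the example runnable in one piece.
cat > /tmp/childScript.sh <<'EOF'
#!/bin/bash
param1=$1
param2=$2
grep "$param1" "$param2"      # stand-in for someLinuxCommand
exit $?                       # propagate grep's status, don't echo it
EOF
chmod +x /tmp/childScript.sh

printf 'bill\n' > /tmp/file           # sample data file
/tmp/childScript.sh bill /tmp/file    # prints "bill", exits 0
echo "exit status: $?"                # prints "exit status: 0"
```

With exit $?, the script's exit status is the command's status, so a failed match now surfaces as 1 to the caller.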
Here is an example that apes your scripts:
$ cat parent.sh
#!/bin/bash
p1=$1
p2='file'
./child.sh $p1 $p2
$ cat child.sh
#!/bin/sh
grep $1 $2
out=$?
echo $out
The child script will grep for the pattern in the file. Here are the contents of the file "file":
$ cat file
c.sh
file
in.txt
p.sh
bill
If grep finds the pattern in the file, grep succeeds, thus setting $? to 0. But if grep does not find the pattern in the file, grep fails, thus setting $? to 1.
Here we run parent with a pattern of "bob":
$ ./parent.sh bob
1
grep did not find bob in the file and thus sets $? to 1. echo outputs 1 and then sets $? to 0.
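That reset can be seen in isolation (a minimal sketch, relying only on standard shell behaviour):

```shell
#!/bin/bash
# echo always succeeds, so it overwrites the previous status in $?.
false && true        # leaves $? set to 1 (false short-circuits the list)
echo "$?"            # prints 1 -- but this echo itself exits 0
echo "$?"            # prints 0: the first echo reset $?
```

This is exactly what happens at the end of the child script: the failure status exists briefly, but the echo that prints it replaces it with 0.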
$ echo $?
0
Let's fix the child.sh script to be:
$ cat child.sh
#!/bin/sh
grep $1 $2
and run parent.sh again:
$ ./parent.sh bob
$ echo $?
1
$ ./parent.sh bill
bill
$ echo $?
0
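One more wrinkle from the original parent.sh: it launches the child with &, so $? on the next line only reflects whether the job was started. To see the child's real exit status the parent must wait for it. A sketch (false stands in for a childScript run with bad parameters; the variable names are illustrative):

```shell
#!/bin/bash
# The question's parent.sh backgrounds the child with "&"; $? on the
# next line is not the child's result. wait retrieves the real status.
# "false" stands in for a childScript invocation that fails.
false &               # background child that will exit 1
pid=$!
if wait "$pid"; then
    echo "child succeeded"
else
    echo "child exited with $?"   # prints: child exited with 1
fi
```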