Alberto

Reputation: 2982

How to kill all the sub-processes of a bash script when it exits?

I am using named pipes to communicate between 3 kinds of processes: a producer, some readers and some writers. I created a script that simply runs all the processes, and I would like all the child processes to be killed when the run.sh process is killed or terminates. The run.sh script follows:

#!/bin/bash
MAX_WRITERS=2
MAX_READERS=2

trap 'jobs -p | xargs kill' EXIT

./producer.sh &
for ((i=1; i<=$MAX_READERS; i++)); do ./reader_one.sh & done
for ((i=1; i<=$MAX_WRITERS; i++)); do ./writer_one.sh & done
wait
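
A quick way to check which processes the trap will actually signal is to print the output of jobs -p before killing it; note that jobs -p only lists the jobs started directly by run.sh, not any processes those jobs spawn on their own. A debugging sketch, not part of the original script:

trap 'echo "killing: $(jobs -p)"; jobs -p | xargs kill' EXIT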

The script is very simple, but it doesn't seem to work: I can still see some processes running (only producer.sh has been killed).

$ ./run.sh
... some activities ...
^C
$ ps aux | grep "[b]ash ./"
user         48909   ... /bin/bash ./writer_one.sh
user         48906   ... /bin/bash ./writer_one.sh
user         48904   ... /bin/bash ./reader_one.sh
user         48901   ... /bin/bash ./reader_one.sh

So I have to kill them manually with this:

$ ps aux | grep "[b]ash ./" | awk '{print $2}' | xargs kill

How can I make it work?

Update 1: It seems the jobs run in the background with & can survive SIGINT, so a simple kill is useless. Using kill -9 or kill -TERM in the trap, the jobs do receive the signal, but some processes remain alive; only producer.sh has been killed properly.

./run.sh: line 21: 50039 Killed: 9               ./producer.sh
./run.sh: line 21: 50040 Killed: 9               ./reader_one.sh
./run.sh: line 21: 50041 Killed: 9               ./reader_one.sh
./run.sh: line 21: 50042 Killed: 9               ./writer_one.sh
./run.sh: line 21: 50043 Killed: 9               ./writer_one.sh

$ ps aux | grep "[b]ash ./"
user         50069   ... /bin/bash ./writer_one.sh
user         50068   ... /bin/bash ./writer_one.sh
user         50064   ... /bin/bash ./reader_one.sh
user         50061   ... /bin/bash ./reader_one.sh

Update 2: reader_one.sh and writer_one.sh are just waiting for data in a named pipe.

Update 3: Here is the code of reader_one.sh; the reason is probably the subshell created by the pipe just before the while:

init_read(){
...
}

read_one(){
...
}

init_read
while true; do
  cat to_read | while read line; do
    read_one $line;
  done
done
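
To confirm that the pipe introduces a subshell, here is a minimal sketch (not part of the original scripts) that compares $$ with $BASHPID inside a piped while loop; the two PIDs differ, and the second one is the extra bash process that shows up in ps:

echo "script shell: $$"
echo hello | while read line; do
  echo "loop shell: $BASHPID"
done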

Upvotes: 2

Views: 674

Answers (1)

Alberto

Reputation: 2982

The suggestion from @Barmar pointed me in the right direction. The problem was the subshell created by the piped while, so I changed this in writer_one.sh and reader_one.sh:

cat to_read | while read line; do
    read_one $line;
done

to this:

while read line; do
    read_one $line;
done < to_read
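
Put back into the loop from Update 3, the reader roughly becomes the following sketch (init_read and read_one are the placeholders from the question). Since there is no longer a pipeline, each reader is a single bash process, which is exactly the PID reported by jobs -p, so the trap in run.sh can now kill everything:

init_read
while true; do
  while read line; do
    read_one $line;
  done < to_read
done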

Upvotes: 1
