user3329732

Reputation: 343

How to kill all parallel Bash/Linux tasks on condition of single instance

I have a parallelised Bash function:

#!/bin/bash

touch out.txt
vals=$(seq 1 1 20)
task(){
    echo $1 >> out.txt
    x=$(wc -l out.txt | cut -d' ' -f 1) #read current length of file
    if [ $x > 5 ]
    then
    echo $x #NB prints even if <5!!
    KILL_ALL_JOBS_HERE_ON_IF_THEN_CONDITION
    fi
}

export -f task

echo $vals | xargs -P2 -I'{}' -d' ' bash -c 'task "$1"' bash {} 

The function starts to echo the twenty values to a single text file. However, when the file is longer than 5 lines, I would like all processes/instances to stop and not process anymore values from the variable. How can I do it? (NB a minor issue, the if-then statement is also weird - it echoes the processed value, $x, irrespective of whether it's >5).

Upvotes: 0

Views: 161

Answers (1)

KamilCuk

Reputation: 141483

[ $x > 5 ]

is the same as:

[ $x ] > 5 

it redirects the output of the [ command to a file named 5, just like echo > 5 does. The test itself then only checks whether $x is non-empty.
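A quick demonstration of the pitfall (the filename 5 here is just whatever follows the >):

```shell
x=3
[ $x > 5 ]              # runs [ 3 ] with stdout redirected; creates a file named "5"
echo "exit status: $?"  # → exit status: 0, because [ 3 ] (non-empty string) is true
ls 5                    # the file "5" now exists in the current directory
```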

You want:

[ "$x" -gt 5 ]

 KILL_ALL_JOBS_HERE_ON_IF_THEN_CONDITION

If you are using GNU xargs, then from man xargs:

If any invocation of the command exits with a status of 255, xargs will stop immediately without reading any further input. An error message is issued on stderr when this happens.

Just exit 255.
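Putting both fixes together, a sketch of the question's script with -gt and exit 255 (assuming GNU xargs):

```shell
#!/bin/bash
touch out.txt

task() {
    echo "$1" >> out.txt
    x=$(wc -l < out.txt)    # current number of lines in the file
    if [ "$x" -gt 5 ]; then
        echo "$x"
        exit 255            # GNU xargs stops reading further input
    fi
}
export -f task

seq 1 20 | xargs -P2 -I'{}' bash -c 'task "$1"' bash {}
```

Note that with -P2, a job that is already running when another exits 255 still finishes, so the file may end up a line or two past the threshold; exit 255 only prevents xargs from launching new jobs.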

Upvotes: 5
