
Reputation: 363

Multithreading in bash scripting

I run a bash script that loops over each line of a text file and cURLs the site listed on that line.

Here is my script:

SECRET_KEY='zuhahaha'
FILE_NAME=""

case "$1" in
        "sma")     
            FILE_NAME="sma.txt"
        ;;
        "smk")      
            FILE_NAME="smk.txt"
        ;;
        "smp")      
            FILE_NAME="smp.txt"
        ;;
        "sd")      
            FILE_NAME="sd.txt"
        ;;
     *)
        echo "not in case !"
        ;;
esac

function save_log()
{
    printf '%s\n' \
    "Header Code    : $1" \
    "Executed at    : $(date)" \
    "Response Body  : $2" \
    "====================================================================================================="$'\r\n\n'  >> output.log
}

while IFS= read -r line; 
    do 
        HTTP_RESPONSE=$(curl -L -s -w "HTTPSTATUS:%{http_code}\\n" -H "X-Gitlab-Event: Push Hook" -H 'X-Gitlab-Token: '$SECRET_KEY --insecure $line 2>&1) &
        HTTP_BODY=$(echo $HTTP_RESPONSE | sed -e 's/HTTPSTATUS\:.*//g') &
        HTTP_STATUS=$(echo $HTTP_RESPONSE | tr -d '\n' | sed -e 's/.*HTTPSTATUS://') &

        save_log "$HTTP_STATUS" "$HTTP_BODY" &
done < $FILE_NAME

How can I run the requests in parallel, or otherwise make the loop faster in bash?

Upvotes: 1

Views: 787

Answers (2)

dr-who

Reputation: 189

My favourite way to do this is to generate a file that lists all the commands you wish to perform. If you have a script that performs your operations, create a file like:

$ cat commands.txt
echo 1
echo 2
echo $((12+3))
....

For example this could be hundreds of commands long.

To execute each line in parallel, use the parallel command with, say, at most 3 jobs running in parallel at any time.

$ cat commands.txt | parallel -j 3
1
2
15

For your curl example you could generate thousands of curl commands and execute them, say, 30 in parallel at any one time.
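A minimal sketch of that approach (the file names, URLs, and the exact curl flags here are placeholders, not taken from the question):

```shell
#!/bin/bash
# Sketch: build a list of curl commands, one per URL, then hand the
# whole list to GNU parallel. urls.txt and example.com are stand-ins.
SECRET_KEY='zuhahaha'

printf '%s\n' 'https://example.com/a' 'https://example.com/b' > urls.txt

# Emit one curl invocation per line of urls.txt into commands.txt
while IFS= read -r url; do
    printf 'curl -L -s -H "X-Gitlab-Token: %s" --insecure "%s"\n' \
        "$SECRET_KEY" "$url"
done < urls.txt > commands.txt

# Then run them, at most 30 at once (needs GNU parallel installed):
#   parallel -j 30 < commands.txt
```

The nice part of this pattern is that `commands.txt` doubles as a record of exactly what was run, which you can inspect or re-run later.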

Upvotes: 1

prushik

Reputation: 331

You should be able to do this relatively easily. Don't background each command individually; instead, put the body of your while loop into a subshell and background that. That way, your commands (which clearly depend on each other) run sequentially within each iteration, but all the lines in the file can be processed in parallel.

while IFS= read -r line; 
    do
       (
        HTTP_RESPONSE=$(curl -L -s -w "HTTPSTATUS:%{http_code}\\n" -H "X-Gitlab-Event: Push Hook" -H "X-Gitlab-Token: $SECRET_KEY" --insecure "$line" 2>&1)
        HTTP_BODY=$(echo "$HTTP_RESPONSE" | sed -e 's/HTTPSTATUS:.*//g')
        HTTP_STATUS=$(echo "$HTTP_RESPONSE" | tr -d '\n' | sed -e 's/.*HTTPSTATUS://')

        save_log "$HTTP_STATUS" "$HTTP_BODY" ) &
done < "$FILE_NAME"
wait    # make sure all backgrounded requests finish before the script exits
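One caveat: the loop above starts one subshell per line, all at once. If the file is large you may want to cap the number of concurrent workers. A rough sketch using bash's own job control (`wait -n` needs bash 4.3+; `input.txt` and the `sleep` are placeholders standing in for the real file and the curl work):

```shell
#!/bin/bash
# Sketch: the same background-a-subshell pattern, throttled to at most
# MAX_JOBS concurrent workers. input.txt and sleep are stand-ins.
MAX_JOBS=30
printf '%s\n' one two three four > input.txt

while IFS= read -r line; do
    (
        sleep 0.1               # the curl + save_log body would go here
        echo "done: $line"
    ) &
    # If MAX_JOBS workers are already running, wait for one to exit
    while [ "$(jobs -rp | wc -l)" -ge "$MAX_JOBS" ]; do
        wait -n
    done
done < input.txt

wait    # let the last batch finish before the script exits
```

This keeps everything in plain bash, at the cost of slightly more bookkeeping than handing the work to GNU parallel.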

Upvotes: 3
