Shabbir

Reputation: 23

Testing a list of URLs with Bash

I have a limited understanding of cron commands. I have tried using one particular URL, but I would like the script to take a comma-delimited list of URLs, check their HTTP response every 5 minutes for 30 minutes, and then output the result. My guess is:

5 * * * * /usr/bin/wget "www.example.com" --timeout 30 -O - 2>/dev/null  | \
          grep "Normal operation string" || echo "The site is down" | \
          /usr/bin/mail -v -s "Site is down" [email protected]

But this works only for a single predefined website.

Upvotes: 2

Views: 2825

Answers (1)

miken32

Reputation: 42711

You're rather limiting yourself by trying to fit this onto one line. Just put your commands into a script, save it somewhere as executable, and then call that script from the cron file.
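As for the schedule: note that 5 * * * * runs once per hour, at minute 5, not every 5 minutes. Assuming you save the script as /usr/local/bin/check_sites.sh (a hypothetical path; use your own), an entry like this runs it every 5 minutes, and cron's range-with-step syntax can approximate "every 5 minutes for 30 minutes" within each hour:

# /usr/local/bin/check_sites.sh is a hypothetical path; adjust to taste
*/5 * * * * /usr/local/bin/check_sites.sh
# Or only during the first 30 minutes of each hour:
# 0-30/5 * * * * /usr/local/bin/check_sites.sh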

To get multiple sites, just set them up in an array and then iterate over it:

#!/bin/bash
# The sites to monitor
declare -a sites
sites=("www.example.com" "www.example.org" "www.example.net")
for site in "${sites[@]}"; do
    # Fetch the page; grep -q stays silent and only signals
    # whether the expected string was found
    if ! wget "$site" --timeout 30 -O - 2> /dev/null | grep -q "Normal operation string"; then
        # Name the failing site in the alert, since we check several
        echo "$site is down" | mail -v -s "Site is down: $site" [email protected]
    fi
done
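To try the script by hand before wiring it into cron (the filename here is just an example):

chmod +x check_sites.sh
./check_sites.sh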

Or you can read them in from a file. This assumes they are stored in sites.txt, one site per line:

#!/bin/bash
# Read sites from sites.txt, one per line
while read -r site; do
    if ! wget "$site" --timeout 30 -O - 2> /dev/null | grep -q "Normal operation string"; then
        echo "$site is down" | mail -v -s "Site is down: $site" [email protected]
    fi
done < sites.txt
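And if you'd rather keep the comma-delimited list from your question, you can split it into an array first. This is just a sketch; SITES stands in for wherever that list actually comes from:

#!/bin/bash
# A comma-delimited list, as described in the question
SITES="www.example.com,www.example.org,www.example.net"
# Split the list on commas into an array
IFS=',' read -r -a sites <<< "$SITES"
for site in "${sites[@]}"; do
    if ! wget "$site" --timeout 30 -O - 2> /dev/null | grep -q "Normal operation string"; then
        echo "$site is down" | mail -v -s "Site is down: $site" [email protected]
    fi
done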

Upvotes: 5
