Reputation: 51
I have an assignment where I have to read a list of websites from a text file (sites.txt) and check whether any of them have changed since the last time I ran the script. My input is:
https://en.wikipedia.org/wiki/Stack_Overflow
https://en.wikipedia.org/wiki/Linux
https://en.wikipedia.org/wiki/Linus_Torvalds
If any of the websites is down, it should print the address and the message FAILED to stderr, like this example output:
https://en.wikipedia.org/wiki/Stack_Overflow FAILED
Also, if a line in the input file starts with # it should be ignored as a comment. My attempt was to keep two HTML files, old.html and new.html, and check in an if statement whether the diff of the two files is non-empty. My problem is that the output is strangely different from what I expected, and the curl check always concludes that the websites are down. The output I get is:
FAILED/en.wikipedia.org/wiki/Stack_Overflow
FAILED/en.wikipedia.org/wiki/Linux
https://en.wikipedia.org/wiki/Linus_Torvalds FAILED
Here is my code:
#!/bin/bash
while read line || [ -n "$line" ]; do
[[ "$line" = "\#*" ]] && continue
if [ "$(curl -s --head --request GET "$line" | grep "200 OK" > /dev/null)" ]; then
mv new.html old.html 2> /dev/null
curl "$line" -L --compressed -s > new.html
DIFF_OUTPUT="$(diff new.html old.html)"
if [ "0" != "${#DIFF_OUTPUT}" ]; then
echo "$line Changed"
fi
else
echo "$line FAILED" >&2
fi
done <"$1"
Can anyone help me?
Upvotes: 1
Views: 3539
Reputation: 1
LINK="https://www.google.com"
# Use grep's exit status directly: "$(... | wc -l)" always expands to a
# non-empty string ("0" or "1"), so the original [ ... ] test was always true.
if curl -s --head --request GET "$LINK" | grep -q "200"; then
echo "okey"
else
echo "FALSE"
fi
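Note that modern servers often answer with a status line like "HTTP/2 200" rather than "HTTP/1.1 200 OK", so grepping for "200 OK" can report a healthy site as down. Matching only the status code at the start of the status line is more robust. A minimal sketch, with the response headers canned here so it runs without network access (a real check would pipe curl -sI "$LINK" into the same grep):

```shell
#!/bin/bash
# Canned response headers for illustration; substitute the output of
# `curl -sI "$LINK"` in a real check.
headers='HTTP/2 200
content-type: text/html'

# Match the status code on the status line, whatever the HTTP version is.
if printf '%s\n' "$headers" | grep -q '^HTTP/[0-9.]* 200'; then
    echo "okey"
else
    echo "FALSE"
fi
# prints: okey
```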
Upvotes: -2
Reputation: 362117
sites.txt has DOS line endings (\r\n) rather than UNIX line endings (\n). The \r carriage returns cause the cursor to move back to the first column, which is why FAILED overprints the start of each URL in your output. You need to either convert sites.txt to UNIX format or delete the carriage returns from $line.
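One way to handle this inside the loop is to strip a trailing carriage return from each line before using it. A minimal sketch (the sites.txt contents are created here only for illustration):

```shell
#!/bin/bash
# Create a sample sites.txt with DOS (\r\n) line endings, for illustration.
printf 'https://en.wikipedia.org/wiki/Linux\r\n# a comment\r\n' > sites.txt

while IFS= read -r line || [ -n "$line" ]; do
    line=${line%$'\r'}                          # delete a trailing \r, if any
    [[ $line == \#* || -z $line ]] && continue  # skip comments and blank lines
    printf '%s\n' "$line"
done < sites.txt
# prints: https://en.wikipedia.org/wiki/Linux
```

Alternatively, convert the file once with dos2unix sites.txt, or filter it with tr -d '\r' before the loop.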
Upvotes: 1