Reputation: 105133
I have a pages.txt
file with 100 URLs inside. I want to check them one by one and fail on the first problem. This is what I'm doing:
cat pages.txt | xargs -n 1 curl --silent \
--output /dev/null --write-out '%{url_effective}: %{http_code}\n'; echo $?
The exit code is 1, but I only see it after the entire file has been processed. How can I stop earlier, on the first problem?
Upvotes: 56
Views: 19970
Reputation: 579
I haven't found a way to do what you ask for with xargs, but a loop with read might be what you are looking for.
while read -r URL; do
    curl --silent \
        --output /dev/null --write-out '%{url_effective}: %{http_code}\n' "$URL"
    RET=$?
    echo $RET
    if [ $RET -ne 0 ]; then break; fi
done < pages.txt
Note that curl exits 0 even for HTTP error statuses such as 404; add --fail if you want server errors to produce a nonzero exit code and trigger the break.
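The stop-on-first-failure behavior of this loop can be checked offline by swapping the curl call for a stand-in test. In this sketch, the made-up entry `bad` simulates a URL that curl would fail on, and the temp file path is arbitrary:

```shell
# Demo of the stop-on-first-failure read loop, using a stand-in
# check instead of curl so it runs without network access.
printf '%s\n' ok1 bad ok2 > /tmp/pages_demo.txt

while read -r URL; do
    [ "$URL" != bad ]    # stand-in for the curl call; fails on "bad"
    RET=$?
    echo "$URL: $RET"
    if [ $RET -ne 0 ]; then break; fi
done < /tmp/pages_demo.txt
# Prints "ok1: 0" and "bad: 1", then stops; ok2 is never checked.
```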
Upvotes: 3
Reputation: 8531
xargs -n 1 sh -c '<your_command> "$0" || exit 255' < input
xargs -n 1 sh -c 'curl --silent --output /dev/null \
--write-out "%{url_effective}: %{http_code}\n" "$0" || exit 255' < pages.txt
For every URL in pages.txt, this executes sh -c 'curl ... $0 || exit 255' one at a time (-n 1), forcing the shell to exit with status 255 whenever the curl command fails.
From man xargs:
If any invocation of the command exits with a status of 255, xargs will stop immediately without reading any further input. An error message is issued on stderr when this happens.
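The early stop can be verified offline with the same substitution trick: a made-up entry `bad` stands in for a failing URL, and `exit 255` halts xargs before the remaining input is read. (With GNU xargs, xargs itself then exits with status 124.)

```shell
# Verify that exit 255 makes xargs stop reading input early:
# "ok2" is never echoed because processing halts at "bad".
printf '%s\n' ok1 bad ok2 > /tmp/pages_demo.txt

xargs -n 1 sh -c 'echo "$0"; [ "$0" != bad ] || exit 255' \
    < /tmp/pages_demo.txt 2>/dev/null
# Prints ok1 and bad, then stops; ok2 is never processed.
```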
Upvotes: 85