user1192422

Reputation: 151

bash: read text file, get HTTP headers, save results

I'm using Terminal on OS X 10.8. The task is to read a text file (one URL per line), fetch the HTTP headers for each URL, and save the results to another text file.

Tried this:

for line in cat ~/Desktop/a.txt; do curl -I line > ~/Desktop/b.txt;  done

plus multiple loop examples like

(while read l; do echo $l; done) < ~/Desktop/a.txt 

or

cat ~/Desktop/a.txt | while read CMD; do
echo $CMD
done

It seems I can't even write a simple loop. Please advise. Best,

Upvotes: 0

Views: 1045

Answers (1)

jens-na

Reputation: 2274

You can try something like this:

for i in $(cat test); do curl -I "$i" >> test2; done

This reads each URL in the file test and appends the curl header output to the file test2. Note the >> : your attempt used >, which truncates the output file on every iteration, so only the headers of the last URL survived. Your loop also had two smaller problems: `for line in cat ...` iterates over the literal words "cat" and the filename (you need command substitution, `$(cat ...)`), and `curl -I line` passes the literal word "line" instead of the variable `$line`.
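A `while read` loop is a bit more robust than `for i in $(cat ...)`, since it processes the file line by line instead of word by word. Here is a minimal sketch wrapped in a function; the file names `a.txt` and `b.txt` come from the question, and the function name `fetch_headers` is just illustrative:

```shell
#!/bin/sh
# Read one URL per line from the input file, fetch its HTTP headers
# with curl -I, and append everything to the output file.
fetch_headers() {
    # $1: file containing URLs (one per line), $2: output file
    : > "$2"                      # truncate the output file once, up front
    while IFS= read -r url; do
        curl -sI "$url" >> "$2"   # -s silences the progress meter
    done < "$1"
}

fetch_headers ~/Desktop/a.txt ~/Desktop/b.txt
```

Quoting `"$url"` keeps URLs with unusual characters intact, and `read -r` stops backslashes from being interpreted.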

Upvotes: 1
