Mena Ortega

Reputation: 587

Shell script using curl to loop through urls

I've been trying to write a simple script that takes a list of queries from a .txt file, appends each one to a base URL variable, fetches the page, and writes the content to a text file.

Here's what I have so far:

#!/bin/bash

url="example.com/?q="
for i in $(cat query.txt); do
    content=$(curl -o $url $i)
    echo $url $i
    echo $content >> output.txt
done

query.txt:

images
news
stuff
other

error log:

curl: (6) Could not resolve host: other; nodename nor servname provided, or not known
example.com/?q= other

If I use this command straight from the command line I get some output into the file:

curl -L http://example.com/?q=other >> output.txt

Ultimately I would like the output to be:

fetched:    http://example.com/?q=other
content:    the output of the page

followed by the next query in the list.
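For reference, a minimal sketch of a loop that produces that labeled format (the base URL, file names, and the `fetch_all` helper are placeholders, not part of the original script):

```shell
#!/bin/bash
# Sketch: read queries one per line, fetch each page, and write
# "fetched:" / "content:" pairs to an output file.
url="http://example.com/?q="

fetch_all() {
    # $1: query file, $2: output file
    while IFS= read -r query; do          # read each line verbatim
        full="${url}${query}"
        content=$(curl -sL "$full")       # -s: silent, -L: follow redirects
        printf 'fetched:    %s\n' "$full"    >> "$2"
        printf 'content:    %s\n' "$content" >> "$2"
    done < "$1"
}
```

Usage would be `fetch_all query.txt output.txt`. Using `while read` instead of `for i in $(cat ...)` keeps each line intact even if a query ever contains spaces.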

Upvotes: 29

Views: 94328

Answers (2)

David George

Reputation: 3752

The problem is your `curl` invocation: `-o $url` tells curl to use the URL as an output *filename*, so it then tries to resolve `$i` (e.g. `other`) as a host. Try something like this:

#!/bin/bash

url=https://www.google.fr/?q=
while IFS= read -r query
do
    content=$(curl -s "${url}${query}")
    echo "$query"
    echo "$content" >> output.txt
done < query.txt
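If a query can contain spaces or other special characters, it also needs URL-encoding before it goes into the query string. One way (a sketch, not part of the answer above; the base URL and `fetch_encoded` name are placeholders) is to let curl build the query string itself:

```shell
#!/bin/bash
# Sketch: curl -G turns --data-urlencode pairs into a URL-encoded
# GET query string appended to the base URL.
url="https://www.google.fr/"

fetch_encoded() {
    # $1: query file, $2: output file
    while IFS= read -r query; do
        curl -sG --data-urlencode "q=${query}" "$url" >> "$2"
    done < "$1"
}
```

With this, a line like `hello world` is sent as `?q=hello%20world` instead of breaking the request.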

Upvotes: 5

Gilles Quénot

Reputation: 184985

Use more quotes!

Try this instead:

url="example.com/?q="
for i in $(cat query.txt); do
    content="$(curl -s "${url}${i}")"
    echo "$content" >> output.txt
done
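To see why the quotes around `$content` matter, here is a small illustration (the sample string is made up): an unquoted expansion is word-split on whitespace, so the newlines of a fetched page collapse into single spaces, while the quoted form preserves them.

```shell
#!/bin/bash
# content holds two lines, as a fetched page would
content=$'line one\nline two'

unquoted=$(echo $content)     # word-split: the newline becomes a space
quoted=$(echo "$content")     # quoted: the newline is preserved
```

This is exactly what would mangle the page body in `echo $content >> output.txt`.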

Upvotes: 37
