cbix

Reputation: 498

AWK: execute CURL on each line and parse result

given an input stream with following lines:

123
456
789
098
...

I would like to call

curl -s http://foo.bar/some.php?id=xxx

with xxx being the number from each line, and each time have an awk script extract some information from the curl output and write it to the output stream. I am wondering whether this is possible without resorting to awk's system() call, as in:

grep "^[0-9]*$" lines | awk '
    {
        system("curl -s http://foo.bar/some.php?id=" $0 \
            " | awk \"{ print } # parsing\"")
    }'
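For the record, awk itself can read a command's output without system(): the idiom `cmd | getline line` runs the command and reads its stdout line by line, so no nested awk is needed. A minimal offline sketch, where `echo` stands in for the curl call (the URL and the response format are placeholders):

```shell
# Read each id, run a command per id, and parse its output inside awk.
# "echo response-for-..." stands in for:
#   cmd = "curl -s http://foo.bar/some.php?id=" $0
result=$(printf '%s\n' 123 456 789 | awk '
    /^[0-9]+$/ {
        cmd = "echo response-for-" $0
        while ((cmd | getline line) > 0)
            print "parsed:", line   # do your parsing here
        close(cmd)                  # close so the next id reruns the command
    }')
printf '%s\n' "$result"
```

Closing the command after each id matters: without `close(cmd)`, awk would keep the first pipe open and never rerun the command for later ids.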

Upvotes: 7

Views: 15234

Answers (3)

user4401178

Reputation:

If the file has ids appended to it dynamically, you can daemonize a small while loop that keeps checking the file for new data, like this:

while IFS= read -d $'\n' -r a || sleep 1; do [[ -n "$a" ]] && curl -s "http://foo.bar/some.php?id=${a}"; done < lines.txt

Otherwise, if the file is static, change the sleep 1 to break and the loop will read the file and exit when no data is left. A useful pattern to know.

Upvotes: 0

Foo Bah

Reputation: 26271

You can use bash and avoid awk's system() call:

grep "^[0-9]*$" lines | while read -r line; do
    curl -s "http://foo.bar/some.php?id=$line" | awk '... your parsing ...'
done
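A runnable sketch of this pattern, with `echo` standing in for the curl call and a hypothetical `id=... status=...` response format; the awk filter pulls out the second field:

```shell
# Filter ids, "fetch" each one, and parse the response per line.
# echo stands in for: curl -s "http://foo.bar/some.php?id=$line"
parsed=$(printf '%s\n' 123 456 abc |
    grep "^[0-9]*$" |
    while read -r line; do
        echo "id=$line status=ok" | awk -F'[= ]' '{ print $2 }'
    done)
printf '%s\n' "$parsed"
```

Note that the grep drops the non-numeric line, so only the two valid ids reach the loop.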

Upvotes: 1

Tony

Reputation: 17

A shell loop would achieve a similar result, as follows:

#!/bin/bash
for f in $(grep "^[0-9]*$" lines); do
    curl -s "http://foo.bar/some.php?id=$f" | awk '{....}'
done

Alternative methods for doing similar tasks include using Perl or Python with an HTTP client.

Upvotes: 0
