Reputation: 1453
I'm running through a bunch of data like this:
for id in `cat ids.list` ; do echo $id ; bin/migrate.pl --id $id ; done
I'm having a problem where after a couple hundred $ids the migrate.pl script dies with a timeout error from my backend (though I'm not sure I believe the error message).
But when the migrate.pl script dies, the whole bash for-loop stops too. I would expect bash to continue on to migrate.pl with the next $id. How can the script be killing the for-loop? I find that surprising, I'm not able to reproduce it with any other mechanism, and I wonder if it might be related to my problem.
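For contrast, here is a minimal sketch of the behavior I would expect (using `false` as a stand-in for a failing migrate.pl run; the ids.list name is as above): a child that merely exits nonzero does not stop the loop, so a plain failure alone doesn't explain what I'm seeing.

```shell
# Three dummy ids, then a loop whose body always fails.
printf '%s\n' a b c > ids.list
for id in $(cat ids.list); do
    echo "$id"
    false   # stand-in for a failing bin/migrate.pl --id "$id"
done
# Prints a, b, c: the loop continues past each nonzero exit.
rm -f ids.list
```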
Upvotes: 1
Views: 105
Reputation: 11603
The command substitution buffers the entire output of cat ids.list in memory before the loop even starts, and may run into memory or argument-list limits for large files. You should instead read the file line by line, something like
while read -r line; do
    for id in $line; do
        echo "$id"
        bin/migrate.pl --id "$id"
    done
done < ids.list
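One caveat with this pattern (my own addition, not part of the answer above): the loop body shares its standard input with the while loop, so if bin/migrate.pl ever reads from stdin it will silently consume the rest of ids.list. Redirecting the body's stdin from /dev/null avoids that. A sketch, using cat as a stand-in for a stdin-reading migrate.pl:

```shell
# Three dummy ids; cat here stands in for a script that reads stdin.
printf '%s\n' 1 2 3 > ids.list
while read -r id; do
    echo "processing $id"
    # Without "< /dev/null", cat would swallow the remaining ids
    # from the shared stdin and the loop would run only once.
    cat > /dev/null < /dev/null
done < ids.list
# Prints "processing 1", "processing 2", "processing 3".
rm -f ids.list
```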
Upvotes: 1