Reputation: 33
I'm trying to download a directory (including subdirectories) from a website. I'm using:
wget -r -e robots=off --no-parent --reject "index.html*" http://example.com/directory1/
The problem is that the server refuses connections after a while; I think I'm making too many requests in a short amount of time. So what I'd like to do is insert a wait time (5 seconds) between each download/lookup. Is that possible? If so, how?
Upvotes: 3
Views: 4177
Reputation: 27822
You can use --wait. From wget(1):
-w seconds
--wait=seconds
Wait the specified number of seconds between the retrievals. Use
of this option is recommended, as it lightens the server load by
making the requests less frequent. Instead of in seconds, the time
can be specified in minutes using the "m" suffix, in hours using
"h" suffix, or in days using "d" suffix.
Specifying a large value for this option is useful if the network
or the destination host is down, so that Wget can wait long enough
to reasonably expect the network error to be fixed before the
retry. The waiting interval specified by this function is
influenced by "--random-wait", which see.
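Putting that together with the command from the question, a minimal sketch might look like this (the URL and the 5-second delay are taken from the question; --random-wait is an optional extra that varies the base delay so the request timing looks less mechanical to rate limiters):

```shell
# Same recursive download as in the question, but pausing 5 seconds
# between retrievals (--wait=5). --random-wait multiplies that base
# delay by a random factor, which can help avoid tripping simple
# rate-limiting on the server.
wget -r -e robots=off --no-parent \
     --reject "index.html*" \
     --wait=5 --random-wait \
     http://example.com/directory1/
```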
I didn't know this either, but I found the answer in 15 seconds by reading the wget manpage: man wget, then / to search (I used /wait), and q to quit.
Upvotes: 7