Alois Mahdal

Reputation: 11243

Is it possible to suppress "skipping" behavior between wget tries?

I'm using wget to download a set of files via HTTP, using one wget call per URL, in a simple cmd.exe batch.

Also, I alternate between mirrors randomly and want to keep a separate tree for each mirror, like:

http://server06//files/file1.txt  -> temp\server06\files\file1.txt
http://server03//files/file65.txt -> temp\server03\files\file65.txt

What I do now is:

echo !url! | .\runners\wget.exe --tries=3 --force-directories --directory-prefix=.\temp\ --input-file=-

Sometimes, for some reason, the server closes the TCP connection. I'm using --tries=3 to work around this. In that case, wget's default behavior is to skip the bytes it has already downloaded and continue from that point, something like this:

2011-07-19 13:24:52 (68.1 KB/s) - Connection closed at byte 65396. Retrying.

--2011-07-19 13:24:54--  (try: 3) 
http://server06//files/filex.txt
Connecting to server|10.10.0.108|:80... failed: Unknown error.
Resolving server... 10.10.0.108
Connecting to server|10.10.0.108|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 166400 (163K), 101004 (99K) remaining [text/plain]
Saving to:
`./temp/server06/files/filex.txt'

        [ skipping 50K ]
    50K ,,,,,,,,,, ,,,....... .......... .......... .......... 61% 2.65M 0s
   100K .......... .......... .......... .......... .......... 92% 1.62M 0s
   150K .......... ..                                         100% 1.64M=0.06s

utime(./temp/server06/files/filex.txt):
Permission denied
2011-07-19 13:25:15 (1.72 MB/s) -
`./temp/server06/files/filex.txt'
saved [166400/166400]

My problem is that I don't want wget to download the file in two parts. I want wget to retry, but if any attempt fails for any reason, I want it to start over (even at the cost of not downloading the file at all!).

The background is that I'm testing code in a filter driver that is exercised only if the file is downloaded in one piece. My tests fail because of this behavior.

Question is: is it possible to suppress this behavior? I.e. make wget retry as many times as configured by a parameter, with each attempt either downloading the complete file or nothing at all?

Or should I look for another workaround?
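One possible workaround, sketched here in POSIX shell (the same logic ports to a cmd.exe batch with an errorlevel check and a goto loop), is to run wget with --tries=1 so it never resumes on its own, and handle the retries yourself: on any failure, delete the partial file before the next attempt so wget starts again from byte zero. The retry count and the `fetch` function standing in for the actual wget invocation are illustrative assumptions, not something from the original question.

```shell
#!/bin/sh
# Retry a download up to MAX_TRIES times; each failed attempt deletes the
# partial file so the next attempt starts over from byte zero.
#
# "fetch" is a placeholder for the real wget call, e.g.:
#   fetch() { wget --tries=1 --force-directories \
#             --directory-prefix=./temp/ "$1"; }

MAX_TRIES=3

download_whole() {
    url=$1
    out=$2
    i=1
    while [ "$i" -le "$MAX_TRIES" ]; do
        if fetch "$url"; then
            return 0            # complete file is on disk
        fi
        rm -f "$out"            # discard the partial download
        i=$((i + 1))
    done
    return 1                    # every attempt failed
}
```

With this structure an interrupted attempt leaves nothing behind, so each try is either a complete file or zero bytes, which matches the all-or-nothing requirement above.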

Upvotes: 0

Views: 879

Answers (1)

Pete Wilson

Reputation: 8694

I am sure you will be happier with the libcurl library. It takes just one call per URL, and libcurl does all the rest of the work. On top of that, there's first-rate support for the package.

The particular case you're having trouble with won't be a problem using libcurl.

HTH

Upvotes: 1
