aquavitae

Reputation: 19154

Define download size for wget or curl

As part of a bash script I need to download a file with a known file size, but I'm having issues with the download itself: the file only gets partially downloaded every time. The server I'm downloading from doesn't seem particularly well set up - it doesn't report the file size, so wget (which I'm using currently) doesn't know how much data to expect. However, since I know the exact size of the file, in theory I could tell wget what to expect. Does anyone know if there is a way to do this? I'm using wget at the moment, but I can easily switch to curl if that works better. I know how to adjust timeouts (which might help too) and retries, but I assume that for retries to work it needs to know the size of the file it's downloading.

I have seen a couple of other questions suggesting this might be a cookie problem, but that's not it in my case. The actual size downloaded varies from under 1 MB to 50 MB, so it looks more like some sort of dropped connection.
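Neither wget nor curl accepts an "expected size" option, but since the size is known in advance, one way to approximate this is a wrapper script that retries a resumable download until the file reaches that size. A minimal sketch (POSIX sh, assuming curl's `-C -` resume flag; the URL, filename, and byte count in the usage note are hypothetical placeholders, not values from the question):

```shell
#!/bin/sh
# Sketch of a size-checking retry loop around a resumable download.

# Succeeds once FILE exists and is at least SIZE bytes long.
download_complete() {
    actual=$(wc -c < "$1" 2>/dev/null || echo 0)
    [ "$actual" -ge "$2" ]
}

# Re-run a resumable curl download until the known size is reached.
# curl -C - continues from wherever the previous attempt stopped,
# so a dropped connection only costs the remaining bytes.
fetch_with_retries() {
    url=$1; out=$2; expected=$3
    attempts=0
    while ! download_complete "$out" "$expected"; do
        [ "$attempts" -ge 10 ] && return 1   # give up eventually
        attempts=$((attempts + 1))
        curl -fsS -C - -o "$out" "$url" || sleep 2
    done
}
```

It could then be called as, e.g., `fetch_with_retries "http://example.com/file.bin" file.bin 52428800` (all three values illustrative).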

Upvotes: 0

Views: 1145

Answers (1)

ezeed

Reputation: 58

Could you share the entire command so we can see which parameters you are using? It's a strange case, though. You could use the `-c` parameter, which resumes the download from the point where it stopped after a retry. Or you can try the `--spider` parameter, which checks whether the file exists and logs the file's information without downloading it.
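For reference, the two suggestions might look like this (the URL is a placeholder, not one from the question):

```shell
# Resume an interrupted download, with extra retries and a shorter
# per-operation timeout (all values illustrative).
wget -c --tries=10 --timeout=30 "http://example.com/file.bin"

# Probe the server without downloading: --spider only checks that the
# file exists; --server-response logs the HTTP headers, so you can see
# whether a Content-Length is being sent at all.
wget --spider --server-response "http://example.com/file.bin"
```

If `--server-response` shows no `Content-Length` header, that would confirm why wget can't tell how much data to expect.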

Upvotes: 1
