SvbZ3r0

Reputation: 638

urlretrieve - Return control to program when download fails/network error

I'm writing a script to download videos from a website using urlretrieve. My internet connection is somewhat erratic and goes down now and then.
When my network fails, urlretrieve hangs and doesn't pass control back to my program so I can handle the error.
How do I go about solving this problem?
Or should I use a different library for this purpose? If so, which one is best, considering that urllib's other features are more than sufficient for my use and the files I download are around 500-600 MB?

Upvotes: 0

Views: 399

Answers (1)

Georg Grab

Reputation: 2301

Use the requests library. Requests raises a ConnectionError exception when network problems arise. Refer to this Stack Overflow thread for how to go about downloading large files using requests.
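A minimal sketch of that approach (the URL, filename, and timeout value are placeholders/assumptions, not from your setup):

    import requests

    url = "http://example.com/video.mp4"  # placeholder URL

    try:
        # stream=True avoids loading the whole 500-600 MB file into memory;
        # timeout makes requests raise instead of hanging when the network drops
        with requests.get(url, stream=True, timeout=30) as response:
            response.raise_for_status()
            with open("video.mp4", "wb") as f:
                for chunk in response.iter_content(chunk_size=8192):
                    f.write(chunk)
    except requests.exceptions.ConnectionError:
        # network went down mid-download; control returns to your program here
        print("Connection error - handle retry/cleanup here")
    except requests.exceptions.Timeout:
        print("Connection timed out")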

If you're annoyed by the download starting all over again when the exception arises, look into the HTTP Range header, with which you'll be able to resume the download (provided you're saving the bytes already retrieved somewhere in your exception-handling code).
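A rough sketch of resuming with the Range header, assuming the server supports range requests and the partial file was kept on disk (again, URL and filename are placeholders):

    import os
    import requests

    url = "http://example.com/video.mp4"  # placeholder URL
    filename = "video.mp4"

    # Resume from however many bytes were saved before the failure
    resume_from = os.path.getsize(filename) if os.path.exists(filename) else 0
    headers = {"Range": "bytes={}-".format(resume_from)}

    with requests.get(url, headers=headers, stream=True, timeout=30) as response:
        # 206 Partial Content means the server honoured the Range request;
        # a plain 200 means it ignored it and is sending the whole file again
        mode = "ab" if response.status_code == 206 else "wb"
        with open(filename, mode) as f:
            for chunk in response.iter_content(chunk_size=8192):
                f.write(chunk)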

Upvotes: 1
