scandalous

Reputation: 912

python requests is slow

I am developing a download manager and am using the requests module in Python to check for valid links (and, hopefully, to catch broken ones). My code for checking a link is below:

import requests

url = 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe'
r = requests.get(url, allow_redirects=False)  # this line takes 40 seconds
if r.status_code == 200:
    print("link valid")
else:
    print("link invalid")

Now, the issue is that this check takes approximately 40 seconds, which is huge. My question is: how can I speed this up, maybe using urllib2 or something?

Note: also, if I replace url with the actual URL, which is 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe', the check takes about one second, so it appears to be an issue with requests.

Upvotes: 5

Views: 19853

Answers (2)

michaelmeyer

Reputation: 8215

Not all hosts support HEAD requests. You can use this instead:

r = requests.get(url, stream=True)

With stream=True, only the headers are downloaded; the response content is not fetched until you access it. Moreover, if the idea is to download the file afterwards, you don't have to make another request.

See here for more info.
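A minimal sketch of this approach as a reusable check (the function name, the timeout, and the use of the response as a context manager are my additions, not part of the answer):

```python
import requests

def link_ok(url, timeout=10):
    """Check a link without downloading its body.

    With stream=True, requests fetches only the status line and headers;
    the response body is not downloaded until .content or iter_content()
    is accessed. Closing the response (via the `with` block) releases
    the connection without pulling down the file.
    """
    with requests.get(url, stream=True, allow_redirects=False,
                      timeout=timeout) as r:
        return r.status_code == 200
```

If you do decide to download the file afterwards, you can keep the same response object and read it with `r.iter_content(chunk_size=...)` instead of issuing a second request.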

Upvotes: 12

Jon Clements

Reputation: 142226

Don't use get, which actually retrieves the file; use:

r = requests.head(url, allow_redirects=False)

which goes from 6.9 seconds on my machine to 0.4 seconds.
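A short sketch of the HEAD-based check wrapped in a function (the function name and the timeout parameter are my additions; note that, as the other answer points out, some hosts reject HEAD requests, so a 405 response here does not necessarily mean the link is broken):

```python
import requests

def check_link(url, timeout=10):
    # HEAD asks the server for the status line and headers only, so the
    # file body never crosses the wire, which is where the speedup
    # over a plain GET comes from.
    r = requests.head(url, allow_redirects=False, timeout=timeout)
    return r.status_code == 200
```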

Upvotes: 11
