TheBandit

Reputation: 109

Python requests efficiency

I am using requests to get the final url of a link:

r = requests.get(link)
link = r.url

While this does work, it is VERY slow. I parse through a very large number of links, and this code makes my program significantly slower. Any link that has a redirect takes a good 5+ seconds to resolve to its final URL. Is there a more efficient way to do this?

Edit: Am I using grequests incorrectly?

>>> r = grequests.get('http://muhlenberg.edu/main/campuslife/sye/index.html/')
>>> print r.url
http://muhlenberg.edu/main/campuslife/sye/index.html/
>>> r = requests.get('http://muhlenberg.edu/main/campuslife/sye/index.html/')
>>> print r.url
http://www.muhlenberg.edu/errMsg/notFound.html

Upvotes: 1

Views: 148

Answers (1)

lucasnadalutti

Reputation: 5948

Well, if some links genuinely take 5 or more seconds to respond, there's not much you can do about those individual requests. For your program as a whole, though, making the requests asynchronously with grequests might be a big improvement, as sketched below.
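One note on the edit in the question: grequests.get() only builds an AsyncRequest object; nothing is actually sent until you call grequests.map() (or grequests.imap()), which is why r.url there still shows the original link. Here is a minimal sketch of resolving final URLs concurrently, assuming the grequests package is installed and using a placeholder link list:

import grequests

links = [
    'http://muhlenberg.edu/main/campuslife/sye/index.html/',
    # ... the rest of your links
]

# Build the (unsent) requests, then send them all concurrently.
pending = (grequests.get(link, timeout=10) for link in links)
responses = grequests.map(pending, size=20)  # size caps how many run at once

# Failed requests come back as None; r.url reflects any redirects followed.
final_urls = [r.url for r in responses if r is not None]
print(final_urls)

With a large list of links, the total time is roughly bounded by the slowest request in each batch rather than the sum of all of them.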

Upvotes: 2
