Khaled Koubaa

Reputation: 92

requests is very slow and sometimes returns an error

I run this request:

import requests

url = 'https://www.yellowpages.com/boston-ma/mip/the-oceanaire-seafood-room-455904020'
r = requests.get(url)

but sometimes it takes a long time to return the Response object, and sometimes it returns an error:

SSLError: HTTPSConnectionPool(host='www.yellowpages.com', port=443): Max retries exceeded with url: /boston-ma/mip/the-oceanaire-seafood-room-455904020 (Caused by SSLError(SSLError("bad handshake: SysCallError(10054, 'WSAECONNRESET')")))

Usually, requests is extremely fast, but now, on this page and similar ones from this website, it's unstable. Do I need to use an alternative to requests, or is the issue with the webpage itself?

Upvotes: 0

Views: 1838

Answers (1)

vauhochzett

Reputation: 3377

You are being rate-limited. That means that the server counts how often you request data, and stops responding after you have sent a certain number of requests.

The easiest solution to this is to just wait for a bit and then try again. Normally, this resets after a certain amount of time has passed.
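For example, you could catch the error and retry with an increasing delay. This is only a minimal sketch; the URL, the number of attempts, and the delays are placeholders to adjust for your case:

import time
import requests

url = "..."  # Add your URL here

response = None
for attempt in range(5):
    try:
        response = requests.get(url)
        break  # Success, stop retrying
    except requests.exceptions.RequestException:
        # Covers connection resets and SSL errors raised by requests
        time.sleep(2 ** attempt)  # Wait 1, 2, 4, 8 seconds before retrying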

If your script sends multiple requests, try adding a delay between your requests, e.g. with time.sleep():

import time
import requests

for _ in range(10):
    requests.get(url)
    time.sleep(1)  # One second of delay. You may need to increase this.

Using a Session

If you send multiple requests in a row, you can speed them up by using a requests.Session. This keeps the connection to the server open and configured, and also persists cookies as a nice benefit. Try this (source):

import requests
session = requests.Session()
url = "..."  # Add your URL here
for _ in range(10):
    session.get(url)
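If you combine this with the rate-limiting advice above, you can also let the session retry failed requests for you by mounting an HTTPAdapter with a retry configuration. This is just a sketch; the retry count, backoff factor, and status codes are assumptions you would tune for the target site:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()

# Retry up to 3 times, waiting progressively longer between attempts,
# and also retry on common "too many requests" / server error responses.
retries = Retry(total=3, backoff_factor=1, status_forcelist=[429, 500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))

url = "..."  # Add your URL here
for _ in range(10):
    session.get(url)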

Didn't solve your problem?

If that did not solve your issue, I have collected some other possible solutions here.

Upvotes: 1
