Reputation: 1021
I have some code that uses requests
to get a response from an API and uploads the result into a database. I've built a custom Request
class so I can adjust the BACKOFF_MAX
variable. By way of an MRE:
from requests import Session
from requests.adapters import HTTPAdapter, Retry
from time import sleep

class RetryRequest(Retry):
    def __init__(self, backoff_max: int, **kwargs):
        super().__init__(**kwargs)
        self.BACKOFF_MAX = backoff_max

session = Session()
retries_spec = RetryRequest(
    total=25,
    backoff_factor=0.25,
    backoff_max=128,
)
session.mount("https://", HTTPAdapter(max_retries=retries_spec))

headers = {"User-Agent": "*", "Referer": None}
session.headers.update(headers)

while True:
    response = session.get(<url>)
    # load response into database
    sleep(60)
This works fine for hundreds of iterations of the loop, but every so often I get this error:
http.client.RemoteDisconnected: Remote end closed connection without response
Could someone tell me whether this is the result of the retries being exhausted or whether I need some additional error handling here?
Upvotes: 1
Views: 4170
Reputation: 93
In my opinion this is more of a server-side error. It can be triggered for many reasons, often as a defence against DDoS attacks, which is why I suggest wrapping the GET request in a try/except: make your process sleep in the except branch and then retry.
OR
You can also try a higher backoff factor.
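The try/except-then-retry idea could be sketched like this (the helper name, attempt count, and wait time are illustrative, not from your code; `requests` wraps `http.client.RemoteDisconnected` in a `requests.exceptions.ConnectionError`, which is what you would catch):

```python
from time import sleep

import requests


def get_with_recovery(session, url, attempts=3, wait=60):
    """Retry a GET when the remote end drops the connection.

    requests surfaces http.client.RemoteDisconnected as a
    requests.exceptions.ConnectionError, so catching that covers
    the error in the question.
    """
    for attempt in range(attempts):
        try:
            return session.get(url)
        except requests.exceptions.ConnectionError:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            sleep(wait)  # pause before trying again
```

You would then call `get_with_recovery(session, <url>)` inside your `while True` loop in place of the bare `session.get(<url>)`.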
Upvotes: 1