avloss

Reputation: 2646

Python tips for working with an unstable `API`

My app uses a third-party API. It polls that API regularly at several endpoints, and it also makes additional calls to the API based on the user's interaction with the app. The API is very slow: most requests take well over a second. It is also very flaky: timeouts are common, 500 errors are common, and the session key often randomly expires (even when the defined "keep_alive" endpoint is called regularly). There is no option to use another API.

What would be the best practices for dealing with such an API?

How can I disable concurrent requests to this API at the `requests` level, so that if one request is waiting for a response, a second request is not initiated? This should work on a per-domain basis; requests to other domains should still be made concurrently.

Are there any other `requests` settings worth toggling to make such an API easier to deal with?

Upvotes: 3

Views: 773

Answers (2)

dm03514

Reputation: 55972

What would be the best practices for dealing with such an API?

In SRE we pretty much always assume that APIs can never be trusted. Because of this, there are a number of patterns that may help:
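As a minimal sketch of two such patterns, the snippet below combines explicit timeouts with automatic retries and exponential backoff, using `requests` together with `urllib3`'s `Retry`. The URL and the retry settings are illustrative assumptions, not values from this answer:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient failures (connection errors, 5xx responses) with exponential backoff.
retry = Retry(
    total=5,                                # give up after 5 attempts
    backoff_factor=1,                       # exponential backoff between attempts
    status_forcelist=[500, 502, 503, 504],  # retry on common transient server errors
)

session = requests.Session()
adapter = HTTPAdapter(max_retries=retry)
session.mount("https://", adapter)
session.mount("http://", adapter)

# Always pass an explicit timeout; by default requests can wait indefinitely.
response = session.get("https://flaky.example.com/some/endpoint", timeout=(3.05, 10))
response.raise_for_status()
```

Session-key expiry and circuit breaking would still need handling in application code, for example re-authenticating when the API reports an expired session and backing off entirely after repeated failures.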


Upvotes: 5

Serge Ballesta

Reputation: 149115

If your main problem is to serialize calls to that API in a multi-threaded (or multi-process) application, a simple way would be to wrap it in a new module and consistently use locking in that module.
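A minimal sketch of such a wrapper, assuming a hypothetical module name `unstable_api.py` and base URL. Every caller imports this module instead of calling `requests` directly, so the module-level lock serializes all calls to this one API while requests to other domains are unaffected:

```python
# unstable_api.py -- hypothetical wrapper module; all calls to the flaky API go through it.
import threading
import requests

_BASE_URL = "https://flaky.example.com"  # hypothetical base URL of the unstable API
_lock = threading.Lock()                 # serializes every request made through this module
_session = requests.Session()

def get(endpoint, **kwargs):
    """GET an endpoint of the unstable API, one request at a time across all threads."""
    kwargs.setdefault("timeout", 10)     # never wait forever on a flaky upstream
    with _lock:                          # only one in-flight request to this domain
        response = _session.get(f"{_BASE_URL}/{endpoint.lstrip('/')}", **kwargs)
    response.raise_for_status()
    return response
```

Note that a `threading.Lock` only serializes threads within a single process; for a multi-process application the lock would have to be shared some other way, which is where the proxy idea below comes in.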

If different clients can use the web API concurrently and you need to serialize the requests for performance reasons, you could set up a dedicated serializing proxy and use the method above inside it.
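A sketch of what such a proxy could look like, here using Flask (a choice made for illustration, not prescribed by this answer) together with the hypothetical `unstable_api` wrapper above. Every client talks to the proxy, and the proxy's single lock ensures the upstream API only ever sees one request at a time:

```python
# proxy.py -- hypothetical serializing proxy in front of the unstable API.
from flask import Flask, Response
import unstable_api  # the locking wrapper module sketched above

app = Flask(__name__)

@app.route("/proxy/<path:endpoint>")
def proxy(endpoint):
    # All client requests funnel through the wrapper's lock, so the upstream API
    # sees at most one request at a time, no matter how many clients call us.
    upstream = unstable_api.get(endpoint)
    return Response(
        upstream.content,
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type", "application/octet-stream"),
    )

if __name__ == "__main__":
    app.run(port=8080)  # run single-process so the in-process lock covers all clients
```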

Upvotes: 2

Related Questions