Bech

Reputation: 21

Google Drive API requests fail if more than 10 requests are done within a second

I'm the developer of an application where user requests lead to calls against the Google Drive API. All the requests against the Google Drive API are done server-side using the Java API from an array of server replicas. We are currently running around five servers.

We now see that sometimes the requests from the users peak in such a way that more than 10 simultaneous requests are made against the Google Drive API. This makes some requests fail with the message:

User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=xxxxxxxxx

I have created a unit test where I simultaneously start 11 threads, each doing a get request for a Google Drive file. Consistently one of the requests fails while the other 10 are successful. Both for the unit test and the production code I set the quotaUser parameter to a unique value.
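For reference, a minimal sketch of such a test (assuming an already-authorized Drive client; FILE_ID and the quotaUser values are placeholders, not the real ones used in production):

    import com.google.api.services.drive.Drive;

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.CountDownLatch;

    public class DriveBurstTest {

        // Placeholder file ID; replace with a file the authorized account can read.
        private static final String FILE_ID = "xxxxxxxxx";

        // Fires `threads` simultaneous files.get requests against the Drive API.
        static void burst(Drive drive, int threads) throws InterruptedException {
            CountDownLatch start = new CountDownLatch(1);
            List<Thread> workers = new ArrayList<>();
            for (int i = 0; i < threads; i++) {
                final String quotaUser = "test-user-" + i; // unique quotaUser per request
                Thread t = new Thread(() -> {
                    try {
                        start.await(); // release all threads at the same instant
                        drive.files().get(FILE_ID)
                                .setQuotaUser(quotaUser)
                                .execute();
                        System.out.println(quotaUser + ": OK");
                    } catch (Exception e) {
                        System.out.println(quotaUser + ": FAILED - " + e.getMessage());
                    }
                });
                t.start();
                workers.add(t);
            }
            start.countDown();
            for (Thread t : workers) {
                t.join();
            }
        }
    }

With threads = 11, one request consistently fails with the rate-limit error quoted above while the other 10 succeed.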

The Google Analytics API has a quota for the maximum number of requests per second, as described here: https://developers.google.com/analytics/devguides/reporting/mcf/v3/limits-quotas

10 queries per second (QPS) per IP address

I have not been able to find documentation for a similar '10 requests/second' quota for the Google Drive API, so I did not expect it to exist.

Throttling functionality is really not an option for me, as I might in the future have more than 10 server replicas and therefore will be unable to prevent more than 10 simultaneous requests.

In general I'm nowhere near the configured project-wide quota request limits.

Why do I get this 10 requests/second quota limitation?

Upvotes: 0

Views: 1888

Answers (2)

pinoyyid

Reputation: 22306

Drive uses token bucket rate limiting. See https://cloud.google.com/solutions/rate-limiting-strategies-techniques

In my experience the initial bucket size is around 25 and the replenish rate is around 1 per second. Therefore, to get maximum throughput you need to adaptively throttle your requests based on the 403 errors you receive; aiming for 1 per second after the first 25 will be more or less optimal.
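A minimal client-side sketch of such a token bucket (the bucket size of 25 and refill rate of 1 per second are the estimates above, not documented limits):

    // Simple token bucket: starts full at `capacity`, refills `refillPerSecond` tokens per second.
    public class TokenBucket {

        private final int capacity;
        private final double refillPerSecond;
        private double tokens;
        private long lastRefillNanos;

        public TokenBucket(int capacity, double refillPerSecond) {
            this.capacity = capacity;
            this.refillPerSecond = refillPerSecond;
            this.tokens = capacity;
            this.lastRefillNanos = System.nanoTime();
        }

        // Blocks until a token is available, then consumes it.
        public synchronized void acquire() throws InterruptedException {
            while (true) {
                refill();
                if (tokens >= 1) {
                    tokens -= 1;
                    return;
                }
                // Sleep roughly until the next token is due (never 0, which would wait forever).
                long waitMillis = Math.max(1L,
                        (long) Math.ceil((1 - tokens) / refillPerSecond * 1000));
                wait(waitMillis);
            }
        }

        private void refill() {
            long now = System.nanoTime();
            double elapsedSeconds = (now - lastRefillNanos) / 1_000_000_000.0;
            tokens = Math.min(capacity, tokens + elapsedSeconds * refillPerSecond);
            lastRefillNanos = now;
        }
    }

Usage would be a shared TokenBucket(25, 1.0) per credential, calling acquire() before each Drive request, and still falling back to backoff on any 403 that slips through.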

If your problem is not throughput but concurrency, then it might be worth creating multiple credentials for each server and seeing if that bypasses the concurrency problem. NB I haven't tested this, so please share the outcome if you try it. The next level of sharding would be to create multiple applications, each of which shares access to your Drive corpus.

An alternative approach would be to change your server architecture (yes - I know that sucks) and proxy Drive API calls through 2-10 proxy servers in order to ensure the concurrency stays below 10.
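If the goal is purely to keep concurrency under a ceiling, the same effect can be sketched inside each proxy (or each existing server) with a shared semaphore; the ceiling of 10 in-flight requests, split across replicas, is an assumption based on the observed behaviour rather than a documented limit:

    import java.util.concurrent.Callable;
    import java.util.concurrent.Semaphore;

    // Caps the number of in-flight Drive calls made by this process.
    public class DriveConcurrencyGate {

        // e.g. 2 permits per process when running 5 replicas, for an assumed global ceiling of 10.
        private final Semaphore permits;

        public DriveConcurrencyGate(int maxConcurrent) {
            this.permits = new Semaphore(maxConcurrent);
        }

        public <T> T call(Callable<T> driveCall) throws Exception {
            permits.acquire();
            try {
                return driveCall.call();
            } finally {
                permits.release();
            }
        }
    }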

Upvotes: 0

Linda Lawton - DaImTo

Reputation: 117244

User rate limits are standard across ALL Google APIs. They are also subject to change without warning or notification. Just because it's not documented under Google Drive does not mean it doesn't exist. If you're getting the error, then it does in fact exist.

The 403 userRateLimitExceeded error is a very generic error message that means you are flooding the server and need to slow down. The Google APIs were not meant to be multi-threaded by the same project in this manner.

You should consider looking into some of the recommended ways of dealing with this error message.

To fix this error:

  • Raise the per-user quota in the Developer Console project. For more information, see Request additional quota.
  • If one user is making a lot of requests on behalf of many users of a Google Workspace domain, consider a Service Account with authority delegation (setting the quotaUser parameter).
  • Use exponential backoff to retry the request (see the sketch after this list).
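A minimal sketch of the exponential-backoff item, using the Google Java client library's ExponentialBackOff helper and retrying only on rate-limit 403s; the interval values are illustrative, not recommendations:

    import com.google.api.client.googleapis.json.GoogleJsonResponseException;
    import com.google.api.client.util.BackOff;
    import com.google.api.client.util.ExponentialBackOff;
    import com.google.api.services.drive.Drive;
    import com.google.api.services.drive.model.File;

    import java.io.IOException;

    public class DriveRetry {

        // Fetches file metadata, retrying with exponential backoff on rate-limit 403s.
        static File getWithBackoff(Drive drive, String fileId, String quotaUser)
                throws IOException, InterruptedException {
            BackOff backOff = new ExponentialBackOff.Builder()
                    .setInitialIntervalMillis(500)
                    .setMaxElapsedTimeMillis(60_000)
                    .build();
            while (true) {
                try {
                    return drive.files().get(fileId).setQuotaUser(quotaUser).execute();
                } catch (GoogleJsonResponseException e) {
                    boolean rateLimited = e.getStatusCode() == 403
                            && e.getDetails() != null
                            && e.getDetails().getErrors() != null
                            && e.getDetails().getErrors().stream().anyMatch(err ->
                                    "userRateLimitExceeded".equals(err.getReason())
                                            || "rateLimitExceeded".equals(err.getReason()));
                    long pauseMillis = backOff.nextBackOffMillis();
                    if (!rateLimited || pauseMillis == BackOff.STOP) {
                        throw e; // not a rate-limit error, or retries exhausted
                    }
                    Thread.sleep(pauseMillis);
                }
            }
        }
    }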

I have an article I wrote a while back which explains how I attempted to work with the flood protection: Flood buster. It may help.

Either way, adding a quotaUser and running fewer threads should help.

Upvotes: 1
