Tobias

Reputation: 301

Why do I keep getting "Quota exceeded for quota group 'AnalyticsDefaultGroup' and limit 'USER-100s'" errors?

I am currently managing two Google Analytics management accounts with many clients and view_ids on each one. The task is to request client data via the Google Analytics Reporting API (v4) and store it in a SQL backend on a daily basis via an Airflow DAG structure.

For the first account, everything works fine. Just recently, I added the second account to the data-request routine. The problem is that even though both accounts are set to the same "USER-100s" quota limit, I keep getting this error for the newly added account:

googleapiclient.errors.HttpError: <HttpError 429 when requesting https://analyticsreporting.googleapis.com/v4/reports:batchGet?alt=json returned "Quota exceeded for quota group 'AnalyticsDefaultGroup' and limit 'USER-100s' of service 'analyticsreporting.googleapis.com' for consumer 'project_number:XXXXXXXXXXXX'.">

I already raised the "USER-100s" quota limit from 100 to the maximum of 1,000, as recommended in the official Google guidelines (https://developers.google.com/analytics/devguides/config/mgmt/v3/limits-quotas).

I also checked the number of requests for my project number in the Google API Console, and I have never exceeded 1,000 requests per 100 seconds so far (see request history account 2), while the first account always works (see request history account 1). Still, the above error appears.

I could also rule out the possibility that the second account's clients simply have more data.

[image: request history account 1]

[image: request history account 2]

I am now down to a try-except loop that keeps requesting until the data is eventually queried successfully, like:

from googleapiclient.errors import HttpError

success = False
data = None
while not success:
    try:
        data = query_data()  # trying to receive data from the API
        if data:
            success = True
    except HttpError as e:
        print(e)  # log the quota error and retry immediately

This is not elegant at all and is hard to maintain (e.g., in integration tests). In addition, it is very time- and resource-intensive, because the loop might sometimes run indefinitely. It can only be a short-term workaround.
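A slightly more robust variant of the loop above would cap the number of attempts and back off exponentially between them, so a persistent quota error fails loudly instead of spinning forever. This is a minimal sketch; `request_fn` is a hypothetical stand-in for whatever callable performs the API request (e.g., a wrapper around `query_data()`), and in real use you would catch `googleapiclient.errors.HttpError` rather than the broad `Exception` used here for illustration:

```python
import time

def fetch_with_backoff(request_fn, max_attempts=5, base_delay=1.0):
    # request_fn: any zero-argument callable performing the API request.
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except Exception:  # in practice: googleapiclient.errors.HttpError
            if attempt == max_attempts - 1:
                raise  # give up after max_attempts instead of looping forever
            # wait base_delay, 2*base_delay, 4*base_delay, ... before retrying
            time.sleep(base_delay * (2 ** attempt))
```

Capping attempts keeps a misconfigured quota from turning the DAG task into an endless loop; Airflow's own retry mechanism can then take over if all attempts fail.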

This is especially frustrating because the same implementation works with the first account, which makes more requests, but fails with the second account.

If you know any solution to this, I would be very happy to know.

Cheers Tobi

Upvotes: 4

Views: 1404

Answers (1)

Eduardo Veras

Reputation: 179

I know this question has been here for a while, but let me try to help you. :)

There are 3 standard request limits:

  • 50k per day per project
  • 2k per 100 seconds per project
  • 100 per 100 seconds per user

As you showed in your image (https://i.sstatic.net/Tp76P.png).

The quota group "AnalyticsDefaultGroup" refers to your API project, and the user quota is included in this limit.

Per your description, you are hitting the user quota, which usually happens when you don't provide `userIp` or `quotaUser` in your requests.

So there are two main points you have to handle to prevent those errors:

  • Include `quotaUser` with a unique string in every request;
  • Keep to 1 request per second.

By your code, I presume that you are using the default Google API Client for Python (https://github.com/googleapis/google-api-python-client), which doesn't have a global way to define the `quotaUser`.

To include the quotaUser

analytics.reports().batchGet(
    body={
        'reportRequests': [{
            'viewId': 'your_view_id',  # replace with your view ID
            'dateRanges': [{'startDate': '2020-01-01', 'endDate': 'today'}],
            'pageSize': '1000',
            'pageToken': pageToken,  # pagination token from the previous response
            'metrics': [],
            'dimensions': []
        }]
    },
    quotaUser='my-user-1'  # any unique string identifying the end user
).execute()

That makes the Google API register your request under that user, consuming 1 of the 100-per-100-seconds user limit, rather than counting every request against a single default user for your whole project.

Limit 1 request per second

If you plan to make a lot of requests, I suggest adding a delay between requests using:

time.sleep(1)

right after each request to the API. That way you stay under 100 requests per 100 seconds for each `quotaUser`.
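Putting the two points together, the pacing can be wrapped in a small helper. This is a sketch under assumptions: `analytics` is the Reporting API service object built with the Google API client, `bodies` is a list of `batchGet` request bodies like the one above, and the `pause` parameter is illustrative (default 1 second):

```python
import time

def paced_batch_get(analytics, bodies, quota_user, pause=1.0):
    # Issue one batchGet per `pause` seconds so a single quotaUser
    # stays under the 100-requests-per-100-seconds user limit.
    results = []
    for body in bodies:
        results.append(
            analytics.reports().batchGet(body=body, quotaUser=quota_user).execute()
        )
        time.sleep(pause)  # 1 request/second => at most 100 per 100 seconds
    return results
```

If several view_ids are fetched in the same DAG run, giving each client its own `quota_user` string spreads the load across separate user quotas instead of one shared bucket.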

I hope I helped. :)

Upvotes: 3
