Jinglei

Reputation: 11

Quota exceeded error in Google Cloud Function provided by Cloudflare

We used the Google Cloud Function provided by Cloudflare to import data from Google Cloud Storage into Google BigQuery (refer to: https://developers.cloudflare.com/logs/analytics-integrations/google-cloud/). The Cloud Function was failing with the error: "Quota exceeded: Your table exceeded quota for imports or query appends per table".

I queried the INFORMATION_SCHEMA.JOBS_BY_PROJECT table and found that error_result.location is 'load_job_per_table.long'. The job id is '26bb1792-1ca4-42c6-b61f-54abca74a2ee'.
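For reference, a query along these lines is how I pulled the error details (the `region-us` qualifier is an assumption; it must match the region where the jobs run):

```sql
-- Sketch: pull the error details for the failing load job.
-- `region-us` is a placeholder; use the region your jobs actually run in.
SELECT
  job_id,
  job_type,
  error_result.reason,
  error_result.location,
  error_result.message
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_id = '26bb1792-1ca4-42c6-b61f-54abca74a2ee';
```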

I looked at the Quotas page for the BigQuery API service, but none of the quota statuses showed as exceeded. Some are blank, though.

Could anyone help me figure out which Google Cloud quota or limit was exceeded, and how to increase that quota? The same Cloudflare function is used by another Google account and works well there without any error.

Thanks, Jinglei

Upvotes: 1

Views: 992

Answers (3)

Matt Cobb

Reputation: 405

Cloudflare says they have a fix coming for the quota issue: https://github.com/cloudflare/cloudflare-gcp/issues/72

Upvotes: 0

Vibhor Gupta

Reputation: 699

To identify the quota, start from the actions your function performs: insert, update, data volume, external IP, and so on. Then analyze the frequency or metric values of those actions and compare them against the Google-defined quotas; that will give you an indicator of which quota is being exceeded. You can refer to the following video on the same topic.
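As a sketch of that comparison for the load-job case in the question, a query like the following counts load jobs per destination table per day, which can then be set against BigQuery's documented per-table daily load-job limit (the `region-us` qualifier and the 7-day window are assumptions; adjust to your own region and period):

```sql
-- Sketch: count load jobs per destination table per day over the last week,
-- to compare against BigQuery's per-table daily load-job quota.
-- `region-us` is an assumption; use the region your jobs run in.
SELECT
  destination_table.dataset_id,
  destination_table.table_id,
  DATE(creation_time) AS day,
  COUNT(*) AS load_jobs
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'LOAD'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY 1, 2, 3
ORDER BY load_jobs DESC;
```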

Upvotes: 0

edwardmoradian

Reputation: 88

Try looking for the specific quota error among the log entries in Cloud Logging. I had a similar issue with the BigQuery Data Transfer quota being reached. Here is my example Cloud Logging filter:

resource.type="cloud_function"
severity=ERROR
timestamp>="2021-01-14T00:00:00-08:00"
textPayload:"429 Quota"

Change the timestamp accordingly, and maybe remove the textPayload filter. You can also just click through the interface to filter for severe errors and search in the UI.

Here is another example:

severity=ERROR
timestamp>="2021-01-16T00:00:00-08:00"
NOT protoPayload.status.message:"Already Exists: Dataset ga360-bigquery-azuredatalake"
NOT protoPayload.status.message:"Syntax error"
NOT protoPayload.status.message:"Not found"
NOT protoPayload.status.message:"Table name"
NOT textPayload:"project_transfer_config_path"
NOT protoPayload.methodName : "InsertDataset"
textPayload:"429 Quota"

Upvotes: 0
