Reputation: 1
I'm trying to build a scalable ETL process that writes to different BigQuery tables. The process is triggered by Pub/Sub, so n Cloud Functions are instantiated (by default, all of them in us-central1) to achieve this. I'm using the BigQuery Storage Write API (Python client) to stream the data into BigQuery.
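For context, a stripped-down sketch of the write path each function runs, assuming one application-created stream per invocation (the table identifiers and the protoc-generated sample_data_pb2 module are placeholders, not my actual schema):

```python
from google.cloud import bigquery_storage_v1
from google.cloud.bigquery_storage_v1 import types, writer
from google.protobuf import descriptor_pb2

import sample_data_pb2  # placeholder: protoc-generated module matching the table schema


def write_rows(project_id, dataset_id, table_id, rows):
    client = bigquery_storage_v1.BigQueryWriteClient()
    parent = client.table_path(project_id, dataset_id, table_id)

    # Each invocation opens its own stream, i.e. one CreateWriteStream call
    # per function instance.
    write_stream = types.WriteStream()
    write_stream.type_ = types.WriteStream.Type.COMMITTED
    write_stream = client.create_write_stream(parent=parent, write_stream=write_stream)

    # The request template carries the stream name and writer schema for all appends.
    request_template = types.AppendRowsRequest()
    request_template.write_stream = write_stream.name

    proto_schema = types.ProtoSchema()
    proto_descriptor = descriptor_pb2.DescriptorProto()
    sample_data_pb2.SampleData.DESCRIPTOR.CopyToProto(proto_descriptor)
    proto_schema.proto_descriptor = proto_descriptor
    proto_data = types.AppendRowsRequest.ProtoData()
    proto_data.writer_schema = proto_schema
    request_template.proto_rows = proto_data

    append_rows_stream = writer.AppendRowsStream(client, request_template)

    # Serialize and append the rows handled by this invocation.
    proto_rows = types.ProtoRows()
    for row in rows:
        proto_rows.serialized_rows.append(row.SerializeToString())

    request = types.AppendRowsRequest()
    proto_data = types.AppendRowsRequest.ProtoData()
    proto_data.rows = proto_rows
    request.proto_rows = proto_data

    append_rows_stream.send(request).result()
    append_rows_stream.close()
```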
Usually, when more than 100 Cloud Functions are instantiated, I get the following (abbreviated) error message:
429 Exceeds quota limit subject: ?metric=bigquerystorage.googleapis.com/write/create_stream_requests&limit=CreateWriteStreamRequestsPerMinutePerProject&qs_error_code=INSUFFICIENT_TOKENS description: Quota exceeded for quota metric 'CreateWriteStream requests' and limit 'CreateWriteStream requests per minute' of service 'bigquerystorage.googleapis.com'...
Looking at the quota limits in the docs (linked below), I don't understand why I'm getting this error. Shouldn't the limit be 10,000 concurrent connections?
https://cloud.google.com/bigquery/quotas#write-api-limits
Your project can operate on 10,000 concurrent connections in the us and eu multi-regions, and 100 in other regions.
Or, by region, do they mean that the limit is lower than 10,000 when the job runs in an individual region inside the 'us' multi-region, such as us-central1, us-central2, and so on?
CreateWriteStream limits show 10k, but a quota error is reported with 100+ concurrent connections
Upvotes: 0
Views: 1371
Reputation: 132
There are some quotas that might be at play here:
- Concurrent connections: 10,000 in the us and eu multi-regions, 100 in other regions (the limit you quoted).
- CreateWriteStream requests per minute per project: a separate quota, and the one actually named in your error (CreateWriteStreamRequestsPerMinutePerProject). It is the rate at which streams are created, not the number of open connections, that is being exceeded.
Note that these might change in the future.
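Because the error concerns the rate of CreateWriteStream calls rather than concurrent connections, one common way around it (if at-least-once semantics are acceptable for your pipeline) is to append to the table's _default stream, which needs no CreateWriteStream call at all. A rough sketch of the change, assuming the same v1 Python client and placeholder table identifiers:

```python
from google.cloud import bigquery_storage_v1
from google.cloud.bigquery_storage_v1 import types

client = bigquery_storage_v1.BigQueryWriteClient()

# Instead of calling client.create_write_stream(...) per invocation, point the
# AppendRowsRequest template at the table's default stream. No CreateWriteStream
# call is made, so CreateWriteStreamRequestsPerMinutePerProject is never consumed.
parent = client.table_path("my-project", "my_dataset", "my_table")  # placeholders
request_template = types.AppendRowsRequest()
request_template.write_stream = f"{parent}/_default"
# The rest of the writer.AppendRowsStream / proto schema setup stays the same.
```

Alternatively, reusing a single application-created stream across many appends, rather than creating one stream per function invocation, also keeps the CreateWriteStream rate down.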
Upvotes: 0