Reputation: 977
What is the concurrent requests limit for Google BigQuery streaming insert?
I found that there is a concurrent request limit for the BigQuery API, but it does not seem to apply to streaming inserts:
Concurrent API requests, per user: 300
If you make more than 300 concurrent requests per user, throttling might occur. This limit does not apply to streaming inserts.
Upvotes: 1
Views: 6210
Reputation: 173151
The limits below are available at Streaming Inserts Limits:
The following limits apply for streaming data into BigQuery.
- Maximum row size: 1 MB. Exceeding this value will cause invalid errors.
- HTTP request size limit: 10 MB. Exceeding this value will cause invalid errors.
- Maximum rows per second: 100,000 rows per second, per project. Exceeding this amount will cause quotaExceeded errors. The maximum number of rows per second per table is also 100,000. You can use all of this quota on one table or you can divide this quota among several tables in a project.
- Maximum rows per request: 10,000 rows per request. We recommend a maximum of 500 rows. Batching can increase performance and throughput to a point, but at the cost of per-request latency. Too few rows per request and the overhead of each request can make ingestion inefficient. Too many rows per request and the throughput may drop (see the sketch after this list).
- Maximum bytes per second: 100 MB per second, per table. Exceeding this amount will cause quotaExceeded errors.
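To illustrate the 500-rows-per-request recommendation, here is a minimal sketch using the google-cloud-bigquery Python client. The table ID, the sample rows, and the stream_rows helper are placeholders for illustration; 500 is the recommended batch size from the quoted docs, not a hard limit.

```python
from google.cloud import bigquery

# Hypothetical destination table; replace with your own.
TABLE_ID = "my-project.my_dataset.my_table"
BATCH_SIZE = 500  # recommended rows per request (the hard limit is 10,000)


def stream_rows(client: bigquery.Client, rows: list[dict]) -> None:
    """Stream `rows` into TABLE_ID in batches of BATCH_SIZE."""
    for start in range(0, len(rows), BATCH_SIZE):
        batch = rows[start:start + BATCH_SIZE]
        # insert_rows_json issues one streaming-insert HTTP request per call;
        # each request must also stay under the 10 MB request-size limit.
        errors = client.insert_rows_json(TABLE_ID, batch)
        if errors:
            # Each entry describes the per-row insert errors for this batch.
            raise RuntimeError(f"Streaming insert failed: {errors}")


client = bigquery.Client()
stream_rows(client, [{"name": "a", "value": 1}, {"name": "b", "value": 2}])
```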
I think that, collectively, the limits above impose an effective concurrency limit.
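As a rough back-of-the-envelope example: at the recommended 500 rows per request, the 100,000 rows-per-second per-table quota works out to about 200 requests per second against a single table (100,000 / 500 = 200), no matter how many clients issue those requests concurrently.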
Upvotes: 3