We're kicking off BigQuery queries that pull data from many tables and write to a single table. We're using Python to run the jobs with these parameters:
job = bq_client.run_async_query(jobname, select_query)  # pre-1.0 google-cloud-bigquery client
job.destination = bq_table                               # every job appends to the same table
job.write_disposition = 'WRITE_APPEND'
500 jobs are kicked off, 20 at a time. Unfortunately, we periodically run into the error:
Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors
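For context, the submission pattern looks roughly like this. A minimal sketch using the same pre-1.0 client API as the snippet above; the helper name, uuid-based job naming, and the polling loop are illustrative, not our exact code:
import time
import uuid

def submit_append_job(bq_client, select_query, bq_table):
    # Illustrative helper wrapping the snippet above; job names must be unique.
    job = bq_client.run_async_query('ga-job-%s' % uuid.uuid4(), select_query)
    job.destination = bq_table            # every job appends to the same table
    job.write_disposition = 'WRITE_APPEND'
    job.begin()                           # start the job server-side
    return job

def run_in_batches(bq_client, queries, bq_table, batch_size=20):
    for start in range(0, len(queries), batch_size):
        batch = [submit_append_job(bq_client, q, bq_table)
                 for q in queries[start:start + batch_size]]
        for job in batch:                 # block until the whole batch finishes
            while job.state != 'DONE':
                time.sleep(5)
                job.reload()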
Question: is there a way to consolidate or restructure these queries to avoid this rate limit?
Edit:
The queries are against Google Analytics 360 data. For example:
SELECT
...
FROM [{datasetname}.ga_sessions_{date}];
Each job can hit a different dataset, and the {date} placeholder runs across many dates. I can't share the calculations against the columns, but perhaps this is enough to suggest a way to consolidate these queries.
Assuming the SELECT list is the same for all queries, you can consolidate them into a single query.
For BigQuery Standard SQL:
#standardSQL
SELECT ... FROM `{datasetname1}.ga_sessions_{date}` UNION ALL
SELECT ... FROM `{datasetname2}.ga_sessions_{date}` UNION ALL
...
SELECT ... FROM `{datasetnameN}.ga_sessions_{date}`
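If the list of datasets is known up front, the consolidated query can be generated in Python. A minimal sketch, where select_list, the dataset names, and the sample columns are illustrative placeholders for your own values:
def build_union_query(select_list, datasets, date):
    # One SELECT per dataset, glued together with UNION ALL;
    # the #standardSQL prefix selects the Standard SQL dialect.
    parts = ['SELECT {cols} FROM `{ds}.ga_sessions_{date}`'.format(
                 cols=select_list, ds=ds, date=date)
             for ds in datasets]
    return '#standardSQL\n' + '\nUNION ALL\n'.join(parts)

# One consolidated job per date instead of one job per dataset:
select_query = build_union_query('fullVisitorId, totals.visits',
                                 ['dataset1', 'dataset2'], '20170601')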
For BigQuery Legacy SQL:
#legacySQL
SELECT ... FROM
(SELECT ... FROM [{datasetname1}.ga_sessions_{date}]),
(SELECT ... FROM [{datasetname2}.ga_sessions_{date}]),
...
(SELECT ... FROM [{datasetnameN}.ga_sessions_{date}])
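Either way, consolidating N per-dataset queries into one job means a single WRITE_APPEND to the destination table instead of N separate appends, which keeps you well under the per-table update rate limit that triggers the error above.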