Reputation: 110502
Using jobs.query has a 10 MB response-size limit, so I have to paginate and make roughly 10 requests to retrieve 100 MB of data from BigQuery.
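For context, the equivalent with the google-cloud-bigquery Python client would look roughly like this (the table name is a placeholder):

    # Rough sketch of the paginated jobs.query / jobs.getQueryResults flow
    # using the google-cloud-bigquery Python client (table name is a placeholder).
    from google.cloud import bigquery

    client = bigquery.Client()
    query_job = client.query("SELECT * FROM `my_project.my_dataset.my_table`")

    rows = []
    # result() pages through jobs.getQueryResults; each response is capped at
    # roughly 10 MB, so a ~100 MB result means on the order of ten HTTP requests.
    for row in query_job.result(page_size=50000):
        rows.append(dict(row))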
I am looking for the fastest way to run a query and download the result (~100 MB). What is the suggested way to do this?
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query
Additionally, is there a way to retrieve the query result in a compressed format, if that would speed things up at all?
Upvotes: 0
Views: 640
Reputation: 59325
Either:
Run an export job with gzip compression, then load the file from GCS (sketched just below this list).
Use the new BigQuery Storage API.
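A minimal sketch of the export route with the Python clients (project, dataset, table, and bucket names are placeholders, not from the question):

    # Sketch: materialize the query result, export it gzip-compressed to GCS,
    # then download the exported file(s). All resource names are placeholders.
    from google.cloud import bigquery, storage

    bq = bigquery.Client()
    dest = bigquery.TableReference.from_string("my_project.my_dataset.query_result")

    # 1. Write the query result into an explicit destination table.
    bq.query(
        "SELECT * FROM `my_project.my_dataset.my_table`",
        job_config=bigquery.QueryJobConfig(destination=dest),
    ).result()

    # 2. Run an extract (export) job with gzip compression.
    extract_config = bigquery.ExtractJobConfig(
        compression=bigquery.Compression.GZIP,
        destination_format=bigquery.DestinationFormat.CSV,
    )
    bq.extract_table(
        dest, "gs://my_bucket/result-*.csv.gz", job_config=extract_config
    ).result()

    # 3. Download the compressed file(s) from GCS.
    gcs = storage.Client()
    for blob in gcs.list_blobs("my_bucket", prefix="result-"):
        blob.download_to_filename(blob.name)

Downloading gzip-compressed files also answers the compression part of the question: the bytes that cross the network are the compressed ones.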
I created a feature request, which you can upvote, for a CLI tool that downloads results with the Storage API:
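In the meantime, a minimal sketch of pulling results over the Storage API from Python (assumes the optional google-cloud-bigquery-storage and pandas packages are installed; the query is a placeholder):

    # Sketch: let the Python client stream results over the BigQuery Storage API
    # (requires the optional google-cloud-bigquery-storage dependency).
    from google.cloud import bigquery

    client = bigquery.Client()
    query_job = client.query("SELECT * FROM `my_project.my_dataset.my_table`")

    # With create_bqstorage_client=True, to_dataframe() reads the result through
    # the Storage API instead of the paginated getQueryResults path.
    df = query_job.to_dataframe(create_bqstorage_client=True)
    print(df.shape)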
Upvotes: 2