Reputation: 3448
I am trying to export a huge table (2,000,000,000 rows, roughly 600 GB) from BigQuery into a Google Cloud Storage bucket as a single file. All tools suggested in Google's documentation are limited in export size and will create multiple files. Is there a Pythonic way to do it without needing to hold the entire table in memory?
Upvotes: 0
Views: 359
Reputation: 208042
While there are perhaps other ways to script it, the recommended solution is to merge the files using the Google Cloud Storage compose action.
What you have to do is:
1. Export the table from BigQuery into multiple sharded files in the bucket (BigQuery cannot export more than 1 GB of table data to a single file, so a table this size must be sharded via a wildcard URI).
2. Use the compose action to merge those shards into a single object, as shown in the sketch below.
All this can be combined in a Cloud Workflow; there is a tutorial here.
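For the Pythonic part of the question, here is a minimal sketch of both steps, assuming the google-cloud-bigquery and google-cloud-storage client libraries are installed; PROJECT, BUCKET, PREFIX, and TABLE are hypothetical placeholders you would replace with your own values. Each compose call accepts at most 32 source objects, so the shards are merged in rounds.

```python
# A sketch, not a drop-in solution: PROJECT, BUCKET, PREFIX, and TABLE
# are hypothetical placeholders.
from google.cloud import bigquery, storage

PROJECT = "my-project"                    # hypothetical project id
BUCKET = "my-bucket"                      # hypothetical bucket name
PREFIX = "export/part-"                   # hypothetical shard prefix
TABLE = "my-project.my_dataset.my_table"  # hypothetical table id

# Step 1: export to sharded files. The wildcard URI lets BigQuery split
# the export; print_header=False avoids a repeated CSV header per shard.
bq = bigquery.Client(project=PROJECT)
extract_job = bq.extract_table(
    TABLE,
    f"gs://{BUCKET}/{PREFIX}*.csv",
    job_config=bigquery.ExtractJobConfig(
        destination_format="CSV", print_header=False
    ),
)
extract_job.result()  # block until the export finishes

# Step 2: merge the shards server-side with compose. Each compose call
# accepts at most 32 sources, so merge in rounds until one object remains.
gcs = storage.Client(project=PROJECT)
bucket = gcs.bucket(BUCKET)
shards = sorted(bucket.list_blobs(prefix=PREFIX), key=lambda b: b.name)

round_no = 0
while len(shards) > 1:
    merged = []
    for i in range(0, len(shards), 32):
        target = bucket.blob(f"compose/round-{round_no}/part-{i // 32:05d}")
        target.compose(shards[i : i + 32])  # concatenated in GCS, no download
        merged.append(target)
    for blob in shards:  # intermediate shards are no longer needed
        blob.delete()
    shards, round_no = merged, round_no + 1

bucket.rename_blob(shards[0], "export/full_table.csv")
```

Because compose runs entirely inside Cloud Storage, nothing is downloaded or held in memory, which addresses the original constraint; the Workflow tutorial essentially automates this same loop.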
Upvotes: 1