user11366694

Reputation: 141

How to extract Bigquery table for every given length of rows to csv file in Google Storage?

For example, I have a table in BigQuery with 10 million rows, and I want to extract it to Google Storage in chunks of 100 thousand rows. To be clear, I want 100 CSV files, each containing 100k distinct rows from the BigQuery table.

bq extract --noprint_header dataset.abigtable gs://bucket/output/*.csv

When I run the command above in Cloud Shell, the table gets split into 10 or so files in Google Storage. However, I have no control over how many rows end up in each file. How can I control that?

Upvotes: 0

Views: 1303

Answers (1)

Christopher

Reputation: 941

There is no flag you can use to make your use case possible. If you think this feature would be helpful, you can file it as a feature request, although a fair number of stars is needed for the request to be prioritized by the BigQuery team. To make it more compelling, include the business use case behind it (e.g. why do you need each exported CSV file to have exactly 100k rows?).
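
As a workaround (this is not a bq extract capability, just a client-side sketch; the project, dataset, table, and bucket names below are placeholders), you could page through the table with the BigQuery Python client and upload each 100k-row slice to GCS as its own CSV:

import csv
import io

from google.cloud import bigquery, storage

ROWS_PER_FILE = 100_000  # rows per exported CSV

bq_client = bigquery.Client()
gcs_client = storage.Client()
bucket = gcs_client.bucket("bucket")  # placeholder bucket name

table = bq_client.get_table("project.dataset.abigtable")  # placeholder table
field_names = [field.name for field in table.schema]

def flush(buf, idx):
    # Upload the buffered CSV chunk as a single object in GCS.
    blob = bucket.blob(f"output/part-{idx:05d}.csv")
    blob.upload_from_string(buf.getvalue(), content_type="text/csv")

buffer = io.StringIO()
writer = csv.writer(buffer)
count = 0
file_index = 0

for row in bq_client.list_rows(table, page_size=ROWS_PER_FILE):
    writer.writerow([row[name] for name in field_names])
    count += 1
    if count == ROWS_PER_FILE:
        flush(buffer, file_index)
        file_index += 1
        count = 0
        buffer = io.StringIO()
        writer = csv.writer(buffer)

if count:  # remaining rows that did not fill a full chunk
    flush(buffer, file_index)

Note that this streams the whole table through the client, so it will be much slower than bq extract for a 10-million-row table; it only makes sense if a fixed row count per file is a hard requirement.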

Upvotes: 1

Related Questions