codecian

Reputation: 92

BigQuery Table Data Export

I am trying to export data from a BigQuery table using the Python API. The table contains 1 to 4 million rows, so I set the maxResults parameter to its maximum, i.e. 100000, and then page through the results. But the problem is that I get only 2652 rows per page, so the number of pages is far too large. Can anyone explain why this happens, or suggest a way to deal with it? The format is JSON. Alternatively, can I export the data to CSV without using GCS?

I tried inserting a job with allowLargeResults=true, but the result remained the same.

Below is my query body:

queryData = {'query': query,
             'maxResults': 100000,
             'timeoutMs': '130000'}
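For reference, my paging loop looks roughly like this (a minimal sketch; `fetch_all_rows` and `get_page` are just helper names I use here, and the commented-out wiring assumes an authorized google-api-python-client `service` object and an already-inserted job):

```python
def fetch_all_rows(get_page):
    """Collect rows across pages by following pageToken until it is absent.

    get_page is any callable that takes an optional page token and returns a
    dict shaped like a jobs.getQueryResults response ('rows', 'pageToken').
    """
    rows = []
    token = None
    while True:
        page = get_page(token)
        rows.extend(page.get('rows', []))
        token = page.get('pageToken')
        if not token:          # no pageToken means this was the last page
            return rows

# With the real API it would be wired up roughly like this (assumptions:
# PROJECT_ID, job_id, and an authorized `service` built via the discovery API):
#
# def get_page(token):
#     return service.jobs().getQueryResults(
#         projectId=PROJECT_ID, jobId=job_id,
#         maxResults=100000, pageToken=token).execute()
#
# all_rows = fetch_all_rows(get_page)
```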

Thanks in advance.

Upvotes: 3

Views: 1730

Answers (2)

Timmy

Reputation: 251

You can try exporting data from the table without using GCS by using the bq command-line tool (https://cloud.google.com/bigquery/bq-command-line-tool), like this:

bq --format=prettyjson query --n=10000000 "SELECT * from publicdata:samples.shakespeare"

You can also use --format=json, depending on your needs.

Upvotes: 3

Mikhail Berlyant

Reputation: 172944

The actual page size is determined not by row count, but by the size of the rows in a given page. I think it is somewhere around 10MB.
You can also set maxResults to limit the rows per page, in addition to the size criterion above.
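As a rough sanity check of the numbers (a sketch; the ~10MB per-page cap is my estimate above, and the ~4KB average row size is purely a hypothetical figure for illustration):

```python
# If each page is capped at roughly 10 MB of serialized response data,
# and a row serializes to about 4 KB of JSON, then the API can only fit
# on the order of a couple of thousand rows per page, no matter how high
# maxResults is set.
PAGE_CAP_BYTES = 10 * 1024 * 1024   # assumed per-page response cap
row_bytes = 4 * 1024                # hypothetical average serialized row size
rows_per_page = min(100000, PAGE_CAP_BYTES // row_bytes)
print(rows_per_page)  # 2560 -- the same order of magnitude as the 2652 rows observed
```

That is why raising maxResults past a certain point has no effect: the byte cap binds first.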

Upvotes: 1
