Teejay

Reputation: 111

Downloading table data as CSV to local machine in BigQuery

import time

from googleapiclient import discovery
from oauth2client.client import GoogleCredentials


def downloadGbqToCsv(self, table_name, csv):
    credentials = GoogleCredentials.get_application_default()
    bigquery = discovery.build('bigquery', 'v2', credentials=credentials)

    start = time.time()
    job_id = 'job_%d' % start

    # jobs.insert expects the extract settings nested under
    # configuration.extract, and destinationUris must be a list.
    job_data = {
        'jobReference': {
            'projectId': self.project_id,
            'jobId': job_id,
        },
        'configuration': {
            'extract': {
                'sourceTable': {
                    'projectId': self.project_id,
                    'datasetId': self.dataset_id,
                    'tableId': table_name,
                },
                'destinationUris': ['path/to/download'],
                'destinationFormat': 'CSV',
                'compression': 'NONE',
            }
        }
    }

    # Create the job.
    result = bigquery.jobs().insert(
        projectId=self.project_id, body=job_data).execute()

    return result

How can I download the table data in CSV to my local machine?

Upvotes: 1

Views: 2856

Answers (2)

Icarus

Reputation: 1463

Executing a query and storing the result on your local machine is quite indirect in BigQuery, so I've packaged the code and hope it helps.

Package versions:
google-cloud (0.27.0)
google-cloud-bigquery (0.26.0)
google-cloud-storage (1.3.1)

https://github.com/IcarusSO/bigQueryExporter
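If you don't want the extra dependency, the same export-then-download flow can be sketched with the google-cloud clients directly. This is only a rough outline assuming a recent google-cloud-bigquery / google-cloud-storage (the method names changed after the 0.26.0 / 1.3.1 releases listed above); the project, dataset, table and bucket names below are placeholders:

from google.cloud import bigquery, storage

table_id = 'my-project.my_dataset.my_table'   # placeholder
bucket_name = 'my-export-bucket'              # placeholder
gcs_uri = 'gs://%s/export/my_table-*.csv' % bucket_name

# 1. Export the table to Cloud Storage as CSV.
bq_client = bigquery.Client()
extract_job = bq_client.extract_table(table_id, gcs_uri)
extract_job.result()  # wait for the export to finish

# 2. Download the exported shards to the local machine.
gcs_client = storage.Client()
bucket = gcs_client.bucket(bucket_name)
for blob in bucket.list_blobs(prefix='export/'):
    blob.download_to_filename(blob.name.replace('/', '_'))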

Upvotes: 0

Michael Sheldon

Reputation: 2057

Run an Extract Job to export your table to Google Cloud Storage in your desired format, then you can use any GCS tool to download the files to your local file system.

Here's an example extract job configuration: https://cloud.google.com/bigquery/exporting-data-from-bigquery#extractconfig
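With the discovery-based client in the question, the extract settings sit under configuration.extract in the jobs().insert() body, and the destination has to be a Cloud Storage URI rather than a local path. A rough sketch (the bucket name is a placeholder):

job_data = {
    'configuration': {
        'extract': {
            'sourceTable': {
                'projectId': self.project_id,
                'datasetId': self.dataset_id,
                'tableId': table_name,
            },
            # Extract jobs can only write to Google Cloud Storage
            # (placeholder bucket name below).
            'destinationUris': ['gs://your-bucket/%s-*.csv' % table_name],
            'destinationFormat': 'CSV',
            'compression': 'NONE',
        }
    }
}

bigquery.jobs().insert(projectId=self.project_id, body=job_data).execute()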

I use gsutil to copy files from GCS to my local drive.
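For example (bucket and local path are placeholders):

gsutil cp gs://your-bucket/your-table-*.csv /local/path/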

If you want to copy the table directly to your local drive, you will have to use tabledata.list to list the table row by row and convert it to your output format locally. The bq command head works this way, for example:

bq --format=csv head -n 100 project:dataset.table

But that only downloads the first 100 rows. The bq client doesn't support listing an entire table that way, as it's too slow. If your data sizes are small, however, it would work for you.
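If you'd rather do that programmatically with the same discovery-based client as in the question, a rough (untested) sketch that pages through tabledata.list and writes CSV locally might look like this; again, it's only reasonable for small tables:

import csv

def downloadSmallTableToCsv(self, table_name, csv_path):
    # Hypothetical helper; mirrors the structure of the question's method.
    credentials = GoogleCredentials.get_application_default()
    bigquery = discovery.build('bigquery', 'v2', credentials=credentials)

    # Fetch the schema so we can write a header row.
    table = bigquery.tables().get(
        projectId=self.project_id, datasetId=self.dataset_id,
        tableId=table_name).execute()
    field_names = [f['name'] for f in table['schema']['fields']]

    with open(csv_path, 'w') as out:
        writer = csv.writer(out)
        writer.writerow(field_names)

        page_token = None
        while True:
            # tabledata.list returns rows a page at a time.
            response = bigquery.tabledata().list(
                projectId=self.project_id, datasetId=self.dataset_id,
                tableId=table_name, pageToken=page_token,
                maxResults=10000).execute()
            for row in response.get('rows', []):
                writer.writerow([cell['v'] for cell in row['f']])
            page_token = response.get('pageToken')
            if not page_token:
                break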

Upvotes: 1
