Reputation: 17
Any recommendations on an ETL process that would be good for moving data from one BigQuery database to another? Thanks in advance!
Upvotes: 0
Views: 125
Reputation: 1916
Using the Python client library, for example, you can do something like the following (similar to the example in this official link):
from google.cloud import bigquery

client = bigquery.Client()

project = '<your-project-id>'
source_dataset = '<your-source-dataset>'
source_table = '<your-source-table>'
dest_dataset = '<your-dest-dataset>'
dest_table = '<your-dest-table>'

source_dataset_ref = client.dataset(source_dataset, project=project)
source_table_ref = source_dataset_ref.table(source_table)
dest_table_ref = client.dataset(dest_dataset).table(dest_table)

# Copy the table
job = client.copy_table(source_table_ref,
                        dest_table_ref,
                        location="US")
job.result()  # wait for the copy job to complete

After copying you can delete the original table:

# Delete the original table (only after the copy job has finished)
table_ref = client.dataset(source_dataset).table(source_table)
client.delete_table(table_ref)

Take into account the following (from the official docs here):

When copying a table, the datasets containing the source table and destination table must reside in the same location.
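Recent versions of the Python client also accept fully qualified table IDs as plain strings, which avoids the dataset()/table() reference plumbing. A minimal sketch, assuming hypothetical project and dataset names (verify that your installed client version accepts string IDs):

def table_id(project: str, dataset: str, table: str) -> str:
    """Build a fully qualified 'project.dataset.table' ID string."""
    return f"{project}.{dataset}.{table}"

def move_table(src: str, dst: str) -> None:
    """Copy src to dst, wait for the job, then delete the source."""
    from google.cloud import bigquery  # imported here so the helper above stays standalone
    client = bigquery.Client()
    job = client.copy_table(src, dst)  # string IDs work in recent client versions
    job.result()                       # block until the copy succeeds
    client.delete_table(src)           # only delete once the copy is done

# e.g. move_table(table_id("my-project", "src_dataset", "my_table"),
#                 table_id("my-project", "dst_dataset", "my_table"))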
Upvotes: 0
Reputation: 1099
You cannot move a dataset directly, but you can copy it and delete the source afterwards. To copy, you can use the bq cp command for each table in the dataset you want to move, with a script like this:
#!/bin/sh
export SOURCE_DATASET="<bq_project>:<bq_dataset>"
export DEST_PREFIX="<bq_target_project>:<bq_target_dataset>."
for f in $(bq ls -n 10000 "$SOURCE_DATASET" | grep TABLE | awk '{print $1}')
do
  bq --nosync cp "$SOURCE_DATASET.$f" "$DEST_PREFIX$f"
done
Upvotes: 1