Reputation: 475
I am trying to copy data from one Bigtable table to another Bigtable table, but I can't find any direct way to do it. There is an option to export a Bigtable table to Google Cloud Storage and then import it back into Bigtable from the Storage file, but that is a time-consuming process. Can someone suggest an alternative?
Upvotes: 2
Views: 3326
Reputation: 608
In case anyone needs this in the future, a slightly easier way to create a copy of a Bigtable table is to use Bigtable backups. You can create a backup of your table in the Cloud Console UI or via the gcloud
CLI and then restore it into a new table. See the Cloud documentation for more details: https://cloud.google.com/bigtable/docs/managing-backups
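With the gcloud CLI, the backup-and-restore flow looks roughly like this (all resource names here, such as my-instance, my-cluster, my-table, and my-backup, are placeholders; check the linked docs for the exact flags available in your gcloud version):

```shell
# 1. Create a backup of the source table, kept for 7 days.
gcloud bigtable backups create my-backup \
    --instance=my-instance \
    --cluster=my-cluster \
    --table=my-table \
    --retention-period=7d

# 2. Restore the backup into a new table. The destination can be a
#    different instance in the same project.
gcloud bigtable instances tables restore \
    --source=projects/my-project/instances/my-instance/clusters/my-cluster/backups/my-backup \
    --destination=my-table-copy \
    --destination-instance=my-instance
```

Note that the restore runs server-side, so nothing is downloaded through your machine, which is what makes this faster than the export/import route.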
Upvotes: 1
Reputation: 219
It seems that you indeed cannot directly copy a table between Bigtable instances. However, you can write a script that uses gcloud commands to automate the process of exporting your table to Cloud Storage and then importing it into the destination Bigtable instance.
You can find more information on how to write the gcloud commands for this process here:
1) Exporting to Cloud Storage: https://cloud.google.com/dataflow/docs/guides/templates/provided-batch#running-the-cloud-bigtable-to-cloud-storage-avro-file-template
2) Importing to BigTable: https://cloud.google.com/dataflow/docs/guides/templates/provided-batch#running-the-cloud-storage-avro-file-to-cloud-bigtable-template
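The two steps above can be sketched with the Google-provided Dataflow templates, roughly as follows (my-project, my-instance, my-dest-instance, my-table, and the bucket path are placeholder names; the parameter list comes from the template documentation linked above, so verify it against the current docs):

```shell
# 1. Export the source table to Avro files in Cloud Storage using the
#    Cloud Bigtable to Cloud Storage Avro template.
gcloud dataflow jobs run export-bigtable \
    --gcs-location=gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro \
    --region=us-central1 \
    --parameters=bigtableProjectId=my-project,bigtableInstanceId=my-instance,bigtableTableId=my-table,outputDirectory=gs://my-bucket/bigtable-export,filenamePrefix=my-table-

# 2. Import the Avro files into the destination table using the
#    Cloud Storage Avro to Cloud Bigtable template. The destination
#    table must already exist with matching column families.
gcloud dataflow jobs run import-bigtable \
    --gcs-location=gs://dataflow-templates/latest/GCS_Avro_to_Cloud_Bigtable \
    --region=us-central1 \
    --parameters=bigtableProjectId=my-project,bigtableInstanceId=my-dest-instance,bigtableTableId=my-table,inputFilePattern='gs://my-bucket/bigtable-export/my-table-*'
```

Both commands launch Dataflow jobs, so you will be billed for the workers; you can watch job progress in the Dataflow section of the Cloud Console.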
If you are interested in making this copy for the sake of having a backup, you might be interested in enabling Bigtable replication instead: https://cloud.google.com/bigtable/docs/replication-overview
Upvotes: 3