Mirodinho

Reputation: 1321

Automated BigTable backups

A Bigtable table can be backed up through GCP, and a backup can be retained for up to 30 days. (https://cloud.google.com/bigtable/docs/backups)

Is it possible to have a custom automatic backup policy?

i.e. trigger automatic backups every X days & keep up to 3 copies at a time.

Upvotes: 4

Views: 887

Answers (2)

john_mwood

Reputation: 56

Here is another thought on a solution:

Instead of using three GCP products, if you are already using k8s or GKE you can replace all of this functionality with a single k8s CronJob: put the Bigtable API calls in a container and deploy it on a schedule using the CronJob.

In my opinion, it is a simpler solution if you are already using Kubernetes.
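For illustration, a minimal sketch of the script that could run inside that container is below, using the google-cloud-bigtable admin client. The project, instance, cluster and table IDs are placeholders, and keeping the newest 3 copies is just one possible pruning policy:

    """Backup job for a k8s CronJob container (placeholder IDs)."""
    import datetime

    from google.cloud import bigtable_admin_v2

    PROJECT = "my-project"      # placeholder
    INSTANCE = "my-instance"    # placeholder
    CLUSTER = "my-cluster"      # placeholder
    TABLE = "my-table"          # placeholder
    KEEP = 3                    # number of backup copies to retain


    def main():
        client = bigtable_admin_v2.BigtableTableAdminClient()
        cluster_path = f"projects/{PROJECT}/instances/{INSTANCE}/clusters/{CLUSTER}"
        table_path = f"projects/{PROJECT}/instances/{INSTANCE}/tables/{TABLE}"
        now = datetime.datetime.now(datetime.timezone.utc)

        # Start a backup of the table (retention of up to 30 days is allowed).
        operation = client.create_backup(
            parent=cluster_path,
            backup_id=f"{TABLE}-{now:%Y%m%d-%H%M%S}",
            backup=bigtable_admin_v2.types.Backup(
                source_table=table_path,
                expire_time=now + datetime.timedelta(days=7),
            ),
        )
        operation.result()  # block until the backup finishes

        # Prune: keep only the newest KEEP backups of this table.
        backups = [
            b for b in client.list_backups(parent=cluster_path)
            if b.source_table == table_path
        ]
        backups.sort(key=lambda b: b.start_time, reverse=True)
        for old in backups[KEEP:]:
            client.delete_backup(name=old.name)


    if __name__ == "__main__":
        main()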

Upvotes: 2

Donnald Cucharo

Reputation: 4126

As mentioned in the comment, the link provides a solution that uses the following GCP products:

  • Cloud Scheduler: trigger tasks with a cron-based schedule

  • Cloud Pub/Sub: pass the message request from Cloud Scheduler to Cloud Functions

  • Cloud Functions: initiate an operation for creating a Cloud Bigtable backup

  • Cloud Logging and Monitoring (optional)

The full guide can also be found on GitHub.

This is a good solution because your specific requirement has to be implemented with the client libraries: Bigtable doesn't have an API setting that keeps only 3 copies at a time, so that pruning logic has to live in your own code.
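For illustration, the Cloud Function in that pipeline could look roughly like the sketch below, assuming Cloud Scheduler publishes a JSON message with the project, instance, cluster and table IDs to the Pub/Sub topic (these field names are placeholders, not necessarily the ones used in the linked guide):

    import base64
    import datetime
    import json

    from google.cloud import bigtable_admin_v2


    def create_bigtable_backup(event, context):
        """Pub/Sub-triggered Cloud Function that starts a Bigtable backup."""
        params = json.loads(base64.b64decode(event["data"]).decode())
        prefix = f"projects/{params['project']}/instances/{params['instance']}"

        client = bigtable_admin_v2.BigtableTableAdminClient()
        now = datetime.datetime.now(datetime.timezone.utc)

        client.create_backup(
            parent=f"{prefix}/clusters/{params['cluster']}",
            backup_id=f"{params['table']}-{now:%Y%m%d-%H%M%S}",
            backup=bigtable_admin_v2.types.Backup(
                source_table=f"{prefix}/tables/{params['table']}",
                expire_time=now + datetime.timedelta(days=7),
            ),
        )
        # Keeping only 3 copies would also be done here in code, e.g. by
        # listing the table's backups with list_backups() and deleting the
        # oldest ones with delete_backup().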

For simpler use cases, however, such as triggering automatic backups every X days, there is another option: call backups.create directly from a Cloud Scheduler job with an HTTP target, similar to what's done in this answer.
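As a rough sketch of that alternative, assuming the Python Cloud Scheduler client and placeholder names throughout, the job could be created like this. Note that the request body (backup ID and expire time) is static in this setup, which is a limitation of calling the REST method directly:

    import json

    from google.cloud import scheduler_v1

    PROJECT, LOCATION = "my-project", "us-east1"                        # placeholders
    INSTANCE, CLUSTER, TABLE = "my-instance", "my-cluster", "my-table"  # placeholders

    client = scheduler_v1.CloudSchedulerClient()
    parent = f"projects/{PROJECT}/locations/{LOCATION}"

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/bigtable-weekly-backup",   # placeholder job name
        schedule="0 3 * * 1",                           # every Monday at 03:00
        time_zone="Etc/UTC",
        http_target=scheduler_v1.HttpTarget(
            # REST method: projects.instances.clusters.backups.create
            uri=(
                "https://bigtableadmin.googleapis.com/v2/"
                f"projects/{PROJECT}/instances/{INSTANCE}/clusters/{CLUSTER}"
                "/backups?backupId=weekly-backup"       # fixed ID: a simplification
            ),
            http_method=scheduler_v1.HttpMethod.POST,
            body=json.dumps({
                "sourceTable": f"projects/{PROJECT}/instances/{INSTANCE}/tables/{TABLE}",
                "expireTime": "2021-01-08T00:00:00Z",   # placeholder timestamp
            }).encode(),
            # The service account needs the Bigtable Admin role.
            oauth_token=scheduler_v1.OAuthToken(
                service_account_email=f"scheduler@{PROJECT}.iam.gserviceaccount.com",
            ),
        ),
    )
    client.create_job(parent=parent, job=job)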

Upvotes: 4
