Reputation: 1347
My team has a requirement that we be able to retrieve a backup of our database (hosted on Google Cloud SQL) and restore that database to a locally hosted instance of MySQL.
I know that Google Cloud SQL has the ability to schedule backups, but these don't appear to be available to download anywhere.
I also know that we are able to "export" our database to Google Cloud Storage, but we'd like to be able to schedule the "export".
The end goal here is to execute the following steps in some sort of an admin script: export the database, download the export, and restore it into a locally hosted MySQL instance.
Any ideas?
Upvotes: 13
Views: 7046
Reputation: 563
The gcloud SDK now provides import/export commands:
gcloud sql export sql <DATABASE_INSTANCE> \
gs://<CLOUD_STORAGE_BUCKET>/cloudsql/export.sql.gz \
--database <DATABASE_NAME>
This export can be downloaded using gsutil and then loaded into a local MySQL instance (the export is a plain SQL dump, so the mysql client handles the import).
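A minimal end-to-end sketch of that flow, assuming placeholder names (my-instance, my-bucket, mydb) and a local MySQL server with credentials already configured:
# Export the database from Cloud SQL into a Cloud Storage bucket
gcloud sql export sql my-instance gs://my-bucket/cloudsql/export.sql.gz --database=mydb
# Download the compressed SQL dump
gsutil cp gs://my-bucket/cloudsql/export.sql.gz ./export.sql.gz
# Restore it into the locally hosted MySQL instance
gunzip -c export.sql.gz | mysql -u root -p mydb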
Upvotes: 14
Reputation: 201
If you want to download a backup (manual or automated), you can launch another Cloud SQL instance, restore the backup onto it, and then export that instance to Cloud Storage and download the export.
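A hedged sketch of that workaround, with placeholder instance, backup-ID, bucket, and database names:
# Find the ID of the backup you want on the source instance
gcloud sql backups list --instance=source-instance
# Restore that backup onto a scratch instance launched for this purpose
gcloud sql backups restore BACKUP_ID --restore-instance=scratch-instance --backup-instance=source-instance
# Export the scratch instance to Cloud Storage and download the result
gcloud sql export sql scratch-instance gs://my-bucket/backup.sql.gz --database=mydb
gsutil cp gs://my-bucket/backup.sql.gz ./backup.sql.gz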
Upvotes: 2
Reputation: 3715
That's the problem I've encountered, and my solution was the following steps (they combine into the single script sketched below):
1. Create a dedicated service account for backups (e.g. sql-backuper) and download an access key for it in JSON format.
2. On the backup host, activate that service account:
gcloud auth activate-service-account sql-backuper@project-name-123.iam.gserviceaccount.com --key-file /home/backuper/gcloud-service-account.json
(gcloud auth documentation)
3. Export the database to a Cloud Storage bucket:
gcloud sql instances export [sql-instance-name] gs://[bucket-name]/[file-name].gz --database [your-db-name]
(gcloud sql documentation)
4. Download the export locally:
gsutil cp gs://[bucket-name]/[file-name].gz [local-file-name].gz
(gsutil cp documentation)
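Combined, those steps could form a single cron-able script. This is a sketch only, with the key path, instance, bucket, database, and local paths as placeholders, and it assumes local MySQL credentials are already configured (e.g. in ~/.my.cnf) for the optional restore step:
#!/bin/bash
set -euo pipefail
# Authenticate as the dedicated backup service account
gcloud auth activate-service-account sql-backuper@project-name-123.iam.gserviceaccount.com \
  --key-file /home/backuper/gcloud-service-account.json
# Export the database to Cloud Storage, stamping the file with today's date
FILE="backup-$(date +%F).sql.gz"
gcloud sql instances export my-sql-instance "gs://my-bucket/${FILE}" --database mydb
# Pull the export down to the backup host
gsutil cp "gs://my-bucket/${FILE}" "/home/backuper/${FILE}"
# Optionally restore the dump into the locally hosted MySQL instance
gunzip -c "/home/backuper/${FILE}" | mysql mydb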
Upvotes: 6
Reputation: 3589
Note that you can now trigger an Export operation using the Cloud SQL REST API.
So your admin script can do that and then download the backup from Cloud Storage (you'll need to wait until the export operation finishes, though).
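A hedged sketch of what that might look like with curl against the v1beta4 Cloud SQL Admin API (the project, instance, bucket, and database names are placeholders; the access token is borrowed from the active gcloud credentials):
# Start an export operation; the response describes an operation you can poll
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"exportContext": {"fileType": "SQL", "uri": "gs://my-bucket/export.sql.gz", "databases": ["mydb"]}}' \
  "https://sqladmin.googleapis.com/sql/v1beta4/projects/my-project/instances/my-instance/export"
# Poll the operation until its status is DONE, then download the file
gsutil cp gs://my-bucket/export.sql.gz ./export.sql.gz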
Upvotes: 3
Reputation: 659
Sorry, but Cloud SQL does not have this functionality currently. We'd like to make this easier in the future. In the meantime, you could use Selenium (or some other UI scripting framework) in combination with a cron job.
Upvotes: 2