Sam Leurs

Reputation: 2010

upgrading postgresql/redis databases without downtime on GCP

I'm creating a web app in React with a Node.js backend. I'm hosting all of this on the Google Cloud Platform. I'm using a PostgreSQL database and a Redis database, and because my knowledge of these databases is limited, I'm using the managed options (Cloud SQL and Cloud Memorystore).

These are not the cheapest solutions, but for now, they'll do what I want them to do.

My question now is: I'm using the managed options. If my web app is successful and grows bigger, I'll probably want my own self-managed solution (like a Redis cluster or a PostgreSQL cluster on Compute Engine). Will I be able to migrate my managed databases to the Compute Engine solution without downtime or loss of data?

If things get bigger, I'll probably hire someone with more knowledge of PostgreSQL/Redis, so that's not the problem. The only thing I want to know is: is it possible to upgrade from a GCP managed solution to an unmanaged solution on Compute Engine without loss of data and without downtime? I do not want any loss of data at all; a little downtime would not be a problem.

Upvotes: 1

Views: 1047

Answers (1)

Nikita Zavyalov

Reputation: 96

Using the managed solution is, in fact, a better approach for handling databases: GCP takes care of updates, management, and maintenance of the database and provides reliable tools for backup and scaling.

But to answer your question: yes, it is possible to migrate with minimal downtime. You would need to configure primary/replica (previously called master/slave) synchronous replication, with the replica running on Compute Engine. Once the replica is in sync, you switch over by promoting the Compute Engine replica to be your new primary database. That gives you essentially the minimum possible downtime, and no data loss.
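As a rough sketch of what the PostgreSQL side of this could look like: one way to build such a replica is native logical replication (a variation on the primary/replica setup above, asynchronous by default rather than synchronous). The snippet below uses the Node.js `pg` client, since that's the stack in the question, to create a publication on the Cloud SQL instance and a subscription on the Compute Engine instance. It assumes PostgreSQL 10+ on both sides, that logical decoding and the required replication privileges are enabled on the Cloud SQL side, and that every host, credential, and name (`migration_pub`, `migration_sub`, `appdb`) is a placeholder, not a real value.

```typescript
// Sketch only: set up PostgreSQL logical replication from the managed
// Cloud SQL instance (publisher) to a self-managed instance on Compute
// Engine (subscriber). Hosts, credentials, database and object names
// are placeholders.
import { Client } from "pg";

async function setUpLogicalReplication(): Promise<void> {
  // Source: the managed Cloud SQL instance.
  const managed = new Client({
    host: "CLOUD_SQL_IP",
    database: "appdb",
    user: "replication_user",
    password: "REPLACE_ME",
  });

  // Target: the self-managed PostgreSQL instance on Compute Engine.
  const selfManaged = new Client({
    host: "GCE_VM_IP",
    database: "appdb",
    user: "postgres",
    password: "REPLACE_ME",
  });

  await managed.connect();
  await selfManaged.connect();

  // Publish all tables on the source so their changes can be streamed out.
  await managed.query("CREATE PUBLICATION migration_pub FOR ALL TABLES");

  // Subscribe on the target: this first copies the existing data, then keeps
  // applying changes continuously until you are ready to switch over.
  await selfManaged.query(`
    CREATE SUBSCRIPTION migration_sub
      CONNECTION 'host=CLOUD_SQL_IP dbname=appdb user=replication_user password=REPLACE_ME'
      PUBLICATION migration_pub
  `);

  await managed.end();
  await selfManaged.end();
}

setUpLogicalReplication().catch((err) => {
  console.error("Replication setup failed:", err);
  process.exit(1);
});
```

Once the subscription has caught up, the switchover itself is the short window described above: stop writes, let the replica drain, then point the application at the Compute Engine instance.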

Upvotes: 3
