Reputation: 5877
My scenario started off with needing private IP access to Cloud SQL from a build server running as a Compute Engine instance in the same project. For some context, my build server connects to Cloud SQL to execute DDL SQL statements (automatic SQL migrations based on the diff between the init.sql in my repository and the schema on the targeted database) whenever the deployed artifact expects new rows/tables. I accomplished this using a VPC peering connection between the VPC GCP uses for Cloud SQL databases and the VPC I explicitly created for my Compute Engine instances.
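Concretely, this is the "private services access" peering, and my setup was roughly the following (the network and project names below are placeholders for my actual ones):

```
# Reserve an IP range for Google-managed services to use
gcloud compute addresses create google-managed-services-my-vpc \
    --global \
    --purpose=VPC_PEERING \
    --prefix-length=16 \
    --network=my-vpc \
    --project=prod-project

# Peer my VPC with the service producer network that hosts Cloud SQL
gcloud services vpc-peerings connect \
    --service=servicenetworking.googleapis.com \
    --ranges=google-managed-services-my-vpc \
    --network=my-vpc \
    --project=prod-project
```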
My scenario eventually evolved into moving the build server into a separate project. Because VPC peering is not transitive, I created a Shared VPC in which my Cloud SQL project is the host project and my build server project is the service project. This is all working great.
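The Shared VPC part of the setup was roughly this (project IDs are again placeholders):

```
# Enable Shared VPC on the host project (the one owning the VPC and the Cloud SQL peering)
gcloud compute shared-vpc enable prod-project

# Attach the build server project as a service project of that host
gcloud compute shared-vpc associated-projects add build-project \
    --host-project prod-project
```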
I had moved the build server into a separate project with the eventual goal of creating a third project for my staging environment, then enabling private IP connections between my build server and the two other projects. However, now that I'm trying to add the staging environment, I've found this is impossible, because according to the docs:
You can create and use multiple host projects; however, each service project can only be attached to a single host project.
Whereas I wanted both the production and staging projects to be host projects for my build server service project.
Is there any way to meet these two requirements?
I'm also open to critiques/feedback on the setup I've outlined here, because at the end of the day I've only derived the above requirements from what I know and have set up so far.
Upvotes: 1
Views: 325
Reputation: 75745
The solution is to use the same VPC for the build server and the two databases (prod and staging). The drawback is that the environments are no longer isolated from each other, which can lead to issues.
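As a sketch of what that looks like: keep the VPC in a single host project, attach the build server, prod, and staging projects as service projects of that one host, and create each Cloud SQL instance with a private IP on the shared network. All names, regions, versions, and tiers below are placeholders:

```
# Each database instance lives in its own service project but attaches
# to the one shared network owned by the host project
gcloud sql instances create prod-db \
    --database-version=POSTGRES_13 \
    --region=us-central1 \
    --tier=db-custom-2-7680 \
    --network=projects/host-project/global/networks/my-vpc \
    --no-assign-ip \
    --project=prod-project

gcloud sql instances create staging-db \
    --database-version=POSTGRES_13 \
    --region=us-central1 \
    --tier=db-custom-2-7680 \
    --network=projects/host-project/global/networks/my-vpc \
    --no-assign-ip \
    --project=staging-project
```

This respects the documented constraint, since each service project is attached to exactly one host project.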
I have two proposals for changing/adapting the design:
Upvotes: 3