Reputation: 95
I'm looking to give Cloud Build access to a PostgreSQL database during build steps, as part of integration testing for the Python application I'm running. Any suggestions on how to handle this authorization without exposing the database to the world?
Upvotes: 4
Views: 5092
Reputation: 1268
You can do this using a Private Pool where you define the network CIDR to be used at build time; see https://cloud.google.com/build/docs/private-pools/private-pools-overview to learn more.
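A minimal sketch of that setup, assuming a VPC that can already reach the database; the pool, project, region, and network names below are placeholders, not from the answer:

```
# Create a private worker pool peered to the VPC that can reach the database
# (my-pool, my-project, and my-vpc are hypothetical placeholders):
gcloud builds worker-pools create my-pool \
    --region=us-west1 \
    --peered-network=projects/my-project/global/networks/my-vpc

# Then point the build at the pool in cloudbuild.yaml:
#   options:
#     pool:
#       name: 'projects/my-project/locations/us-west1/workerPools/my-pool'
```

Builds running on the pool get addresses from the peered range, so the database only needs to allow that internal CIDR rather than public IPs.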
(Previous answer follows, which I've left in place for transparency around history.)
At this time, you would need to whitelist all of the GCE public IP address ranges -- which effectively exposes your database to the world. (So don't do that!)
However, at Google Next we announced and demoed a coming Alpha release that will enable you to run GCB workloads in a hybrid VPC world with access to protected (on-prem) resources. As part of that Alpha, you could whitelist internal-only addresses to achieve your goal securely.
You can watch for a public announcement in our release notes.
Upvotes: 1
Reputation: 353
Now you can use the IAP (Identity-Aware Proxy) TCP forwarding feature.
I don't know if this is still helpful, but I ran into a similar situation a while ago and was able to fix it like this.
steps:
- name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
  entrypoint: /bin/sh
  args:
  - '-c'
  - |
    gcloud compute start-iap-tunnel sql-vm 5555 \
        --local-host-port=localhost:5555 \
        --zone=us-west1-a & sleep 5 && python echo_client.py
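The answer doesn't show `echo_client.py`, so here is a hypothetical sketch of what it might look like. In the build step, the IAP tunnel forwards `localhost:5555` to the VM; in this self-contained demo a local echo server stands in for the tunneled service, and an ephemeral port replaces 5555 to avoid collisions:

```python
import socket
import threading

# Stand-in "remote" service: echoes each message back to the sender.
# (A real build would talk to the VM through the IAP tunnel on port 5555.)
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # ephemeral port for the demo
PORT = srv.getsockname()[1]
srv.listen(1)

def serve_echo():
    while True:
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

threading.Thread(target=serve_echo, daemon=True).start()

def echo_once(message, host="127.0.0.1", port=PORT):
    """Send one message through the (stand-in) tunnel and return the reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(message.encode())
        return sock.recv(1024).decode()

print(echo_once("ping"))  # prints: ping
```

The same pattern applies to the real database: once the tunnel is up, the client connects to `localhost` and IAP relays the traffic to the protected VM.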
I also wrote a blog post about this; you can find it at hodo.dev.
Upvotes: 0