Reputation: 12095
I am trying to use Cloud Functions for Firebase to build an API that talks with a Google Cloud SQL (PostgreSQL) instance.
I am using HTTP(S) trigger.
When I white-list my desktop's IP address, I can connect to Cloud SQL from my local machine using the function's Node.js code. But when I deploy, I can't connect, and I can't figure out which host IP address the Cloud Functions servers use, so I can't white-list it.
How do you talk to Google Cloud SQL from Cloud Functions for Firebase?
Thanks!
// Code sample of what's working on localhost.
var functions = require('firebase-functions');
var pg = require('pg');

var pgConfig = {
  user: functions.config().pg.user,
  database: functions.config().pg.database,
  password: functions.config().pg.password,
  host: functions.config().pg.host
};

exports.helloSql = functions.https.onRequest((request, response) => {
  // create a client from the config
  var client = new pg.Client(pgConfig);
  console.log('connecting...');
  try {
    client.connect(function(err) {
      if (err) throw err;
      console.log('connection success');
      console.log('querying...');
      client.query('SELECT * FROM guestbook;', function(err, result) {
        if (err) throw err;
        console.log('querying success.');
        console.log('Results: ', result);
        console.log('Ending...');
        client.end(function(err) {
          if (err) throw err;
          console.log('End success.');
          response.send(result);
        });
      });
    });
  } catch (er) {
    console.error(er.stack);
    response.status(500).send(er);
  }
});
Upvotes: 43
Views: 34189
Reputation: 8066
Connecting from Google Cloud Functions to Cloud SQL using TCP and Unix domain sockets (2020)
1. Create a new project
gcloud projects create gcf-to-sql
gcloud config set project gcf-to-sql
gcloud projects describe gcf-to-sql
2. Enable billing on your project: https://cloud.google.com/billing/docs/how-to/modify-project
3. Set the compute project-info metadata:
gcloud compute project-info describe --project gcf-to-sql
#Enable the API; you can check that google-compute-default-region and google-compute-default-zone are not set. Set the metadata.
gcloud compute project-info add-metadata --metadata google-compute-default-region=europe-west2,google-compute-default-zone=europe-west2-b
4. Enable the Service Networking API:
gcloud services list --available
gcloud services enable servicenetworking.googleapis.com
5. Create two Cloud SQL instances (one with an internal IP and one with a public IP) - https://cloud.google.com/sql/docs/mysql/create-instance:
6.a A Cloud SQL instance with an external IP:
#Create the SQL instance
gcloud --project=gcf-to-sql beta sql instances create database-external --region=europe-west2
#Set the password for the "root@%" MySQL user:
gcloud sql users set-password root --host=% --instance database-external --password root
#Create a user
gcloud sql users create user_name --host=% --instance=database-external --password=user_password
#Create a database
gcloud sql databases create user_database --instance=database-external
gcloud sql databases list --instance=database-external
6.b A Cloud SQL instance with an internal IP:
i. #Create a private connection to Google so that VM instances in the default VPC network can use private services access to reach Google services that support it.
gcloud compute addresses create google-managed-services-my-network --global --purpose=VPC_PEERING --prefix-length=16 --description="peering range for Google" --network=default --project=gcf-to-sql
gcloud services vpc-peerings connect --service=servicenetworking.googleapis.com --ranges=google-managed-services-my-network --network=default --project=gcf-to-sql
#Check whether the operation was successful.
gcloud services vpc-peerings operations describe --name=operations/pssn.dacc3510-ebc6-40bd-a07b-8c79c1f4fa9a
#Listing private connections
gcloud services vpc-peerings list --network=default --project=gcf-to-sql
ii. Create the instance:
gcloud --project=gcf-to-sql beta sql instances create database-ipinternal --network=default --no-assign-ip --region=europe-west2
#Set the password for the "root@%" MySQL user:
gcloud sql users set-password root --host=% --instance database-ipinternal --password root
#Create a user
gcloud sql users create user_name --host=% --instance=database-ipinternal --password=user_password
#Create a database
gcloud sql databases create user_database --instance=database-ipinternal
gcloud sql databases list --instance=database-ipinternal
gcloud sql instances list
gcloud sql instances describe database-external
gcloud sql instances describe database-ipinternal
#Remember the instances connectionName
OK, so we have two MySQL instances. We will connect from Google Cloud Functions to database-ipinternal using Serverless VPC Access over TCP, and to database-external using a Unix domain socket.
7. Enable the Cloud SQL Admin API:
gcloud services list --available
gcloud services enable sqladmin.googleapis.com
Note: By default, Cloud Functions does not support connecting to the Cloud SQL instance using TCP. Your code should not try to access the instance using an IP address (such as 127.0.0.1 or 172.17.0.1) unless you have configured Serverless VPC Access.
8.a Ensure the Serverless VPC Access API is enabled for your project:
gcloud services enable vpcaccess.googleapis.com
8.b Create a connector:
gcloud compute networks vpc-access connectors create serverless-connector --network default --region europe-west2 --range 10.10.0.0/28
#Verify that your connector is in the READY state before using it
gcloud compute networks vpc-access connectors describe serverless-connector --region europe-west2
9. Create a service account for your cloud function. Ensure that the service account has the following IAM roles: Cloud SQL Client; and for connecting from Cloud Functions to Cloud SQL on an internal IP we also need the Compute Network User role.
gcloud iam service-accounts create cloud-function-to-sql
gcloud projects add-iam-policy-binding gcf-to-sql --member serviceAccount:cloud-function-to-sql@gcf-to-sql.iam.gserviceaccount.com --role roles/cloudsql.client
gcloud projects add-iam-policy-binding gcf-to-sql --member serviceAccount:cloud-function-to-sql@gcf-to-sql.iam.gserviceaccount.com --role roles/compute.networkUser
Now that the setup is configured:
1. Connect from Google Cloud Functions to Cloud SQL using TCP and a Unix domain socket
cd app-engine-standard/
ls
#main.py requirements.txt
cat requirements.txt
sqlalchemy
pymysql
cat main.py
import pymysql
from sqlalchemy import create_engine

def gcf_to_sql(request):
    # TCP, over the Serverless VPC Access connector; replace INSTANCE_INTERNAL_IP
    # with the private IP shown by `gcloud sql instances describe database-ipinternal`
    engine_tcp = create_engine('mysql+pymysql://user_name:user_password@INSTANCE_INTERNAL_IP:3306')
    existing_databases_tcp = engine_tcp.execute("SHOW DATABASES;")
    con_tcp = "Connecting from Google Cloud Functions to Cloud SQL using TCP: databases => " + str([d[0] for d in existing_databases_tcp]).strip('[]') + "\n"
    engine_unix_socket = create_engine('mysql+pymysql://user_name:user_password@/user_database?unix_socket=/cloudsql/gcf-to-sql:europe-west2:database-external')
    existing_databases_unix_socket = engine_unix_socket.execute("SHOW DATABASES;")
    con_unix_socket = "Connecting from Google Cloud Function to Cloud SQL using Unix Sockets: tables in sys database: => " + str([d[0] for d in existing_databases_unix_socket]).strip('[]') + "\n"
    return con_tcp + con_unix_socket
2. Deploy the cloud function:
gcloud beta functions deploy gcf_to_sql --runtime python37 --region europe-west2 --vpc-connector projects/gcf-to-sql/locations/europe-west2/connectors/serverless-connector --trigger-http
3. Go to Cloud Functions, choose gcf_to_sql, Testing, TEST THE FUNCTION:
#Connecting from Google Cloud Functions to Cloud SQL using TCP: databases => 'information_schema', 'mysql', 'performance_schema', 'sys', 'user_database'
#Connecting from Google Cloud Function to Cloud SQL using Unix Sockets: tables in sys database: => 'information_schema', 'mysql', 'performance_schema', 'sys', 'user_database'
SUCCESS!
Upvotes: 5
Reputation: 1969
New Answer:
See other answers, it's now officially supported. https://cloud.google.com/functions/docs/sql
Old Answer:
It's not currently possible. It is however a feature request on the issue tracker #36388165:
Connecting to Cloud SQL from Cloud Functions is currently not supported, as the UNIX socket does not exist (causing ENOENT) and there is no defined IP range to whitelist (causing ETIMEDOUT). One possibility is to whitelist 0.0.0.0/0 from the Cloud SQL instance but this is not recommended for security reasons.
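For what it's worth, the two failure modes quoted above can be told apart by the error code on the connection error. A minimal illustrative sketch (the helper name is made up, not part of any library):

```javascript
// Illustrative helper: map connection error codes from pg/mysql clients
// to the two failure modes described above.
function diagnoseConnectionError(err) {
  if (err.code === 'ENOENT') {
    // the unix socket path does not exist on the Cloud Functions host
    return 'unix socket missing';
  }
  if (err.code === 'ETIMEDOUT') {
    // the instance's whitelist does not cover the function's egress IP
    return 'ip not whitelisted';
  }
  return 'other: ' + err.code;
}

console.log(diagnoseConnectionError({ code: 'ENOENT' }));    // unix socket missing
console.log(diagnoseConnectionError({ code: 'ETIMEDOUT' })); // ip not whitelisted
```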
If this is an important feature for you I would suggest you visit the issuetracker and star the feature request to help it gain popularity.
Upvotes: 14
Reputation: 823
I found the answer in further discussion of #36388165.
disclaimer: this does not seem to be announced officially, so it may change afterward. Also, I have only tested with MySQL, but given the nature of this solution, I think the same approach should work with the pg module (it seems to accept a domain socket path as its host parameter).
EDIT(2017/12/7): Google seems to provide official early access, and the same method still works.
EDIT(2018/07/04): It seems that someone just copy-and-pasted my example code and got into trouble. As Google says, you should use a connection pool to avoid SQL connection leaks (they cause ECONNREFUSED), so I changed the example code a bit.
EDIT(2019/04/04): In the example below, using $DBNAME as the instance name was confusing, so I modified the example.
In https://issuetracker.google.com/issues/36388165#comment44 a Googler says a cloud function instance can talk to Cloud SQL through a domain socket at the special path '/cloudsql/$PROJECT_ID:$REGION:$DBNAME'.
I actually can connect to and operate Cloud SQL from the cloud function code below.
const mysql = require('mysql');

const pool = mysql.createPool({
  connectionLimit: 1,
  socketPath: '/cloudsql/' + '$PROJECT_ID:$REGION:$SPANNER_INSTANCE_NAME',
  user: '$USER',
  password: '$PASS',
  database: '$DATABASE'
});

exports.handler = function handler(req, res) {
  // using pool instead of creating a connection per function call
  pool.query(`SELECT * FROM table WHERE id = ?`,
    req.body.id, function (e, results) {
      // make reply here
    });
};
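The special path above is just the instance connection name (the PROJECT_ID:REGION:INSTANCE string shown by `gcloud sql instances describe`) prefixed with /cloudsql/. A tiny sketch with made-up names:

```javascript
// Build the Cloud SQL unix socket path from its parts.
// The project/region/instance values here are placeholders, not real resources.
function cloudSqlSocketPath(projectId, region, instanceName) {
  return '/cloudsql/' + projectId + ':' + region + ':' + instanceName;
}

console.log(cloudSqlSocketPath('my-project', 'us-central1', 'my-instance'));
// → /cloudsql/my-project:us-central1:my-instance
```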
I hope this helps those who cannot wait for an official announcement from Google.
Upvotes: 36
Reputation: 3313
There's now official documentation for this, though it's still in beta as of July 2018:
https://cloud.google.com/functions/docs/sql
Upvotes: 10
Reputation: 1880
You can also authorize the Firebase IP address range, since we don't really know which IP address Firebase uses externally.
I've experimented on it. Google Cloud SQL DOES NOT USE internal IP addresses, so you CAN NOT use 10.128.0.0/20 to allow internal IP addresses for your Google Cloud SQL.
Answer
So from the console, go to Google Cloud SQL > Instance > Authorization, and you can add:
151.101.0.0/17
which will allow the 151.101.0.0 to 151.101.127.255 IP address range, wherein the Firebase server domain currently resolves to 151.101.1.195 and 151.101.65.195.
I'm not sure if this IP address is ever going to change.
Also, make sure that your Cloud SQL database is using the us-central zone. Firebase seems to be available in us-central.
Upvotes: 0
Reputation: 38378
Find your database region and instance name on GCP > SQL > Instances page:
Save your database password into Firebase environment by running:
$ firebase functions:config:set \
db.user="<username>" \
db.password="<password>" \
db.database="<database>"
Then...
db.js
const { Pool } = require('pg');
const { config } = require('firebase-functions');

const project = process.env.GCP_PROJECT;
const region = 'europe-west1';
const instance = 'db';

module.exports = new Pool({
  max: 1,
  host: `/cloudsql/${project}:${region}:${instance}`,
  ...config().db
});
someFunction.js
const { https } = require('firebase-functions');
const db = require('./db');

module.exports = https.onRequest((req, res) =>
  db
    .query('SELECT version()')
    .then(({ rows: [{ version }] }) => {
      res.send(version);
    }));
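If it helps to see the shape of the options the Pool receives, here is a pure sketch of how the socket host gets merged with the db.* values from the Firebase config. The poolConfig helper is hypothetical, just for illustration; the field names match db.js above.

```javascript
// Hypothetical helper: assemble the pg Pool options from the Cloud SQL
// instance coordinates and the credentials stored in functions config.
function poolConfig(project, region, instance, dbConfig) {
  return {
    max: 1, // one connection per function instance avoids leaking connections
    host: `/cloudsql/${project}:${region}:${instance}`,
    ...dbConfig
  };
}

const cfg = poolConfig('my-project', 'europe-west1', 'db',
  { user: 'u', password: 'p', database: 'd' });
console.log(cfg.host); // → /cloudsql/my-project:europe-west1:db
```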
See also https://stackoverflow.com/a/48825037/82686 (using modern JavaScript syntax via Babel)
Upvotes: 15
Reputation: 12095
Cloud Functions - Supported Services - I don't see Cloud SQL on this list, so perhaps it's not supported yet.
Upvotes: 0