Reputation: 7004
To deploy a single function for a single trigger event we can follow the instructions as outlined in the documentation on deploying Google Cloud Functions:
gcloud functions deploy NAME --runtime RUNTIME TRIGGER [FLAGS...]
It takes on average 30s-2m to deploy, which is fine and reasonable.
However, I was wondering if it's possible to write a script (e.g. in Python) to deploy multiple functions at once?
e.g. :
# somefile.py
gcloud functions deploy function_1 --runtime RUNTIME TRIGGER [FLAGS...]
gcloud functions deploy function_2 --runtime RUNTIME TRIGGER [FLAGS...]
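For reference, a minimal sketch of that idea using only the standard library's subprocess module could look like the following (the function names, runtime and trigger values are placeholders, not a recommendation):

```python
# Sketch: build one gcloud command per function and run them sequentially.
import subprocess

def build_deploy_commands(functions, runtime="python39", trigger="--trigger-http"):
    """Return one `gcloud functions deploy` argv list per function name.
    The runtime and trigger defaults here are assumptions."""
    return [
        ["gcloud", "functions", "deploy", name, "--runtime", runtime, trigger]
        for name in functions
    ]

if __name__ == "__main__":
    for cmd in build_deploy_commands(["function_1", "function_2"]):
        subprocess.run(cmd, check=True)  # check=True raises if a deploy fails
```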
Upvotes: 1
Views: 3135
Reputation: 2950
I really like to use the invoke library for problems like this. In particular, it is well suited for running shell commands (e.g. gcloud) within a Python script without mucking about in subprocess.
In your case, you could make a tasks.py file that looks like:
from invoke import task

@task
def deploy_cloud_functions(c):
    c.run('gcloud functions deploy function_1 --runtime RUNTIME TRIGGER [FLAGS...]')
    c.run('gcloud functions deploy function_2 --runtime RUNTIME TRIGGER [FLAGS...]')
and then run it by calling:
invoke deploy-cloud-functions
Note that if you name your function deploy_cloud_functions, you have to invoke it with a hyphen instead of the underscore, as above. You can list the tasks currently available in your directory with invoke --list.
You can also parallelize it using the threading library (though I haven't tested this within invoke myself). It will make for ugly, interleaved console output though. E.g.:
from threading import Thread
from invoke import task

@task
def deploy_cloud_functions(c):
    # target= takes a zero-argument callable; each deploy runs in its own thread
    Thread(target=lambda: c.run(
        'gcloud functions deploy function_1 --runtime RUNTIME TRIGGER [FLAGS...]'
    )).start()
    Thread(target=lambda: c.run(
        'gcloud functions deploy function_2 --runtime RUNTIME TRIGGER [FLAGS...]'
    )).start()
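A tidier alternative to raw Thread objects (my own suggestion, not tested inside invoke) is concurrent.futures, which also waits for all deploys and re-raises any failure. Here the runner is passed in as a callable, so inside a tasks.py you could call run_parallel(c.run, [...]):

```python
# Sketch: run several shell commands concurrently via a thread pool.
from concurrent.futures import ThreadPoolExecutor

def run_parallel(run, commands):
    """Submit each command to `run` on its own worker thread and wait for all.
    `run` is any callable taking a command string (e.g. invoke's c.run)."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(run, cmd) for cmd in commands]
        # result() blocks until done and re-raises any exception from the thread
        return [f.result() for f in futures]
```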
Upvotes: 2
Reputation: 3192
If you don't want a Python script that just calls the gcloud command (which amounts to the same thing as a bash script), you can use the Cloud Functions API Client Library for Python.
What this library does is create and execute HTTP calls against the Cloud Functions API. You can check the Cloud Functions REST reference to see how these calls are structured and how to build them.
For example, here is a quick script I wrote to test this API library, which lists the functions running in my project:
import httplib2
import pprint
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    "key.json",
    scopes="https://www.googleapis.com/auth/cloud-platform")

http = httplib2.Http()
http = credentials.authorize(http)

service = build("cloudfunctions", "v1", http=http)

# execute() actually sends the request; without it you only get a request object
operation = service.projects().locations().functions().list(
    parent='projects/wave16-joan/locations/europe-west1').execute()

pprint.pprint(operation)
You will have to install the oauth2client, google-api-python-client and httplib2 modules. As you can see, you will need to create a service account in order to execute the REST API calls, and it needs the "https://www.googleapis.com/auth/cloud-platform" scope to create the CF. I created a service account with the project/editor role myself, which I believe is the role required to create CFs.
Finally, to execute this script you can just run python <script_name>.py
Now, since you want to create multiple functions (see how this API call needs to be structured here), the service call should instead be the following:
operation = service.projects().locations().functions().create(
    location='projects/wave16-joan/locations/europe-west1',
    body={
        "name": "...",
        "entryPoint": "...",
        "httpsTrigger": {
            "url": "..."
        }
    }
).execute()
You will have to populate the body of the request with some of the parameters listed here. For example, the "name" key should read:
"name": "projects/YOUR_PROJECT/locations/YOUR_PROJECT_LOCATION/functions/FUNCTION_NAME"
As a side note, most of the body parameters listed in the previous documentation are optional, but you will need the name, entryPoint, source, trigger, etc.
Of course this requires more work than a bash script, but the result is more portable and reliable, and it allows you to create multiple operations to deploy multiple functions in the same way.
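To sketch that loop (a hypothetical helper of mine with placeholder values, building the request body shown above for each function; `service` is the client built earlier):

```python
# Sketch: build one create() request body per function to deploy.
def function_body(project, region, name, entry_point, source_url):
    """Return (parent, body) for one Cloud Functions create() call.
    The runtime and source/trigger fields here are assumptions."""
    parent = f"projects/{project}/locations/{region}"
    return parent, {
        "name": f"{parent}/functions/{name}",
        "entryPoint": entry_point,
        "runtime": "python39",          # placeholder runtime
        "sourceArchiveUrl": source_url,  # zipped source in a GCS bucket
        "httpsTrigger": {},              # let the API assign the URL
    }

# for name, entry, src in [("function_1", "main", "gs://bucket/f1.zip"), ...]:
#     parent, body = function_body("my-project", "europe-west1", name, entry, src)
#     service.projects().locations().functions().create(
#         location=parent, body=body).execute()
```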
Upvotes: 1