Reputation: 1075
My project structure is set up like so:
cloudbuild.yaml
requirements.txt
functions/
    folder_a/
        test/
            main_test.py
        main.py
What would I need to specify in my cloudbuild.yaml to take every newly edited function in functions/, run its tests, and then sync those functions to Google Cloud Functions? All the functions use the python37 runtime and an HTTP trigger.
Upvotes: 2
Views: 2800
Reputation: 230
Why not re-run the tests and redeploy every time any code changes? There's no harm in verifying that an updated dependency hasn't broken older code, or that something you depend on hasn't changed underneath you.
I'm not sure which testing framework you're using, but something like the following (pytest here) should work:
steps:
- name: 'python'
  args: ['pip3', 'install', '-r', 'requirements.txt', '--user']
  # Install your requirements; `--user` makes them persist between steps
- name: 'python'
  args: ['python3', '-m', 'pytest', 'functions/folder_a/test/']
  # Run all tests in the test folder (a minimal sample test follows below)
# Option A: create a deploy step like this for each function, as shown here:
# https://cloud.google.com/functions/docs/bestpractices/testing#continuous_testing_and_deployment
- name: 'gcr.io/cloud-builders/gcloud'
  id: 'deployMyFunction'
  args: ['functions', 'deploy', 'my-function', '--source', 'functions/folder_a', '--runtime', 'python37', '--trigger-http']
  # Note: `--source` must point at the function's directory, not at main.py itself
# Option B: write some Python that iterates over functions/ and deploys each one
# (I can't seem to find Cloud Functions in the Python SDK API; see the sketch below)
- name: 'python'
  args: ['python3', 'deploy.py']
  env:
  - 'PROJECT_ID=${_PROJECT_ID}'
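If you do go with pytest, here is a minimal sketch of what functions/folder_a/test/main_test.py could look like. It assumes main.py exposes an HTTP handler my_function(request) that returns 'OK'; the handler name and response are hypothetical stand-ins, and the dotted import needs the repo root on sys.path (an empty conftest.py at the repo root gets pytest to add it).

# functions/folder_a/test/main_test.py -- a minimal pytest sketch.
# Assumes main.py exposes `my_function(request)` returning 'OK';
# both the name and the response are hypothetical stand-ins.
from unittest.mock import Mock

from functions.folder_a.main import my_function  # needs repo root on sys.path


def test_my_function_returns_ok():
    # Cloud Functions passes a flask.Request; a Mock stands in for it here
    request = Mock(args={}, get_json=Mock(return_value={}))
    assert my_function(request) == 'OK'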
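And a rough sketch of the Option B deploy.py, assuming each subfolder of functions/ holds one function named after its folder (that naming convention is my assumption, not something from your question). The step running it also needs gcloud on its PATH, so you may want to run it with the gcr.io/cloud-builders/gcloud image and entrypoint: 'python3' rather than the plain python image.

# deploy.py -- sketch only; folder-name == function-name is an assumption
import subprocess
from pathlib import Path

for folder in sorted(Path('functions').iterdir()):
    if not (folder / 'main.py').exists():
        continue  # skip anything that isn't a function package
    subprocess.run(
        ['gcloud', 'functions', 'deploy', folder.name,
         '--source', str(folder),
         '--runtime', 'python37',
         '--trigger-http'],
        check=True,  # fail the build if any deploy fails
    )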
Upvotes: 5