Reputation: 2300
This is what I'm currently doing. It's undesirable because it duplicates the same lib/ folder of shared functions in every function directory:
/project/
└── /cloudfunctions/
    ├── /functionA/
    │   ├── main.py
    │   └── /lib/
    └── /functionB/
        ├── main.py
        └── /lib/
How do I organize or deploy functions so that the project structure can be more like this?
/project/
└── /cloudfunctions/
    ├── /functionA/
    │   └── main.py
    ├── /functionB/
    │   └── main.py
    └── /lib/
To clarify: I'm wondering why there isn't an --include-dependencies
flag for gcloud functions deploy
, and what the best practice is for including a shared library as in the second folder structure.
For now, this is my janky hack in deploy.sh:
rm -rf lib          # remove any stale copy of the shared library
cp -r ../lib ./lib  # copy the shared lib into this function's directory
gcloud functions deploy...
Upvotes: 3
Views: 1838
Reputation: 21520
You can have multiple functions in a single directory. A common structure would be as follows:
.
├── common
│   ├── module1.py
│   └── module2.py
├── main.py
└── requirements.txt
Where main.py
contains both functions:
from common import module1, module2
def cloudfunction1(request):
...
def cloudfunction2(request):
...
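To illustrate, here is a minimal sketch of how the shared code and the two functions might fit together. The helper name `make_greeting` and the return values are made up for the example; in the real layout the helper would live in `common/module1.py` and be imported with `from common import module1`:

```python
# common/module1.py -- shared helper, stands in for any common logic
def make_greeting(name):
    """Build a greeting string for the given name."""
    return "Hello, {}!".format(name)


# main.py -- both Cloud Functions live in the same file and share the helper
def cloudfunction1(request):
    # In a deployed HTTP function, `request` is a flask.Request;
    # it is unused in this sketch.
    return make_greeting("functionA")


def cloudfunction2(request):
    return make_greeting("functionB")
```

Because both functions are deployed from the same directory, the `common` package is uploaded with each deployment, and neither function needs its own copy of the shared code.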
And you deploy those functions either directly by name:
$ gcloud beta functions deploy cloudfunction1 --runtime python37 --trigger-http
$ gcloud beta functions deploy cloudfunction2 --runtime python37 --trigger-http
Or by entrypoint:
$ gcloud beta functions deploy foo --runtime python37 --entry-point cloudfunction1 --trigger-http
$ gcloud beta functions deploy bar --runtime python37 --entry-point cloudfunction2 --trigger-http
Upvotes: 7