Gatmando

Reputation: 2269

Recommended project structure for Python-based GCP projects using both App-Engine and Cloud Functions

I've inherited a GCP project using Python as the primary language. It's my first exposure to GCP, and I'm concerned that the project may not be structured properly as far as best practices are concerned.

The project consists of App Engine (standard) exposing several HTTP endpoints for use by a web app, as well as several 'trigger' Cloud Functions deployed to handle various situations requiring backend processing, e.g. object uploads to a bucket. Currently the project code base contains both the App Engine code and the Cloud Functions code.

The code structure looks like this:

project/
├── main.py
├── common-ftns/
│   ├── __init__.py
│   ├── initialize-app.py
│   └── utils.py
├── cloud-ftns/
│   ├── cloud_ftn-1.py
│   └── cloud_ftn-2.py
└── services/
    ├── service-1-routes.py
    ├── service-1.py
    ├── service-2-routes.py
    └── service-2.py

We are using GCP Cloud Build to deploy the entire solution, and everything works great. However, my concern is around the shared use of main.py for both App Engine and the Cloud Functions. Both GAE and Cloud Functions seem to require a root-level main.py file to initialize the app (including Flask) as well as to declare the Cloud Function entry points. This bothers me, since Cloud Functions and App Engine should not require a common starting point, not to mention that the trigger-handling Cloud Functions should not need Flask since they don't use it.
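
For reference, the shared main.py looks roughly like this (heavily simplified, with illustrative route and function names):

# main.py (simplified), shared by App Engine and by the Cloud Functions
from flask import Flask

# App Engine entry point: the standard runtime expects `app` in a root-level main.py
app = Flask(__name__)

@app.route("/ping")
def ping():
    return "ok"

# Background Cloud Function entry point (e.g. a bucket trigger), deployed with
# --entry-point handle_upload; it sits next to the Flask setup above even
# though it never uses Flask
def handle_upload(event, context):
    print(f"Processing {event['name']} uploaded to {event['bucket']}")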

My question is this: is this type of structure considered "best practice" in the GCP/Python world? If not, is there a better way to leverage main.py so that Cloud Functions and GAE do not have to run the exact same start-up script?

Upvotes: 3

Views: 1763

Answers (2)

Dustin Ingram

Reputation: 21520

This project seems structured in an ideal way for sharing a common codebase between an App Engine app and one or more Cloud Functions, without the additional overhead of having to manage private sub-dependencies or complicated build steps.

Upvotes: 0

guillaume blaquiere

Reputation: 75705

You can have a monorepo project and use it in production. If you want to deploy only some parts, and not everything at the same time, this requires more work on the CI script, but it works. It all depends on your requirements and your way of working.

So, my recommendation is to create a subdirectory for each service (see the sample layout sketched after this list):

  • One for the App Engine service
  • One for the Cloud Functions, with a directory per function inside it
  • No main.py at the root, only your CI/CD script and other deployment scripts (Terraform or similar)
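
As an illustration, the layout could look like this (directory and file names are only examples):

project/
├── appengine/
│   ├── app.yaml
│   ├── main.py
│   └── requirements.txt
├── functions/
│   ├── function-1/
│   │   ├── main.py
│   │   └── requirements.txt
│   └── function-2/
│       ├── main.py
│       └── requirements.txt
├── common/
│   ├── __init__.py
│   └── utils.py
└── deploy.sh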

When you deploy, go to the correct directory and deploy that service. The app.yaml lives in the App Engine directory; the functions don't care about it.

For the functions, each one can have a different requirements.txt file; that's why it's important to separate them into directories. Here again, go into the correct directory and deploy the function.
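
For example, a deploy script could do something like this (a sketch only; the service, function, bucket, and runtime names are placeholders):

# deploy the App Engine service from its own directory
cd appengine
gcloud app deploy app.yaml

# deploy one function from its own directory
cd ../functions/function-1
gcloud functions deploy function-1 \
    --runtime python39 \
    --trigger-bucket my-upload-bucket \
    --entry-point handle_upload \
    --source .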

The remaining problem is the common files shared between the functions and App Engine, your "common" package. There are two ways to handle this:

  • Either you build a package and import it into your dependencies
  • Or you copy the common directory into the correct directory before deploying. I do this most of the time. It's easy to do in Java (with Maven), but it requires more scripting in Python (a minimal sketch follows this list).
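
A minimal sketch of the copy approach, assuming the layout above (paths and names are illustrative):

# copy the shared package next to the function's main.py so it is included
# in the upload, deploy from that directory, then clean up the copy
cp -r common functions/function-1/common
(cd functions/function-1 && gcloud functions deploy function-1 \
    --runtime python39 \
    --trigger-bucket my-upload-bucket \
    --entry-point handle_upload \
    --source .)
rm -rf functions/function-1/common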

Upvotes: 5
