user1222324562

Reputation: 1075

Structuring Google Cloud Platform project

I'm working on a project that has many small tasks. Some of these tasks are related and require overlapping APIs. My current layout looks like this:

task_1/
    main.py
task_2/
    main.py
apis/
    api_1/
    api_2/
    api_3/
test/
    test_api_1.py
    test_api_2.py
    test_task_1.py
    test_task_2.py
    test_task_3.py

For example, task_1 needs api_1 and api_3, while task_2 needs api_1 and api_2. At first I tried using Google Cloud Functions to execute these tasks, but I ran into the issue that GCF needs local dependencies installed in the same folder as the task. This would mean duplicating the code from api_1 into task_1. Furthermore, local testing would become more complicated because of the way GCF does imports (as opposed to a relative import like .mylocalpackage.myscript). From the GCF documentation:

You can then use code from the local dependency, mylocalpackage:

from mylocalpackage.myscript import foo

Is there a way to structure my codebase to enable easier deployment of GCF? Due to my requirements, I cannot deploy each API as its own GCF. Will Google Cloud Run remedy my issues?

Thanks!

Upvotes: 2

Views: 104

Answers (1)

Grayside

Reputation: 4194

To use Cloud Functions for this, you will need to arrange your code in such a way that all the code a function depends on is present within that function's directory at the time of deployment. This might be done as a custom build/packaging step to move files around.
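One way to sketch such a packaging step, assuming a hypothetical per-task dependency map and a build/ output directory (the names TASK_DEPS and stage_task are illustrative, not part of any Google tooling):

```python
import shutil
from pathlib import Path

# Hypothetical mapping of each task to the local API packages it needs.
TASK_DEPS = {
    "task_1": ["api_1", "api_3"],
    "task_2": ["api_1", "api_2"],
}

def stage_task(root: Path, task: str, build_dir: Path) -> Path:
    """Assemble a self-contained, deployable folder for one task.

    Copies the task's own files plus each API package it depends on
    into build_dir/<task>, so the staged folder can be deployed as-is.
    """
    out = build_dir / task
    if out.exists():
        shutil.rmtree(out)
    shutil.copytree(root / task, out)  # main.py and friends
    for dep in TASK_DEPS[task]:
        shutil.copytree(root / "apis" / dep, out / dep)
    return out
```

You would then point your deploy command at the staged folder (e.g. build/task_1) rather than at the source tree, keeping a single copy of each API in version control.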

To use Cloud Run for this, you need to create a minimal HTTP webserver to route requests to each of your "functions". This might be best done by creating a path for each function you want to support. At that point, you've recreated a traditional web service with multiple resources.
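A minimal sketch of that webserver, using only the standard library (the handlers run_task_1/run_task_2 are stand-ins for whatever task_1/main.py and task_2/main.py actually do):

```python
import json
from wsgiref.simple_server import make_server

# Hypothetical handlers standing in for task_1/main.py and task_2/main.py.
def run_task_1(environ):
    return {"task": "task_1", "ok": True}

def run_task_2(environ):
    return {"task": "task_2", "ok": True}

# One path per "function".
ROUTES = {"/task_1": run_task_1, "/task_2": run_task_2}

def app(environ, start_response):
    """Tiny WSGI app that dispatches each request path to one task."""
    handler = ROUTES.get(environ.get("PATH_INFO", ""))
    if handler is None:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    result = handler(environ)
    start_response("200 OK", [("Content-Type", "application/json")])
    return [json.dumps(result).encode("utf-8")]

if __name__ == "__main__":
    # Cloud Run passes the listening port in the PORT env var; 8080 is its default.
    make_server("", 8080, app).serve_forever()
```

Because all tasks now live in one container image, they can import api_1, api_2, and api_3 as ordinary local packages, which also simplifies local testing.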

If these tasks were meant to run as Background Functions, you can wire up Pub/Sub push integration instead: a push subscription delivers each message to your Cloud Run service as an HTTP POST.
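For reference, the push subscription wraps each message in a JSON envelope whose message.data field is base64-encoded, so the service needs a small decoding step before dispatching to a task (handle_push here is an illustrative helper, not part of any library):

```python
import base64
import json

def handle_push(envelope: dict) -> str:
    """Decode the payload of a Pub/Sub push envelope.

    Push delivery posts JSON shaped like:
      {"message": {"data": "<base64>", "attributes": {...}}, "subscription": "..."}
    """
    data = envelope["message"].get("data", "")
    return base64.b64decode(data).decode("utf-8")
```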

Upvotes: 2
