Reputation: 524
I'm trying to deploy a deep learning-based image-processing app that uses user validation and a Postgres database for storing the results. The following is the desired architecture for my app.
In order to keep it free in the long run, I decided to deploy it with the Cloud Run serverless service on GCP. The following shell script is used to deploy it: it builds the Docker image with Cloud Build, pushes it to the Container Registry, and then deploys it as a service.
# Build container image
gcloud builds submit --tag gcr.io/$(gcloud config get-value project)/$APP_NAME
# Deploying to Cloud Run
gcloud run deploy ${SERVICE_NAME} --image gcr.io/$(gcloud config get-value project)/$APP_NAME --platform=managed --region=${REGION} --allow-unauthenticated
# Update for the Cloud SQL connection
gcloud run services update ${SERVICE_NAME} \
  --platform=managed \
  --region=${REGION} \
  --add-cloudsql-instances=${INSTANCE_CONNECTION_NAME} \
  --set-env-vars=CLOUD_SQL_CONNECTION_NAME=${INSTANCE_CONNECTION_NAME},DB_USER=${DB_USER},DB_PASS=${DB_PASS},DB_NAME=${DB_NAME}
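For context on what those env vars are for: with `--add-cloudsql-instances`, Cloud Run exposes the Cloud SQL instance to the container as a unix socket under `/cloudsql/<connection name>`. A minimal sketch of the DSN an app could build from those variables (the values below are placeholders, not taken from the actual deployment):

```shell
# Placeholder values for illustration; in Cloud Run these come from --set-env-vars
CLOUD_SQL_CONNECTION_NAME="my-project:us-central1:my-instance"
DB_USER="postgres"
DB_NAME="appdb"

# Cloud Run mounts the Cloud SQL connection as a unix socket under /cloudsql
SOCKET_DIR="/cloudsql/${CLOUD_SQL_CONNECTION_NAME}"

# A psql-style DSN the app could use (the password would come from DB_PASS)
DSN="host=${SOCKET_DIR} user=${DB_USER} dbname=${DB_NAME}"
echo "${DSN}"
```

Postgres clients treat a `host` that starts with `/` as a unix-socket directory, so no IP address is needed from inside the service.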
Once the service has been deployed, the next step is to set up the Keycloak container for user validation, as shown below.
- image: quay.io/keycloak/keycloak:latest
  ports:
    - name: tcp
      containerPort: 8443
  env:
    - name: DB_VENDOR
      value: postgres
    - name: DB_ADDR
      value: <IP_ADDRESS>
    - name: DB_DATABASE
      value: <DB_NAME>
    - name: DB_SCHEMA
      value: public
    - name: DB_USER
      value: postgres
    - name: DB_PASSWORD
      value: postgres
    - name: KEYCLOAK_USER
      value: admin
    - name: KEYCLOAK_PASSWORD
      value: admin
However, this doesn't seem to work with the Cloud Run configuration. Has anyone tried to achieve something similar?
Upvotes: 2
Views: 1285
Reputation: 81464
Google Cloud Run supports one container per service. You can run almost anything you want inside that one container, but running two containers requires two services.
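As a sketch of that second service, Keycloak could be deployed with its own `gcloud run deploy` call rather than as an extra container in the existing service. The service name and flags below are illustrative, reusing the variables from the question's own script; note that Cloud Run routes HTTP traffic to the container port you declare (it does not terminate the container's own TLS on 8443):

```shell
# Sketch: Keycloak as a separate Cloud Run service (names/flags are placeholders).
# --port tells Cloud Run which container port to send requests to.
gcloud run deploy keycloak \
  --image=quay.io/keycloak/keycloak:latest \
  --platform=managed \
  --region=${REGION} \
  --port=8080 \
  --allow-unauthenticated \
  --add-cloudsql-instances=${INSTANCE_CONNECTION_NAME} \
  --set-env-vars=DB_VENDOR=postgres,DB_USER=${DB_USER},DB_PASSWORD=${DB_PASS}
```

The app service would then call this Keycloak service over its HTTPS URL instead of talking to a sidecar container.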
Upvotes: 2