Boubacar Traoré

Reputation: 359

Launch multiple server apps in localhost with different port numbers within the same container in Cloud Run

I'm creating a webapp with 2 main services: Flask and Chainlit.

When I run the webapp locally on localhost:8080, clicking the "chat" button on the landing page redirects me to the second service, Chainlit, which is an AI chatbot (the app has many other features). Here is how I launch the two services in production:

Flask: poetry run gunicorn -b 0.0.0.0:8080 src.app.app:app

Chainlit: poetry run chainlit run --headless src/chatbot.py

When I build my Docker image and run the container locally, everything works perfectly with the 2 ports exposed. But it doesn't work on Cloud Run. Here are the contents of my files:


app.py

from flask import Flask, redirect, render_template

app = Flask(__name__)


@app.route("/")
def home():
    return render_template("index.html")


@app.route("/chat")
def chat():
    return redirect("http://localhost:8000")


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
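As an aside on the redirect above: `http://localhost:8000` resolves to the *visitor's* machine once the app is deployed, not to the container. A common single-port workaround is to redirect to a relative path and have the ingress process forward that path to Chainlit internally. The sketch below only shows the path-mapping half of that idea; the `/chatbot` prefix and `CHAINLIT_UPSTREAM` name are illustrative assumptions, not part of the original app.

```python
# Hypothetical sketch: map an external /chatbot/* path to the internal
# Chainlit URL, so the browser only ever talks to the single exposed port.
# The actual forwarding would be done by a reverse proxy (e.g. nginx) or a
# catch-all Flask route that relays the request.
from urllib.parse import urlsplit, urlunsplit

CHAINLIT_UPSTREAM = "http://127.0.0.1:8000"  # internal, same container

def to_upstream(request_path: str) -> str:
    """Translate an external /chatbot/... path into the internal Chainlit URL."""
    assert request_path.startswith("/chatbot")
    internal_path = request_path[len("/chatbot"):] or "/"
    scheme, netloc, _, _, _ = urlsplit(CHAINLIT_UPSTREAM)
    return urlunsplit((scheme, netloc, internal_path, "", ""))
```

With this layout, the `/chat` route would `redirect("/chatbot/")` (a relative path) instead of an absolute `localhost` URL.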

Dockerfile

FROM ollama/ollama

ENV DEBIAN_FRONTEND=noninteractive \
    PATH="/root/.local/bin:$PATH"

ENV HOST=0.0.0.0

# Update package list and install pipx
RUN apt-get update -y && \
    apt-get install -y ffmpeg && \
    apt-get install -y pipx && \
    pipx install poetry==1.7.1 && \
    rm -rf /var/lib/apt/lists/*


# Set the working directory
WORKDIR /app

# Copy the current directory contents into the container
COPY . .

# Install dependencies with poetry
RUN poetry install --no-root --no-cache --without explo && \
    chmod +x pull_ollama_models_and_launch_app_servers.sh

EXPOSE 8080 8000

ENTRYPOINT ["./pull_ollama_models_and_launch_app_servers.sh"]

pull_ollama_models_and_launch_app_servers.sh
#!/bin/bash

# Start Ollama in the background.
ollama serve &

# Pause for Ollama to start.
sleep 5

# Pause before server launch
echo "🔵 Launching Servers (Flask and Chainlit) ..."
sleep 5

# Start Chainlit
poetry run chainlit run --headless src/chatbot.py &

# start flask landing page
poetry run gunicorn -b 0.0.0.0:8080 src.app.app:app &


# Wait for both background processes to exit
wait


My problem is that on Cloud Run, the landing page (index.html) works fine when I access the application via the generated URL, but the chat part doesn't work at all ("This site can't be reached" error). The only relevant log entry is Default STARTUP TCP probe succeeded after 1 attempt for container "xxx" on port 8080; there is no trace of the Chainlit service running on port 8000.

Can anyone help me?

Upvotes: 0

Views: 234

Answers (1)

Paku

Reputation: 797

Cloud Run exposes exactly one port per service. If you want a second server listening on another port, you need to deploy it as a Cloud Run sidecar container; sidecar containers share the same network namespace, so the ingress container can reach them on localhost.

EXPOSE 8080 8000 in the Dockerfile is effectively ignored by Cloud Run; only the single configured container port ($PORT, 8080 by default) receives traffic.
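A minimal sketch of what a multi-container (sidecar) service definition could look like, deployed with `gcloud run services replace service.yaml`. The service name and image paths are illustrative placeholders, and only the ingress container declares a port; check the Cloud Run multi-container documentation for the exact fields:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: flask-chainlit   # illustrative name
spec:
  template:
    spec:
      containers:
      - name: flask
        image: IMAGE_URL_FOR_FLASK      # placeholder
        ports:
        - containerPort: 8080   # only the ingress container gets a port
      - name: chainlit
        image: IMAGE_URL_FOR_CHAINLIT   # placeholder
        # no ports: the sidecar is only reachable from the ingress
        # container via localhost:8000
```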

Upvotes: 1

Related Questions