Reputation: 367
I have two services deployed to Cloud Run from Docker containers. One is my Python Flask app running under Gunicorn; the other is my Celery service. I have connected the Celery service to a Google Cloud Memorystore for Redis instance using a serverless VPC connector.
However, my Celery tasks don't seem to be registered or executed when they are called by my Flask service. Is it possible to manage Celery tasks this way? I've seen numerous other posts recommending Google Cloud Tasks, but if it is possible, I'd rather stick with Celery.
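For context, a minimal sketch of the kind of setup described, with hypothetical names (a celery_app.py module and a REDIS_URL environment variable pointing at the Memorystore instance); the Celery service would run celery -A celery_app worker against the same broker:

# celery_app.py - minimal sketch shared by the Flask service and the worker
# image, so both sides register the same tasks.
import os

from celery import Celery
from flask import Flask, jsonify

# REDIS_URL points at the Memorystore instance reached via the VPC connector,
# e.g. redis://10.0.0.3:6379/0 (hypothetical address).
celery_app = Celery("tasks", broker=os.environ.get("REDIS_URL", "redis://localhost:6379/0"))


@celery_app.task
def add(x, y):
    return x + y


# Flask only enqueues the task; the worker container must be running
# `celery -A celery_app worker` to actually execute it.
app = Flask(__name__)


@app.route("/add")
def enqueue():
    result = add.delay(2, 3)
    return jsonify({"task_id": result.id})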
Upvotes: 5
Views: 5102
Reputation: 7779
A partial solution is to use two containers in one Cloud Run service.
Compared to the solution with two processes in the same container, Cloud Run can detect when the Celery process has exited and restart the instance.
A possible improvement is to have the web server do a Celery ping when it receives the HTTP startup probe request (see the sketch after the YAML below):
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  annotations:
    run.googleapis.com/ingress: internal
    run.googleapis.com/ingress-status: internal
  labels:
    cloud.googleapis.com/location: us-west2
  name: worker
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/maxScale: '1'
        autoscaling.knative.dev/minScale: '1'
        run.googleapis.com/startupProbeType: Default
    spec:
      containerConcurrency: 20
      containers:
        - command:
            - "python"
            - "-m"
            - "http.server"
            - "8080"
          name: dummy
          image: python:3.11
          ports:
            - containerPort: 8080
              name: http1
          resources:
            limits:
              cpu: 100m
              memory: 200Mi
          startupProbe:
            initialDelaySeconds: 10
            timeoutSeconds: 5
            periodSeconds: 240
            failureThreshold: 1
            httpGet:
              path: /
              port: 8080
        - command:
            - /run_worker.sh
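As a sketch of that ping idea (assuming the Celery app is importable as celery_app from a hypothetical tasks module), the dummy container could run a tiny HTTP server that only answers the startup probe once a worker replies:

# health_server.py - minimal sketch; replaces the dummy http.server so the
# startup probe only succeeds once the Celery worker answers a ping.
from http.server import BaseHTTPRequestHandler, HTTPServer

from tasks import celery_app  # hypothetical module name


class ProbeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # control.ping() broadcasts a ping over the broker and returns one
        # reply per worker that answered within the timeout.
        replies = celery_app.control.ping(timeout=1.0)
        self.send_response(200 if replies else 503)
        self.end_headers()
        self.wfile.write(b"ok" if replies else b"no worker reply")


if __name__ == "__main__":
    HTTPServer(("", 8080), ProbeHandler).serve_forever()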
Upvotes: 1
Reputation: 430
I'm not 100% sure now, but this seems to work. It isn't exactly Flask (the project below is Django with Celery), but the idea should be pretty similar to implement.
Here are some snippets:
Dockerfile
...
COPY ./entrypoint.sh /entrypoint.sh
RUN sudo chmod +x /entrypoint.sh
EXPOSE 8000
CMD ["bash", "/entrypoint.sh"]
entrypoint.sh
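# ENV_TYPE is set per Cloud Run service and selects which process to start;
# the dummy http.server keeps an HTTP port open so Cloud Run sees the
# container listening on its configured port.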
elif [ "$ENV_TYPE" == "celery-worker" ];
then
echo "Running celery worker command with dummy server for CloudRun"
pipenv run celery -A etl worker -l info &
pipenv run python -m http.server --directory /dummy 8000
elif [ "$ENV_TYPE" == "celery-beat" ];
then
echo "Running celery beat command with dummy server for CloudRun"
pipenv run celery -A etl beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler &
pipenv run python -m http.server --directory /dummy 8000
Cloud Run setup: deploy the same image as separate services and set the ENV_TYPE environment variable on each service so the entrypoint starts the right process.
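For example, a sketch with hypothetical project/image/region names (--port 8000 matches the dummy server above), deploying the same image once per ENV_TYPE:

# Hypothetical names; only ENV_TYPE differs between the services.
gcloud run deploy celery-worker \
  --image gcr.io/my-project/etl:latest \
  --region us-west2 \
  --port 8000 \
  --set-env-vars ENV_TYPE=celery-worker

gcloud run deploy celery-beat \
  --image gcr.io/my-project/etl:latest \
  --region us-west2 \
  --port 8000 \
  --set-env-vars ENV_TYPE=celery-beat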
Hope it helps somebody.
Upvotes: 0