Reputation: 43
After building a huge image containing a single file of over 30GB for test purposes, I wasn't able to download that file after deploying the image to Cloud Run.
Just to be clear, the image does build and run on Cloud Run, but the specific big file is not available for some reason.
Here's a sample Dockerfile to reproduce this error:
FROM python:3
WORKDIR /app
RUN touch test # downloading this file works fine
RUN dd if=/dev/urandom of=file bs=1M count=32768 # this one takes a while to build and to deploy, and won't be downloadable later on.
EXPOSE 8080
CMD python -m http.server 8080 --bind 0.0.0.0
Trying to download the file through wget returns the following:
wget https://cloud-run-link-here.app/file
--2020-03-03 17:19:16-- https://cloud-run-link-here.app/file
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
Resolving cloud-run-link-here.app (bigdocker-7k3mrt42la-uc.a.run.app)... :::::0, 0.0.0.0
Connecting to cloud-run-link-here.app (cloud-run-link-here.app)|:::::0|:443... connected.
HTTP request sent, awaiting response... 500 Internal Server Error
2020-03-03 17:19:17 ERROR 500: Internal Server Error.
Doing the same locally works just fine.
There's no useful info in Cloud Run's logs.
Upvotes: 3
Views: 502
Reputation: 21570
The maximum response size for Cloud Run is 32MB, and the HTTP server in the Python standard library is not recommended for production use. It's likely failing because it doesn't chunk the extremely large response.
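One way to confirm that the 32MB limit is the culprit (rather than the file's absence) would be to add two files straddling the limit to the test image, for example:

RUN dd if=/dev/urandom of=small bs=1M count=31  # just under 32MB; should download fine
RUN dd if=/dev/urandom of=big bs=1M count=33    # just over 32MB; should fail the same way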
You should consider using a production HTTP server such as gunicorn instead.
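As a rough sketch of what that could look like, here's a minimal Flask app that streams the file in chunks instead of buffering it; the app.py filename, the Flask dependency, the /file route, and the chunk size are all illustrative assumptions, not part of the original setup (and the 32MB response cap may still apply regardless of which server you use):

# app.py -- hypothetical replacement for python -m http.server
from flask import Flask, Response

app = Flask(__name__)

CHUNK_SIZE = 1024 * 1024  # stream the file 1 MiB at a time instead of loading it whole

@app.route("/file")
def serve_file():
    def generate():
        # read and yield the file incrementally so the response is chunked
        with open("/app/file", "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                yield chunk
    return Response(generate(), mimetype="application/octet-stream")

The Dockerfile would then install the dependencies (e.g. RUN pip install flask gunicorn) and swap the CMD for something like:

CMD exec gunicorn --bind 0.0.0.0:8080 --workers 1 --threads 8 app:app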
Upvotes: 3