Reputation: 3146
So, I've got some Python code running inside a Docker container. I started up my local environment using Google's gcloud script. I'm seeing basic access-style logs and health check info, but I'm not sure how to pass the log messages I'm writing from my Python app through to the console. Is there a parameter I can set to accomplish this with my gcloud script, or is there something I can set in the Dockerfile that can help?
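For context, the setup is roughly like this (simplified; the real file names and commands differ):

Dockerfile:

FROM python:3
COPY main.py /app/main.py
CMD ["python", "/app/main.py"]

main.py:

import logging

logging.basicConfig(level=logging.INFO)
logging.info("starting up")   # this is the kind of message I want to see in the console
print("handling request")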
Upvotes: 16
Views: 17850
Reputation: 9735
(Answer based on the comments)
You don't need to know the container ID if you wrap the app in docker-compose. Just add a docker-compose.yml alongside your Dockerfile. It might sound like an extra level of indirection, but for a simple app it's as trivial as this:
version: "3.3"
services:
build: .
python_app:
environment:
- PYTHONUNBUFFERED=1
That's it. The benefit of having it is that you don't need to pass a lot of flags that docker requires, because they are added automatically. It also simplifies working with volumes and env vars if they become necessary later.
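For example (a hypothetical extension, not something this app needs yet), mounting the source directory and adding another env var is only a couple more lines:

version: "3.3"
services:
  python_app:
    build: .
    environment:
      - PYTHONUNBUFFERED=1
      - LOG_LEVEL=DEBUG   # hypothetical variable, just for illustration
    volumes:
      - ./:/app           # mount the project directory into the container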
You can then view logs by service name:
docker-compose logs python_app
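If you want to keep following the output while the app is running, the -f / --follow flag works here as well:

docker-compose logs -f python_app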
By the way, I'd rather set PYTHONUNBUFFERED=1 whenever I'm testing something locally. It disables buffering, which makes logging more deterministic locally. I had a lot of logging problems, for example, when I tried to spin up a gRPC server in my Python app: the logs flushed before the server started were not all the init logs I wanted to see, and once the server starts you will not see the remaining init logs, because logging reattaches to a different/spawned process.
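To illustrate what that buffering looks like (a minimal, made-up example, not the original app): without PYTHONUNBUFFERED=1, the startup messages below can sit in the stdout buffer and never show up in docker logs before the process blocks.

import time

# Hypothetical startup sequence: when stdout is not a terminal (as inside a
# container), these writes go into a block buffer instead of appearing right away.
print("loading config...")
print("starting server on :50051")

# Stand-in for a long-running server loop (e.g. a gRPC server's wait call).
# With buffered stdout, the messages above may only appear once the buffer
# fills or the process exits; with PYTHONUNBUFFERED=1 they appear immediately.
while True:
    time.sleep(60)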
Upvotes: 1
Reputation: 1290
For Python to log to your terminal/command line/console when executed from a Docker container, you should have this variable set in your docker-compose.yml:
environment:
  - PYTHONUNBUFFERED=0
This is also a valid solution if you're using print to debug. (Python only checks that PYTHONUNBUFFERED is set to a non-empty string, so 0 works here the same way 1 does.)
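If you'd rather not touch docker-compose.yml, the same variable can be baked into the image itself; a minimal Dockerfile sketch (the image tag and file names are assumptions):

FROM python:3
ENV PYTHONUNBUFFERED=1
COPY app.py /app/app.py
CMD ["python", "/app/app.py"]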
Upvotes: 28