Reputation: 711
I know that by default docker-compose logs to a file whose path is given by docker inspect --format='{{.LogPath}}' my_container
. This file is gone as soon as the container is removed. As I deploy a new version of the image frequently, I lose a lot of log entries.
What I'd like to do is have my container's log entries stored in a persistent log file, just like regular Linux processes use. I could have my deployment script do something like this, but I'm thinking there's a less hackish way of doing it:
docker-compose logs -t -f >> output-`date +"%Y-%m-%d_%H%M"`.log
One option would be to configure docker-compose to log to syslog, but for the time being I'd like to log to a dedicated file.
How have others dealt with the issue of persistent logging?
Upvotes: 7
Views: 8300
Reputation: 314
Docker has a concept called logging drivers: https://docs.docker.com/config/containers/logging/configure/#supported-logging-drivers
The default (json-file) writes the file you mentioned. The ideal way to do this is to pass --log-driver <driver-name>
to your run command, then have another process on the same machine pick these logs up and push them to your central logging system.
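Since the question is about docker-compose, the equivalent of --log-driver can also be set per service in the compose file. As a minimal sketch (the service name my_service and image name are hypothetical), you can keep the default json-file driver but enable rotation so logs are bounded and survive restarts of the same container:

```yaml
# docker-compose.yml — hypothetical service; json-file with rotation
services:
  my_service:
    image: my_image:latest
    logging:
      driver: json-file      # the default driver, written to the LogPath file
      options:
        max-size: "10m"      # rotate the log file once it reaches 10 MB
        max-file: "5"        # keep at most 5 rotated files
```

Note this still doesn't survive removing the container; for that you need a driver that ships logs elsewhere, or an external collector.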
The most popular of these are fluentd
and splunk
, I guess. But you can also choose to write to json-file or journald.
The Docker manuals for these are below.
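For the asker's stated goal of keeping logs outside the container's lifecycle, the syslog driver mentioned in the question is one concrete option. A minimal sketch, assuming a syslog daemon listening on localhost:514 (the address and tag values here are placeholders):

```yaml
# docker-compose.yml — hypothetical service using the syslog driver
services:
  my_service:
    image: my_image:latest
    logging:
      driver: syslog
      options:
        syslog-address: "udp://localhost:514"  # where the syslog daemon listens
        tag: "my_service"                      # tag prepended to each log line
```

The syslog daemon can then be configured to route entries with that tag into a dedicated file, which persists across image redeployments.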
Upvotes: 1