Dan G

Reputation: 396

Python logging from multiple Docker containers back to localhost

I have a scenario where I have written a Python 3 emulator of a test node that can be launched in a Docker container.

So basically, on one server running Ubuntu 18.04, I have 50 to 100 containers, each one emulating a node and performing a basic file transfer task.

Each container is running a Python 3 application that emulates a node. For logging purposes, I have the following:

import logging

logging.basicConfig(format='%(asctime)s : %(message)s', filename='test.log', datefmt='%Y-%m-%d %H:%M:%S', level=logging.DEBUG)

So basically by executing:

logging.error("File transfer failed")

I get a log file test.log with a properly formatted timestamp and error message.

The issue is that this log file is created inside the container, and for that matter, inside each of the 50 to 100 containers.

Is there a way to have all the containers log to a single log file on the localhost where the containers run? I have looked at log handlers in Python but cannot seem to wrap my head around getting out of the container and writing to a file on the local host.

Upvotes: 0

Views: 796

Answers (2)

Itamar Turner-Trauring

Reputation: 3900

The default idiom for Docker logging is to log to stdout. So just don't specify a filename when you call basicConfig() and logs will go to the process's standard streams by default, which Docker captures.

You can then access those logs with the docker logs command.
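
A minimal sketch of that setup (the format string and log message are carried over from the question):

import logging

# No filename argument: the root logger writes to stderr by default,
# which Docker captures just like stdout.
logging.basicConfig(format='%(asctime)s : %(message)s', datefmt='%Y-%m-%d %H:%M:%S', level=logging.DEBUG)

logging.error("File transfer failed")

On the host, docker logs <container_name> then shows these lines, and docker logs --timestamps or docker logs --follow can be used per container.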

Upvotes: 0

Klaus

Reputation: 1731

How about using a Docker volume? Docker volumes can be used to persist data to an external file system. By doing so, your containers get access to read and write to your local hard drive instead of creating log files inside the containers themselves.

But you may have to find a way to avoid race conditions when writing to the shared location; one option is sketched below.

Read about Docker volumes in the official docs. It's pretty easy.
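
A minimal sketch of that approach, assuming a hypothetical image name node-emulator, container name node01, and host directory /var/log/emulators (all three are illustrative). One way to sidestep the race condition is to give each container its own file under the shared mount:

docker run -d --name node01 -e NODE_NAME=node01 -v /var/log/emulators:/logs node-emulator

Inside the container, the application then logs to a file named after the node:

import logging
import os

# One log file per container under the mounted /logs directory,
# named after the container, so no two processes write to the same file.
node_name = os.environ.get('NODE_NAME', 'unknown-node')
logging.basicConfig(format='%(asctime)s : %(message)s', datefmt='%Y-%m-%d %H:%M:%S', level=logging.DEBUG, filename=f'/logs/{node_name}.log')

All logs end up on the host under /var/log/emulators, where they can be tailed or concatenated into a single file if needed.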

Upvotes: 1
