Reputation: 123
I used the multiprocessing framework to create several parallel sub-processes (via a JoinableQueue), but I only set up logging (using the standard Python logging module) in my main process. When I test the code, all the sub-processes seem to be able to write their logs into the single logfile that I specified at the start of my main process, with no issues.
However, according to the Python logging cookbook, the logging module is only thread-safe, not process-safe. It suggests using one of:

1. a SocketHandler, with a separate socket-server process that owns the log file;
2. a multiprocessing Lock to serialize access to the file across processes;
3. a QueueHandler plus a QueueListener (or a dedicated listener process) that does all the writing.
All the suggested solutions make sense to me, and I actually was able to implement solution #3 - it worked, no issues.
But I do have a question: what would go wrong if we didn't handle this properly? What bad consequences might occur if I did none of #1, #2, or #3 (and just did what I described in the first paragraph)? And how can I make those bad consequences happen? (I'm curious to see them.)
Upvotes: 2
Views: 219
Reputation: 16875
Generally you want log writes to be atomic in some fashion. That is, in this context, when something writes a chunk of text to a log, that chunk appears together rather than being split up and intermixed with the content of other log entries. If multiple processes try to write to a file without some kind of mediation, it can result in such intermixing or even clobbering of the content.
To cause such a thing on purpose, have several processes write to the log repeatedly and simultaneously without mediation (no locks, no mediating process), just as the documentation says you shouldn't. The more processes there are and the longer the writes (which depends partly on buffer sizes), the more likely you are to see intermixing.
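As a rough sketch of that experiment (the filename `shared.log` and the sizes are arbitrary choices of mine): several processes append long lines to one file with no locking, and a post-hoc scan counts lines whose contents got mixed with another process's output. Whether any lines actually come out mangled is probabilistic and platform-dependent; long lines help because they get flushed in chunks that can interleave.

```python
import multiprocessing as mp
import os

LOG = "shared.log"
N_LINES = 200
PAYLOAD = 4096  # long lines are flushed in buffer-sized chunks


def writer(tag: str) -> None:
    # Each process opens the file independently: no lock, no mediator.
    with open(LOG, "a") as f:
        for _ in range(N_LINES):
            f.write(tag * PAYLOAD + "\n")


def count_mangled(path: str) -> int:
    # A clean line is one character repeated PAYLOAD times; anything else
    # means two processes' writes were interleaved.
    mangled = 0
    with open(path) as f:
        for line in f:
            body = line.rstrip("\n")
            if len(body) != PAYLOAD or len(set(body)) != 1:
                mangled += 1
    return mangled


if __name__ == "__main__":
    if os.path.exists(LOG):
        os.remove(LOG)
    procs = [mp.Process(target=writer, args=(t,)) for t in "ABCD"]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print("mangled lines:", count_mangled(LOG))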
Upvotes: 1