Reputation: 121
I'm having trouble using logging with multiprocessing Processes. As I understand it, the processes are separate, so each one has its own logger. What I'm trying to do is set up a logger in the main function and pass it to the processes when I start them, so they all share the same configuration and log to the same file. My issue is that even though the logger gets passed, its handlers are empty inside the child processes. Why is that?
I found a workaround (passing the configuration parameters as a list and setting the logger up again at the beginning of each process), but I wonder what causes this behaviour.
The code I used:
from multiprocessing import Process, current_process
import logging

def a(x, logger_normal):
    logger_normal.debug(f'{current_process().name} - Value of x is {x}')

if __name__ == '__main__':
    logger_normal = logging.getLogger('test')
    logger_normal.setLevel(logging.DEBUG)
    fh_formatter = logging.Formatter('%(asctime)-15s - %(levelname)s - %(lineno)d - %(message)s')
    f_handler_normal = logging.FileHandler('normal_debug_log.log')
    f_handler_normal.setFormatter(fh_formatter)
    logger_normal.addHandler(f_handler_normal)
    logger_normal.debug('test log main')

    p1 = Process(target=a, args=(1, logger_normal))
    p2 = Process(target=a, args=(2, logger_normal))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
This only writes 'test log main' into the file.
This is the code with the workaround:
from multiprocessing import Process, current_process
import logging

def a(x, logger_normal, logger_config):
    # Rebuild the handler inside the child process from the passed-in config
    logger_normal.setLevel(logger_config[0])
    fh_formatter = logging.Formatter(logger_config[1])
    f_handler_normal = logging.FileHandler(logger_config[2])
    f_handler_normal.setFormatter(fh_formatter)
    logger_normal.addHandler(f_handler_normal)
    logger_normal.debug(f'{current_process().name} - Value of x is {x}')

if __name__ == '__main__':
    logger_config = [logging.DEBUG,
                     '%(asctime)-15s - %(levelname)s - %(lineno)d - %(message)s',
                     'normal_debug_log.log']
    logger_normal = logging.getLogger('test')
    logger_normal.setLevel(logger_config[0])
    fh_formatter = logging.Formatter(logger_config[1])
    f_handler_normal = logging.FileHandler(logger_config[2])
    f_handler_normal.setFormatter(fh_formatter)
    logger_normal.addHandler(f_handler_normal)
    logger_normal.debug('test log main')

    p1 = Process(target=a, args=(1, logger_normal, logger_config))
    p2 = Process(target=a, args=(2, logger_normal, logger_config))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
Upvotes: 0
Views: 549
Reputation: 2419
On Windows, Python child processes re-run your script from scratch (the spawn start method), skipping only the code inside the if __name__ == '__main__' block
-> the logger configuration is not inherited.

A logger carries handlers, open file objects and thread locks, so its configuration cannot be pickled and sent through a multiprocessing.Queue or multiprocessing.Pipe; at best the logger is recreated by name on the other side
-> there is no simple way to hand a fully configured logger to child processes at start-up.

In effect, when you pass a logger as a Process argument on Windows, it is recreated from scratch in the child. So when the logger appears in the child process, none of the handlers you set up in the main process are there.
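A minimal sketch of the effect (not from the original post): forcing the spawn start method makes it reproducible on non-Windows platforms as well, and it assumes Python 3, where a Logger is pickled by name only.

import logging
import multiprocessing
from multiprocessing import Process

def worker(logger):
    # In a spawned child the logger is recreated by name only,
    # so the handlers configured in the parent are gone.
    print(logger.name, logger.handlers)        # prints: test []

if __name__ == '__main__':
    multiprocessing.set_start_method('spawn')  # the default on Windows
    logger = logging.getLogger('test')
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.FileHandler('normal_debug_log.log'))
    print(logger.name, logger.handlers)        # prints: test [<FileHandler ...>]

    p = Process(target=worker, args=(logger,))
    p.start()
    p.join()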
From the multiprocessing documentation (the "Logging" section):

Some support for logging is available. Note, however, that the logging package does not use process shared locks so it is possible (depending on the handler type) for messages from different processes to get mixed up.

multiprocessing.get_logger() - Returns the logger used by multiprocessing. If necessary, a new one will be created. When first created the logger has level logging.NOTSET and no default handler. Messages sent to this logger will not by default propagate to the root logger.

Note that on Windows child processes will only inherit the level of the parent process's logger - any other customization of the logger will not be inherited.
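If the goal is one shared log file written safely from several processes, a common pattern (a sketch, not the original poster's code) is to give each child a logging.handlers.QueueHandler and let the main process drain the queue with a QueueListener that owns the single FileHandler:

import logging
import logging.handlers
from multiprocessing import Process, Queue, current_process

def worker(x, queue):
    # Each child only needs a QueueHandler; log records travel back to the parent.
    logger = logging.getLogger('test')
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.debug(f'{current_process().name} - Value of x is {x}')

if __name__ == '__main__':
    queue = Queue()
    file_handler = logging.FileHandler('normal_debug_log.log')
    file_handler.setFormatter(logging.Formatter(
        '%(asctime)-15s - %(levelname)s - %(lineno)d - %(message)s'))

    # The listener runs in the main process and is the only writer to the file.
    listener = logging.handlers.QueueListener(queue, file_handler)
    listener.start()

    processes = [Process(target=worker, args=(i, queue)) for i in (1, 2)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()

    listener.stop()

Because only the listener in the main process touches the file, records from different children cannot get mixed up.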
Upvotes: 1