KFL

Reputation: 17850

How to configure all loggers in an application

Python's logging module lets modules or classes define their own loggers. And different loggers can have different handlers. Some of them may choose to log to a file, while some choose to log to, say, stdout.

Now my application uses several of these modules, each with their own loggers that have various handlers. Can I unify the logging behavior so that all logs go to a log file that I specified? In other words, is there a way to .config() all the loggers' handlers at once, from a single place?

Upvotes: 21

Views: 17552

Answers (3)

user16776498

Reputation:

Here are some good resources:


Briefly (as far as I understand):

  • The logging module provides hierarchical loggers: if the root logger (the one you get with logging.getLogger()) is configured with certain handlers and formatting, loggers with other names (logging.getLogger("other_logger")) propagate their records to it and so come out formatted the same way (unless you set propagate to False).

  • The best practice for big projects, as explained in the links above, is to define the logging configuration once at the entry point of your package (e.g. in __main__.py) and then, in every module, just call

    logging.getLogger(__name__)
    

Example:

source code here

project:

logs
├── errors.log   # created automatically (the logs/ folder itself must already exist)
└── std.log      # created automatically
src
├── animals
│   ├── __init__.py
│   ├── cat.py
│   ├── cow.py
│   └── fish.py
└── main.py

Inside main.py:

import logging
from logging.config import dictConfig

LOG_CONFIG = {
    'version': 1,
    'handlers': {
        'console': {
            'level': 'DEBUG',
            'formatter': 'std',
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout'
        },
        'my_detailed_console': {
            'level': 'WARNING',
            'formatter': 'error',
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout'
        },
        'std_fh': {
            'level': 'INFO',
            'formatter': 'std',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': 'logs/std.log',
            'mode': 'a',
            'maxBytes': 1048576,
            'backupCount': 10
        },
        'my_detailed_fh': {
            'level': 'WARNING',
            'formatter': 'error',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': 'logs/errors.log',
            'mode': 'a',
            'maxBytes': 1048576,
            'backupCount': 10
        }
    },
    'loggers': {
        '': {  # root logger: loggers that are not configured explicitly fall back to its level and handlers
            'level': 'DEBUG',
            'handlers': ['std_fh', 'console']
        },
        'my_detailed': {
            'level': 'WARNING',
            'propagate': False,
            'handlers': ['my_detailed_fh', 'my_detailed_console']
        },
        'my_normal': {
            'level': 'INFO',
            'propagate': False,
            'handlers': ['std_fh', 'console']
        }
    },
    'formatters': {
        'std': {
            'format': '[%(levelname)s  - %(asctime)s - %(name)s::] %(message)s'
        },
        'error': {
            'format': '[%(levelname)s - %(asctime)s - %(name)s - %(process)d::module :%(module)s|Line: %(lineno)s]  messages:[ %(message)s ]'
        },

    }
}

dictConfig(LOG_CONFIG)
root_logger = logging.getLogger(__name__)  # the '__main__' logger; its records reach the root logger's handlers via propagation
my_normal_logger = logging.getLogger('my_normal.' + __name__)  # child of the 'my_normal' logger
my_detailed_logger = logging.getLogger('my_detailed.' + __name__)  # child of the 'my_detailed' logger

def main():
    root_logger.debug("hello from root logger")
    my_normal_logger.debug("won't print") # higher level needed
    my_normal_logger.info("hello from my_normal logger")
    my_detailed_logger.info("won't print") # higher level needed
    my_detailed_logger.warning("hello from my_detailed logger") 

    import animals.cat
    import animals.cow
    import animals.fish
    
if __name__ == '__main__':
    main()

This is what you do in each of the animal modules:

# cat.py
import logging

logger = logging.getLogger(__name__)

logger.info('mew mew')  # runs at import time, after main.py has applied LOG_CONFIG

Run main.py

output:

[DEBUG  - 2022-06-21 22:52:06,682 - __main__::] hello from root logger
[INFO  - 2022-06-21 22:52:06,682 - my_normal.__main__::] hello from my_normal logger
[WARNING - 2022-06-21 22:52:06,682 - my_detailed.__main__ - 15177::module :main|Line: 78]  messages:[ hello from my_detailed logger ]
[INFO  - 2022-06-21 22:52:06,683 - animals.cat::] mew mew
[INFO  - 2022-06-21 22:52:06,683 - animals.cow::] mooooo
[INFO  - 2022-06-21 22:52:06,683 - animals.fish::] blop blop

Note: I would recommend the dict config over the file config for security reasons (the file-based format evaluates some of its values with eval(); search the logging.config documentation for eval()).

Upvotes: 4

Seth

Reputation: 6832

From Logging HOWTO:

Child loggers propagate messages up to the handlers associated with their ancestor loggers. Because of this, it is unnecessary to define and configure handlers for all the loggers an application uses. It is sufficient to configure handlers for a top-level logger and create child loggers as needed. (You can, however, turn off propagation by setting the propagate attribute of a logger to False.)

Any handlers you add to the root logger will be used when child loggers create log entries.

import logging

root_handler = ...

root_logger = logging.getLogger()
root_logger.addHandler(root_handler)  # Will receive all log entries

# Meanwhile in a module...

import logging

logger = logging.getLogger(__name__)

logger.error(...)  # Will go to root_handler
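
For the question as asked (send everything to one log file), the placeholder handler above could, for instance, be a FileHandler. A minimal sketch, where 'app.log' and the format string are just example values:

import logging

# One file handler on the root logger; every module's logger propagates
# its records here by default ('app.log' is only an example path).
root_handler = logging.FileHandler('app.log')
root_handler.setFormatter(
    logging.Formatter('%(asctime)s %(name)s %(levelname)s %(message)s'))

root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)   # let everything through; handlers can filter further
root_logger.addHandler(root_handler)

logging.getLogger('some.module').error('recorded in app.log')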

Upvotes: 6

newtover

Reputation: 32094

You should probably look into the Python Logging HOWTO to understand how it works.

In short, all that modules usually do is get a logger of the form G_LOG = logging.getLogger('package.name') and send messages to it: G_LOG.info('some message'), G_LOG.exception('something bad happened'). Modules should not normally configure anything themselves.

The application that uses the modules can then turn logging on and configure handlers based on the logger names, for example to:

  • listen to all messages, or
  • listen only to messages above a certain threshold, or
  • listen only to messages from loggers whose names start with package, or
  • listen only to messages from loggers whose names start with package.name, etc. (see the sketch below)
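
For example, the name-based cases boil down to attaching a handler to the parent logger of the hierarchy you care about. A rough sketch, where 'package' and 'package.log' are placeholder names:

import logging

# A handler attached to the 'package' logger receives records from
# 'package', 'package.name', 'package.name.sub', and so on,
# while loggers outside that hierarchy are not affected.
handler = logging.FileHandler('package.log')
handler.setLevel(logging.INFO)          # threshold: INFO and above

pkg_logger = logging.getLogger('package')
pkg_logger.setLevel(logging.INFO)
pkg_logger.addHandler(handler)

logging.getLogger('package.name').info('written to package.log')
logging.getLogger('unrelated').info('dropped: below the root logger default WARNING level')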

The easiest way is to configure logging through logging.basicConfig somewhere at the beginning of your application:

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(levelname)-8s %(message)s',
                    datefmt='%Y-%m-%d %H:%M:%S',
                    filename=log_file, filemode='a')  # log_file is the path to your application's log file

That way you will write all logging messages from all modules to the log_file.

If you need a more detailed logging strategy (put logs from different loggers to different files, or send stacktraces to a separate file), it is better to define a logging config file and configure logging using logging.config.dictConfig or logging.config.fileConfig.

P.S. I usually create two loggers as module variables:

G_LOG = logging.getLogger(__name__)
ST_LOG = logging.getLogger('stacktrace.' + __name__)

To G_LOG I send only one-line messages. To ST_LOG I duplicate important messages using ST_LOG.exception(), which implicitly passes exc_info=True and therefore also writes the stack trace of the current exception.

At the start of the application I load a configuration that sets up two loggers (and a file handler for each): one that receives messages from loggers whose names start with stacktrace and has propagate=0 (so stacktrace messages never reach the root logger), and the root logger, which handles all remaining messages. I will not paste my complete log config files here, since reconstructing them is useful homework for understanding how it all works.
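
A rough, deliberately incomplete sketch of the idea (not the actual config; file names, formats and levels are placeholders) might look like this:

import logging
import logging.config

LOG_CONFIG = {
    'version': 1,
    'formatters': {
        'oneline': {'format': '%(asctime)s %(levelname)-8s %(name)s %(message)s'},
    },
    'handlers': {
        'app_file': {'class': 'logging.FileHandler', 'filename': 'app.log',
                     'formatter': 'oneline'},
        'trace_file': {'class': 'logging.FileHandler', 'filename': 'stacktrace.log',
                       'formatter': 'oneline'},
    },
    'loggers': {
        'stacktrace': {               # everything sent via ST_LOG lands here...
            'handlers': ['trace_file'],
            'propagate': False,       # ...and never reaches the root logger
        },
    },
    'root': {'level': 'DEBUG', 'handlers': ['app_file']},
}

logging.config.dictConfig(LOG_CONFIG)

G_LOG = logging.getLogger(__name__)
ST_LOG = logging.getLogger('stacktrace.' + __name__)

try:
    1 / 0
except ZeroDivisionError:
    G_LOG.error('division failed')       # one-line message in app.log
    ST_LOG.exception('division failed')  # message plus traceback in stacktrace.log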

Upvotes: 24
