Mr. B.

Reputation: 8697

How to avoid bloated log files in Python?

My script is executed every 5 seconds, and errors are logged to a file. That means that as long as an error persists, the log file is bloated with the same message every 5 seconds.

while True:
    try:
        i_might_fail()   # As long as this line fails...
    except Exception as ex:
        logger.error(ex) # ... the log file gets bloated
    time.sleep(5)

Terminating the script is not an option: it has to retry every 5 seconds.

I'm looking for a logging feature to ignore the same exception for x minutes:

logger.ignore_duplicates_for(10, 'minutes')

Any idea? Thanks in advance!

Upvotes: 2

Views: 85

Answers (1)

sanyassh

Reputation: 8510

This feature can be implemented by attaching a small cache of recently logged messages to the logger:

import logging
import time
import datetime

logger = logging.getLogger(__file__)


TIMEDELTA = datetime.timedelta(seconds=5)


def error_without_duplicates(self, msg, *args, **kwargs):
    # Lazily attach a cache mapping message text -> time it was last logged.
    if not hasattr(self, 'msg_cache'):
        self.msg_cache = {}
    str_msg = str(msg)
    now = datetime.datetime.utcnow()
    last = self.msg_cache.get(str_msg)
    # Log only if the message is new or its last occurrence is old enough.
    if last is None or now - last > TIMEDELTA:
        self.error(msg, *args, **kwargs)
        self.msg_cache[str_msg] = now


logging.Logger.error_without_duplicates = error_without_duplicates


while True:
    try:
        a = 1 / 0
    except Exception as ex:
        logger.error_without_duplicates(ex)  # every 5 seconds, not 1
    time.sleep(1)
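As a variation on the same idea, the deduplication can also live in a `logging.Filter` subclass instead of a monkey-patched method, so every handler attached to the logger benefits and the standard `logger.error(...)` call keeps working. This is a sketch; the class name `DuplicateFilter` and the interval value are my own choices, not part of the original answer:

```python
import datetime
import logging


class DuplicateFilter(logging.Filter):
    """Drop log records whose message was already logged within `interval`."""

    def __init__(self, interval=datetime.timedelta(minutes=10)):
        super().__init__()
        self.interval = interval
        self.last_seen = {}  # message text -> time it was last logged

    def filter(self, record):
        now = datetime.datetime.utcnow()
        msg = record.getMessage()
        last = self.last_seen.get(msg)
        if last is None or now - last > self.interval:
            self.last_seen[msg] = now
            return True   # let the record through
        return False      # suppress the duplicate


logger = logging.getLogger(__name__)
logger.addFilter(DuplicateFilter(datetime.timedelta(seconds=5)))
```

With the filter in place, the `except` block can simply call `logger.error(ex)` and repeats within the interval are silently dropped.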

Upvotes: 2
