Mr. B.

Reputation: 8697

Python: how to avoid log flooding?

I'm looking for a solution to avoid log flooding. To do so, I'd like to ignore repetitive errors for a certain amount of time.

import time

def log_error(error):
    # TODO: only log if the last log was at least 1 minute ago
    pass

def do_something_that_fails():
    return 1/0
    
while True:
    try:
        do_something_that_fails()  # Fails every 5 seconds
    except Exception as error:
        log_error(error)  # Should not log every 5 seconds
    time.sleep(5)

Are there any in-built features of the Python logging module that I'm missing? Or proper ways to compare exceptions?

Upvotes: 0

Views: 335

Answers (1)

Daweo

Reputation: 36430

I suggest taking a look at the logging.Filter class. Consider the following simple example, which requires 3 seconds to elapse before accepting the next logging event from the same line; let file.py contain

import collections
import logging
import time


class CustomFilter(logging.Filter):
    # time of the last accepted event, keyed by source line number;
    # defaultdict(float) means a missing line defaults to 0.0 (start of epoch)
    last_events = collections.defaultdict(float)

    def filter(self, record):
        prev_time = self.last_events[record.lineno]
        if prev_time + 3 <= record.created:
            # at least 3 seconds have passed since the last accepted event from this line
            self.last_events[record.lineno] = record.created
            return True
        else:
            return False


customLogger = logging.Logger('customLogger')
customLogger.addFilter(CustomFilter())
for i in range(7):
    customLogger.error('some error')
    time.sleep(1.1)

then

python file.py

gives

some error
some error
some error

Explanation: I create CustomFilter, which stores the time of the last event for each line in the class attribute last_events. If it is the first event for a given line, the previous one is assumed to have happened at the start of the epoch (1 Jan 1970), which should be enough for all practical purposes. The .filter method does the filtering: returning 0 or False drops the record, returning non-zero or True lets it through.
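
If you would rather key the suppression on the exception itself (as asked in the question) than on the source line, the same pattern can be adapted. The sketch below is an illustrative variation, not part of the original answer; the 60-second window, the RepeatSuppressFilter name and the use of repr(error) as the deduplication key are all assumptions:

import logging
import time


class RepeatSuppressFilter(logging.Filter):
    # drop records whose formatted message was already accepted
    # less than `interval` seconds ago
    def __init__(self, interval=60.0):
        super().__init__()
        self.interval = interval
        self.last_seen = {}  # message -> time of the last accepted record

    def filter(self, record):
        key = record.getMessage()
        now = record.created
        if now - self.last_seen.get(key, 0.0) >= self.interval:
            self.last_seen[key] = now
            return True
        return False


logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger('throttled')
logger.addFilter(RepeatSuppressFilter(interval=60.0))


def do_something_that_fails():
    return 1 / 0


while True:
    try:
        do_something_that_fails()  # fails every 5 seconds
    except Exception as error:
        # repr(error), e.g. "ZeroDivisionError('division by zero')", acts as the
        # deduplication key, so identical errors are logged at most once a minute
        logger.error(repr(error))
    time.sleep(5)

Because this filter compares the formatted message rather than record.lineno, two different call sites raising the same error are throttled together; keying on type(error).__name__ instead would be a reasonable alternative.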

Upvotes: 1
