S Andrew

Reputation: 7198

Rotating file handler for JSON logs in Python

I am saving JSON logs using Python. Below is the code:

import datetime
import json

log_file = 'app_log.json'

log_json = dict()
log_json["Data"] = {}

log_json['Data']['Key1'] = "value1"
log_json['Data']['alert'] = False
log_json['Data']['Key2'] = "N/A"

log_json['Created'] = datetime.datetime.utcnow().isoformat()

with open(log_file, "a") as f:
    json.dump(log_json, f)  # optionally: ensure_ascii=False
    f.write("\n")

Now, the above code generates the log file. But I have noticed that the file size keeps growing, and in the future I might face a disk-space issue. I was wondering if there is any pre-built rotating file handler for JSON in which we can set a fixed size, say 100 MB, so that upon reaching this size it deletes the file and creates a new one.

I have previously used from logging.handlers import RotatingFileHandler to do this for .log files, but I also want to do it for .json files.

Upvotes: 2

Views: 1533

Answers (3)

Chandan

Reputation: 11797

  1. You can implement structured logging with RotatingFileHandler:
import json
import logging
import logging.handlers
from datetime import datetime

class StructuredMessage:
    def __init__(self, message, /, **kwargs):
        self.message = message
        self.kwargs = kwargs

    def __str__(self):
        return '%s >>> %s' % (self.message, json.dumps(self.kwargs))

_ = StructuredMessage   # optional, to improve readability

log_json = {}
log_json["Data"] = {}

log_json['Data']['Key1'] = "value1"
log_json['Data']['alert'] = False
log_json['Data']['Key2'] = "N/A"

log_json['Created'] = datetime.utcnow().isoformat()

LOG_FILENAME = 'logging_rotatingfile_example.out'

# Set up a specific logger with our desired output level
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

# Add the log message handler to the logger; maxBytes=20 is deliberately
# tiny here so rotation is easy to observe
handler = logging.handlers.RotatingFileHandler(
              LOG_FILENAME, maxBytes=20, backupCount=5)
bf = logging.Formatter('%(message)s')
handler.setFormatter(bf)

logger.addHandler(handler)
logger.info(_('INFO', **log_json))

Note: see the Python Logging Cookbook section on implementing structured logging for more info.
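
To match the 100 MB limit from the question, only the handler's parameters need to change (a sketch; the filename and backup count here are assumptions):

# Hypothetical sizing: rotate at 100 MB and keep up to 5 old files
handler = logging.handlers.RotatingFileHandler(
    'app_log.json', maxBytes=100 * 1024 * 1024, backupCount=5)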

  2. You can also use json-logging-python with RotatingFileHandler:
import logging
import logging.handlers
import json
import traceback
from datetime import datetime
import json_logging

json_logging.ENABLE_JSON_LOGGING = True


def extra(**kw):
    '''Add the required nested props layer'''
    return {'extra': {'props': kw}}


class CustomJSONLog(logging.Formatter):
    """
    Customized logger
    """

    def get_exc_fields(self, record):
        if record.exc_info:
            exc_info = self.format_exception(record.exc_info)
        else:
            exc_info = record.exc_text
        return {'python.exc_info': exc_info}

    @classmethod
    def format_exception(cls, exc_info):
        return ''.join(traceback.format_exception(*exc_info)) if exc_info else ''

    def format(self, record):
        json_log_object = {"@timestamp": datetime.utcnow().isoformat(),
                           "level": record.levelname,
                           "message": record.getMessage(),
                           "caller": record.filename + '::' + record.funcName
                           }
        json_log_object['data'] = {
            "python.logger_name": record.name,
            "python.module": record.module,
            "python.funcName": record.funcName,
            "python.filename": record.filename,
            "python.lineno": record.lineno,
            "python.thread": record.threadName,
            "python.pid": record.process
        }
        if hasattr(record, 'props'):
            json_log_object['data'].update(record.props)

        if record.exc_info or record.exc_text:
            json_log_object['data'].update(self.get_exc_fields(record))

        return json.dumps(json_log_object)


json_logging.init_non_web(custom_formatter=CustomJSONLog, enable_json=True)

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
LOG_FILENAME = 'logging_rotating_json_example.out'

handler = logging.handlers.RotatingFileHandler(
              LOG_FILENAME, maxBytes=20, backupCount=5)  # tiny maxBytes for demo

logger.addHandler(handler)

log_json = {}
log_json["Data"] = {}

log_json['Data']['Key1'] = "value1"
log_json['Data']['alert'] = False
log_json['Data']['Key2'] = "N/A"

logger.info('Starting')
logger.debug('Working', extra={"props":log_json})
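
The extra() helper defined above is just a convenience wrapper for that nesting, so the last call could equivalently be written as:

# same as extra={"props": log_json}: the kwargs end up on record.props
logger.debug('Working', **extra(**log_json))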

Note: see the json-logging-python project documentation for more info.

Upvotes: 1

lllrnr101

Reputation: 2343

Python does not care about the log file's name.

You can use the same rotating handler you used for .log files for .json files as well.

See the sample below:

# logging_example.py

import logging
import logging.handlers
import os
import time

logfile = os.path.join("/tmp", "demo_logging.json")

logger = logging.getLogger(__name__)

fh = logging.handlers.RotatingFileHandler(logfile, mode='a', maxBytes=1000, backupCount=5)  # noqa:E501

fh.setLevel(logging.DEBUG)

formatter = logging.Formatter("%(asctime)s - %(levelname)s - %(message)s")

fh.setFormatter(formatter)

logger.addHandler(fh)
logger.setLevel(logging.DEBUG)

while True:
    time.sleep(1)
    logger.info("Long string to increase the file size")

You can also look at logrotate if you are working in a Unix environment. It is a great, simple tool with good documentation that does exactly what you need.
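
For illustration, a minimal logrotate configuration (a sketch; the path is an assumption) that caps the file at 100 MB and keeps five old copies could look like:

/var/log/app_log.json {
    # rotate once the file exceeds 100 MB
    size 100M
    # keep at most five rotated copies
    rotate 5
    # truncate in place so the logging process keeps its file handle
    copytruncate
    missingok
    notifempty
}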

Upvotes: 1

video-reviews.net

Reputation: 2916

You can try this just before you write/append to the file. It checks whether the file has reached the maximum number of lines and, if so, removes one line from the beginning of the file before you append to the end as usual.

filename = 'file.txt'
maxLines = 100

with open(filename) as f:
    count = len(f.readlines())

if count > maxLines:
    with open(filename, 'r') as fin:
        data = fin.read().splitlines(True)
    with open(filename, 'w') as fout:
        fout.writelines(data[1:])
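
Note that this rereads the whole file on every append and drops only one line at a time. A variant (hypothetical helper, not from the answer) that trims to the newest maxLines in a single pass using collections.deque:

from collections import deque

def trim_log(filename, max_lines=100):
    # a deque with maxlen keeps only the last max_lines lines
    with open(filename) as fin:
        tail = deque(fin, maxlen=max_lines)
    with open(filename, 'w') as fout:
        fout.writelines(tail)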

Upvotes: -1
