Jessie Wilson

Reputation: 109

A Pythonic way to delete older logfiles

I'm cleaning up log files whenever there are more than 50, deleting the oldest first. This is the only approach I've been able to come up with, and I feel like there is a better way to do it. I'm also currently getting a pylint warning for the lambda assigned to get_time.

import os
import pathlib


def clean_logs():
    log_path = "Runtime/logs/"
    max_log_files = 50

    def sorted_log_list(path):
        get_time = lambda f: os.stat(os.path.join(path, f)).st_mtime
        return list(sorted(os.listdir(path), key=get_time))

    del_list = sorted_log_list(log_path)[0:(len(sorted_log_list(log_path)) - max_log_files)]

    for x in del_list:
        pathlib.Path(pathlib.Path(log_path).resolve() / x).unlink(missing_ok=True)


clean_logs()
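
For what it's worth, the warning seems to be pylint objecting to a lambda being assigned to a name; a nested def with the same body avoids it. A minimal sketch of the helper rewritten that way (behaviour unchanged, the rest of the function stays as above; the outer list() is dropped since sorted already returns a list):

def sorted_log_list(path):
    def get_time(f):
        # Same sort key as the lambda: the file's modification time.
        return os.stat(os.path.join(path, f)).st_mtime
    return sorted(os.listdir(path), key=get_time)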

Upvotes: 1

Views: 617

Answers (1)

s3dev

Reputation: 9681

The two simplified solutions below accomplish different tasks, so both are included for flexibility. Obviously, you can wrap either one in a function if you like.

Both code examples break down into the following steps:

  • Set the date delta (as an epoch reference) for the mtime comparison, i.e. N days prior to today.
  • Collect the full path to all files matching a given extension.
  • Create a generator (or list) to hold the files to be deleted, using mtime as a reference.
  • Iterate the results and delete all applicable files.

Removing log files older than (n) days:

import os
from datetime import datetime as dt
from glob import glob

# Setup
path = '/tmp/logs/'
days = 5
ndays = dt.now().timestamp() - days * 86400

# Collect all files.
files = glob(os.path.join(path, '*.sql.gz'))
# Choose files to be deleted.
to_delete = (f for f in files if os.stat(f).st_mtime < ndays)

# Delete files older than (n) days.
for f in to_delete:
    os.remove(f)

Keeping the (n) latest log files

To keep the (n) latest log files, simply replace the to_delete definition above with:

n = 50
to_delete = sorted(files, key=lambda x: os.stat(x).st_mtime)[:len(files)-n]
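
And to wrap things up as a function, as mentioned above, here is a minimal sketch of the keep-the-latest-(n) variant; the name clean_logs, the default of 50 and the '*.log' pattern are illustrative, not prescribed:

import os
from glob import glob


def clean_logs(path, keep=50, pattern='*.log'):
    """Delete all but the `keep` newest files matching `pattern` in `path`."""
    files = glob(os.path.join(path, pattern))
    # Sort oldest-first; everything before the last `keep` entries is removed.
    # The slice is empty when there are `keep` files or fewer, so nothing is deleted.
    for f in sorted(files, key=lambda x: os.stat(x).st_mtime)[:len(files) - keep]:
        os.remove(f)


clean_logs('Runtime/logs/')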

Upvotes: 2
