Reputation: 341
I have a file called Poller.log, and log details are appended to it all the time. I want this log file to be rotated every day and limited to 30 days of backups, and the code below does that well.
Now I want the rotated logs to end up in a folder (i.e. logs/poller.log.2011-03-04_15-36). Is there any way to direct where the rotated files are created?
This Python script will be executed by cron.
import logging
import logging.handlers
LOG_FILENAME = '/home/stackoverflow/snmpdata/poller.log'
# Set up a specific logger with our desired output level
poll_logger = logging.getLogger('pollerLog')
# Add the log message handler to the logger
log_rotator = logging.handlers.TimedRotatingFileHandler(LOG_FILENAME, when='d', interval=1, backupCount=30, encoding=None, delay=False, utc=False)
poll_logger.addHandler(log_rotator)
# Roll over on application start
poll_logger.handlers[0].doRollover()
Upvotes: 5
Views: 8880
Reputation: 6276
The BaseRotatingHandler class in the logging module provides a rotation_filename() interface:
class BaseRotatingHandler:
    def rotation_filename(self, default_name):
        if not callable(self.namer):
            result = default_name
        else:
            result = self.namer(default_name)
        return result
So, you can create your own custom rotating handler like this:
import datetime
import re
from logging.handlers import RotatingFileHandler

class DayRotatingHandler(RotatingFileHandler):
    @staticmethod
    def get_previous(name):
        # name looks like "poller.log.3": strip off the numeric backup index
        basename, log_index = re.match(r"(.*)\.(\d+)$", name).groups()
        date = datetime.datetime.now() - datetime.timedelta(days=int(log_index))
        return f"logs/{basename}.{date:%Y-%m-%d}"

    namer = get_previous
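Since Python 3.3 the same namer hook can also be attached straight onto the TimedRotatingFileHandler from the question, without subclassing. A minimal sketch (it assumes the logs/ subdirectory already exists; also note that backupCount cleanup only scans the original directory, so it may not prune files that were renamed into logs/):
import logging
import logging.handlers
import os

LOG_FILENAME = '/home/stackoverflow/snmpdata/poller.log'

def move_to_logs(default_name):
    # default_name is e.g. ".../poller.log.2011-03-04"; redirect it into logs/
    directory, filename = os.path.split(default_name)
    return os.path.join(directory, 'logs', filename)

handler = logging.handlers.TimedRotatingFileHandler(
    LOG_FILENAME, when='d', interval=1, backupCount=30)
handler.namer = move_to_logs  # rotation_filename() calls this on every rollover

poll_logger = logging.getLogger('pollerLog')
poll_logger.addHandler(handler)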
Upvotes: 0
Reputation: 24
I added this bit of code for a separate process to move any log backups to a folder.
import logging
import logging.handlers
import shutil, os, glob
import zipfile
import schedule
import time
import threading
zip_file_name = "Log.zip"
zip_file_path = "Logs/LogsArchive/Log.zip"
source_directory = "Logs"
archive_directory = "Logs/LogsArchive"
def moveAllFilesinDir(srcDir, dstDir, allLogs=False):
    try:
        # Check that both paths are directories
        if os.path.isdir(srcDir) and os.path.isdir(dstDir):
            # Iterate over all the files in the source directory
            if allLogs == False:
                for filePath in glob.glob(srcDir + '/*.*.*'):
                    # Move each rotated backup to the destination directory
                    shutil.move(filePath, dstDir)
            elif allLogs == True:
                for filePath in glob.glob(srcDir + '/*.*'):
                    # Copy each file to the destination directory
                    shutil.copy(filePath, dstDir)
        else:
            debug_logger.debug("LoggingModule: - moveAllFilesinDir - srcDir & dstDir should be Directories")
    except Exception as ex:
        error_logger.error("Error in LoggingModule - moveAllFilesinDir", exc_info=True)
Only log files with a three-part name ("name.log.date") will be moved over. I am working on a process to zip the archive folder now.
Update: here is the zip process:
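As a rough illustration of the two glob patterns used above (the file names here are hypothetical):
import glob

# 'Logs/*.*.*' only matches rotated backups such as Logs/poller.log.2011-03-04,
# while 'Logs/*.*' also picks up the active Logs/poller.log file.
print(glob.glob("Logs/*.*.*"))   # e.g. ['Logs/poller.log.2011-03-04']
print(glob.glob("Logs/*.*"))     # e.g. ['Logs/poller.log', 'Logs/poller.log.2011-03-04']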
def createZipDir(path):
    # Delete the old zip file if it exists, but leave it in place if no other files exist
    if len(os.listdir(path)) > 1:
        zipFile = zip_file_path
        if os.path.isfile(zipFile):
            os.remove(zipFile)
        zipf = zipfile.ZipFile(zip_file_path, 'w', zipfile.ZIP_DEFLATED)
        for root, dirs, files in os.walk(path):
            for file in files:
                if file != zip_file_name:
                    zipf.write(os.path.join(root, file))
        zipf.close()
    else:
        debug_logger.debug("LoggingModule: - createZipDir - no files found, zip file left in place.")
Deleting the old files:
def deleteOldFilesinDir(srcDir):
    try:
        # Check that srcDir is a directory
        if os.path.isdir(srcDir):
            # Iterate over all the files in the source directory
            for filePath in glob.glob(srcDir + '/*.*'):
                if filePath != zip_file_path:
                    os.remove(filePath)
        else:
            print("srcDir should be a directory")
    except Exception as ex:
        error_logger.error("Error in LoggingModule - deleteOldFilesinDir", exc_info=True)
Here's the whole process:
I have runArchiveProcess set on a schedule to run once a week.
def runArchiveProcess(allFiles=False):
    debug_logger.debug("LoggingModule: Archive process started.")
    moveAllFilesinDir(source_directory, archive_directory, allFiles)
    createZipDir(archive_directory)
    deleteOldFilesinDir(archive_directory)
    debug_logger.debug("LoggingModule: Archive process completed.")
And the scheduler bit:
# Only kicked off in its own thread...
def runScheduler():
    debug_logger.debug("LoggingModule - runScheduler - don't call this function outside of LoggingModule as it runs in its own thread.")
    schedule.every().monday.at("00:00:00").do(runArchiveProcess)
    #schedule.every(10).seconds.do(runArchiveProcess)  # for testing
    try:
        while True:
            debug_logger.debug("LoggingModule checking scheduler...")
            # Check whether a scheduled task is pending to run or not
            schedule.run_pending()
            debug_logger.debug("LoggingModule Scheduler sleeping...")
            time.sleep(60 * 60)  # checks every 1 hour
            #time.sleep(10)  # for testing
    except Exception as ex:
        error_logger.error("Error in LoggingModule - runScheduler", exc_info=True)

def runSchedulerThread():
    thread = threading.Thread(target=runScheduler)
    thread.start()
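For completeness, a minimal sketch of how this might be kicked off at application startup (the manual allFiles=True call is just a hypothetical example):
runSchedulerThread()  # start the weekly archiver in its own background thread

# Optionally trigger a full archive by hand; allFiles=True copies every file,
# not just the rotated "name.log.date" backups
runArchiveProcess(allFiles=True)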
Upvotes: 0
Reputation: 34698
If you don't mind the extra dependency, you could always use the log rotation support in Twisted. Twisted has a logfile module that allows for daily, weekly, or even monthly logs, which fits this situation.
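For example, a minimal sketch using twisted.python.logfile.DailyLogFile (assuming Twisted is installed and the target directory exists); the active poller.log lives in the directory you pass in, and dated files are written alongside it on rollover:
from twisted.python import log
from twisted.python.logfile import DailyLogFile

# Active file: /home/stackoverflow/snmpdata/logs/poller.log
# Rotated copies get a date suffix in the same directory.
log_file = DailyLogFile("poller.log", "/home/stackoverflow/snmpdata/logs")
log.startLogging(log_file)
log.msg("poller started")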
Upvotes: 2
Reputation: 13251
The Python logging handlers don't let you do that easily. You have two ways of achieving this:
The simplest way would be to set LOG_FILENAME so that it is already logs/poller.log, and if you want to access your poller.log anywhere else, use a symlink :)
Create your own handler starting from TimedRotatingFileHandler, copy/paste doRollover() from /usr/lib/python2.X/logging/handlers.py (the TimedRotatingFileHandler class), and change:
dfn = self.baseFilename + "." + time.strftime(self.suffix, timeTuple)
to
dfn = os.path.join('logs', os.path.basename(self.baseFilename)) + "." + time.strftime(self.suffix, timeTuple)
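A rough Python 3 sketch of the same subclassing idea, though instead of copy/pasting doRollover() it lets the parent handler do the rename and then moves the rotated backups into logs/ afterwards (the class name is made up, and note that backupCount cleanup still only scans the original directory, so old files in logs/ need their own cleanup):
import os
import shutil
from logging.handlers import TimedRotatingFileHandler

class FolderTimedRotatingFileHandler(TimedRotatingFileHandler):
    def doRollover(self):
        # Let the standard handler rename poller.log to poller.log.<date> first
        super().doRollover()
        base_dir, base_name = os.path.split(self.baseFilename)
        target_dir = os.path.join(base_dir, 'logs')
        os.makedirs(target_dir, exist_ok=True)
        # Then move every rotated backup (poller.log.*) into the logs/ folder
        for entry in os.listdir(base_dir):
            if entry.startswith(base_name + "."):
                shutil.move(os.path.join(base_dir, entry),
                            os.path.join(target_dir, entry))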
Upvotes: 5