Reputation: 41
I run a list of tasks once an hour, but I want the log file to be named by the current date.
import logging
import os
from logging.handlers import TimedRotatingFileHandler

def get_mylogger():
    # get logger
    fmt = '%(asctime)-15s %(levelname)-4s %(message)s'
    datefmt = '%Y-%m-%d %H:%M:%S'
    mylogger = logging.getLogger()
    mylogger.setLevel(logging.INFO)
    # log_path = "/opt/spark/logs/pyspark/"
    # raw string so the backslashes are not treated as escape sequences
    log_path = r"H:\upupw\www\spark\logs\pyspark"
    if not os.path.exists(log_path):
        os.makedirs(log_path)
    log_file_name = 'spark.log'
    log_name = os.path.join(log_path, log_file_name)
    # TimedRotatingFileHandler rotates the log file on a timed schedule
    timer = TimedRotatingFileHandler(log_name, when='D')
    formatter = logging.Formatter(fmt, datefmt=datefmt)
    timer.setFormatter(formatter)
    mylogger.addHandler(timer)
    return mylogger
If I create the first log file 'spark.log' at 10:00:00, it won't create a new file until 10:00:00 tomorrow. What I want is for a new file to be created tomorrow at midnight!
Upvotes: 1
Views: 2390
Reputation: 171
According to the logging documentation for TimedRotatingFileHandler, you might want to use the additional parameter atTime in your TimedRotatingFileHandler call, like this:
timer = TimedRotatingFileHandler(log_name, when='D', atTime=datetime.time(0, 0, 0))
As described in the documentation:
If atTime is not None, it must be a datetime.time instance which specifies the time of day when rollover occurs, for the cases where rollover is set to happen “at midnight” or “on a particular weekday”. Note that in these cases, the atTime value is effectively used to compute the initial rollover, and subsequent rollovers would be calculated via the normal interval calculation.
...you need to provide the rollover time as an instance of datetime.time. This is done by passing the desired rollover time as arguments to the datetime.time class: hours as the first, minutes as the second, and seconds as the third argument. The example above sets the rollover time to 00:00:00.
Note: be sure to import datetime at the beginning of your code.
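Putting it together, here is a minimal sketch of a daily handler anchored at midnight. The logger name and the temporary log directory are placeholders for illustration; the suffix line is optional and only controls how rotated files are named:

```python
import datetime
import logging
import os
import tempfile
from logging.handlers import TimedRotatingFileHandler

# Placeholder location for the demo; use your own log_path in practice.
log_dir = tempfile.mkdtemp()
log_file = os.path.join(log_dir, "spark.log")

handler = TimedRotatingFileHandler(
    log_file,
    when='D',                        # rotate on a daily interval...
    atTime=datetime.time(0, 0, 0),   # ...anchored at 00:00:00
)
handler.suffix = "%Y-%m-%d"          # rotated files get a date suffix

logger = logging.getLogger("spark_demo")  # hypothetical logger name
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info("first message of the day")
```

Until the first rollover, messages go to spark.log; at midnight the file is renamed with the date suffix and a fresh spark.log is started.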
Upvotes: 1