Reputation: 554
I have a nohup command that is executed on a Linux server.
nohup python /home/user/code.py 1>>/home/user/python_log 2>&1
The Python run log is written to python_log.
The issue is that each run generates so much output that python_log is currently about 650 MB.
Is there a way to split the log by the date the cron job runs (or even an incrementing number) to resolve the file size issue? For example: python_log_17_08_2021
Upvotes: 0
Views: 549
Reputation: 809
This should be what you asked for.
nohup python /home/user/code.py 1>>"/home/user/python_log_$(date +"%d_%m_%Y")" 2>&1
I usually save dates the other way around though.
nohup python /home/user/code.py 1>>"/home/user/python_log_$(date +"%Y_%m_%d")" 2>&1
The approach above just redirects stdout and stderr to a file. Python has its own logging module, which should be used for more complex scripts and applications. Read more in the Python Logging HOWTO.
Depending on your distribution, there are tools available to handle your log files and rotate them depending on age and/or size.
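On most distributions that tool is logrotate. A minimal sketch of a drop-in config (the path and retention values are examples), e.g. saved under /etc/logrotate.d/:

```
/home/user/python_log {
    daily
    rotate 14
    compress
    missingok
    notifempty
    copytruncate
}
```

copytruncate matters here: the long-running Python process keeps the file handle open, so logrotate copies the file and truncates the original in place instead of renaming it out from under the process.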
If you're working on a server, the sysadmin should have a space limit in place for your user, or you should be using a separate partition for the log files, to ensure you do not potentially crash the system with bloated log files.
Upvotes: 1
Reputation: 7913
You should probably configure the logging properly, using a TimedRotatingFileHandler, and reserve the stderr/stdout redirect for unhandled output.
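A minimal sketch of that handler (file name, retention, and logger name are example values): it rotates the file at midnight and appends a date suffix such as app.log.2021-08-17 to each rotated file.

```python
import logging
from logging.handlers import TimedRotatingFileHandler

# Rotate at midnight, keep 14 days of history.
handler = TimedRotatingFileHandler(
    "app.log", when="midnight", backupCount=14
)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(message)s")
)

logger = logging.getLogger("code")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("run started")
```

With this in place the nohup redirect only needs to capture tracebacks and other output the script itself does not handle.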
Upvotes: 0