Reputation: 5646
I have this scenario: we have a log archiving script that archives logs older than 8 days, and we want to make sure we don't lose any logs before they have been archived.

I have set maxBackupIndex to 10 and maxFileSize to 10MB. If I get enough transactions within 8 days to produce more than 100MB of logs, there is a possibility that I might lose some of them.

How can I avoid this situation? Can I set maxBackupIndex to infinity?
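For reference, the setup described above corresponds to a log4j 1.x RollingFileAppender configured roughly like this (the appender name and file path are illustrative; with 10 backups of 10MB each, at most ~100MB of rotated logs is retained before the oldest file is deleted):

```
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=/var/log/myapp/app.log
log4j.appender.R.MaxFileSize=10MB
log4j.appender.R.MaxBackupIndex=10
```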
Thanks in advance for any help on this issue.
Upvotes: 2
Views: 1026
Reputation: 6428
Set maxBackupIndex and maxFileSize to values high enough to retain your expected workload, but not so high that the logs can exhaust disk space.
Then create an external task (cron or a Windows scheduled task) to remove or archive logs older than 8 days.
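The external task could be sketched as a small shell script run from cron. The directory names here are purely illustrative, and for demonstration the script sets up temp directories with backdated files so it can run anywhere; in production you would point LOG_DIR and ARCHIVE_DIR at your real log and archive locations and drop the simulation lines.

```shell
#!/bin/sh
# Demo setup: temp directories standing in for the real log/archive paths.
LOG_DIR=$(mktemp -d)
ARCHIVE_DIR=$(mktemp -d)

# Simulate one old rotated log (9 days) and one recent one (1 day).
touch -d '9 days ago' "$LOG_DIR/app.log.1"
touch -d '1 day ago'  "$LOG_DIR/app.log.2"

# Move rotated logs older than 8 days into the archive, then compress them.
find "$LOG_DIR" -type f -name 'app.log.*' -mtime +8 \
    -exec mv {} "$ARCHIVE_DIR/" \;
gzip -f "$ARCHIVE_DIR"/app.log.*

ls "$ARCHIVE_DIR"   # app.log.1.gz
ls "$LOG_DIR"       # app.log.2
```

A cron entry such as `0 2 * * * /usr/local/bin/archive_logs.sh` would run it nightly; note that `-mtime +8` matches files whose age, in whole 24-hour periods, exceeds 8.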
Upvotes: 2
Reputation: 533880
You can also have a script compress files older than a day if you have trouble retaining the logs.

maxBackupIndex is an int, so you can set it to 1000000000, or to whatever value would use too much disk space. You can also increase maxFileSize to 100MB or 1GB.
Upvotes: 1