Mercer

Reputation: 9986

Script for archiving log files

I have written a Bash script to archive log files:

#!/bin/bash

# For each "log" directory found.
for folder in $(find . -name log -type d )
do :
    # For each ".log" file older than 30 days in the log directory.
    for file in $(find $folder -name "*.log" -mtime +30)
    do :
            # Archive the ".log" files.
        tar czf archive-log.tar.gz $folder/*.log
    done

    # If an archive exists.
    if [ -e $folder/*.tar.gz ]
         # Move the archive.
         then mv $folder/*.tar.gz $ARCHIVE
    fi

done

The output I get is:

[logs]$ ll
total 8
drwxr-xr-x 2 webadm webgrp 4096 sep 17 14:26 log_weblogic
-rwxr--r-- 1 webadm webgrp  456 sep 17 14:31 scriptArchivesLog

[log_weblogic]$ ll
total 200
-rw-r----- 1 webadm webgrp 98005 mai 16 04:04 test.log
-rw-r----- 1 webadm webgrp 98005 sep 13 15:29 WLS-CONSOLE-DOMAINE-PUB.log


[logs]$ ll
total 32
-rw-r--r-- 1 webadm webgrp 21734 sep 17 14:31 archive-log.tar.gz
drwxr-xr-x 2 webadm webgrp  4096 sep 17 14:26 log_weblogic
-rwxr--r-- 1 webadm webgrp   456 sep 17 14:31 scriptArchivesLog

When I run my script, why does the archive contain all the files? I want only the files that match -mtime +30.

[logs]$ tar tvzf archive-log.tar.gz
-rw-r----- webadm/webgrp 98005 2013-05-16 04:04:00 ./log_weblogic/test.log
-rw-r----- webadm/webgrp 98005 2013-09-13 15:29:03 ./log_weblogic/WLS-CONSOLE-DOMAINE-PUB.log

Upvotes: 1

Views: 24997

Answers (5)

pndc

Reputation: 3795

You have made the critical error of not checking whether a program or library already exists that does what you want. In this case, there is logrotate, which is probably already present on your system, diligently cleaning up the system logfiles in /var/log. As a bonus, it will already be configured to run periodically, so you won't even have to set up a cron job.

There is a tutorial on using logrotate at https://www.linode.com/docs/uptime/logs/use-logrotate-to-manage-log-files
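As a sketch, a logrotate rule for the directory in the question might look like the following. The path, rotation interval, and retention count are assumptions for illustration, not tested configuration:

```
# Hypothetical /etc/logrotate.d/weblogic entry: rotate monthly, keep six
# compressed archives, and tolerate missing or empty files.
/path/to/logs/log_weblogic/*.log {
    monthly
    rotate 6
    compress
    missingok
    notifempty
}
```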

Upvotes: 10

devnull

Reputation: 123458

Replace the following:

for file in $(find $folder -name "*.log" -mtime +30)
do :
        # Archive the ".log" files.
    tar czf archive-log.tar.gz $folder/*.log
done

with

tar czf archive-log.tar.gz $(find "$folder" -name "*.log" -mtime +30)
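Note that the command substitution splits on whitespace, so filenames containing spaces break, and nothing guards against find matching no files. A more robust sketch (assumes GNU find and GNU tar; the temporary folder here is a demo stand-in for the question's $folder) pipes a NUL-delimited list straight to tar:

```shell
#!/bin/sh
# Demo setup: one stale log (with a space in its name) and one fresh log.
folder=$(mktemp -d)
touch "$folder/keep me.log" "$folder/new.log"
touch -d "45 days ago" "$folder/keep me.log"   # only this matches -mtime +30

# NUL-delimited pipe keeps awkward filenames intact; tar reads the
# member list from stdin via --null -T -.
find "$folder" -name "*.log" -mtime +30 -print0 |
    tar czf "$folder/archive-log.tar.gz" --null -T -

tar tzf "$folder/archive-log.tar.gz"
```

Only the stale file ends up in the archive; the fresh one is never handed to tar.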

Upvotes: 2

konsolebox

Reputation: 75478

This is the wrong part:

for file in $(find $folder -name "*.log" -mtime +30) # For each ".log" file older than 30 days.
do :
    tar czf archive-log.tar.gz $folder/*.log # Archive the ".log" files.
done

Even though find selects only the old files, tar czf archive-log.tar.gz $folder/*.log still archives every .log file in the folder; the loop variable is never used.

You can change that to something like this instead:

readarray -t files < <(exec find "$folder" -name "*.log" -mtime +30)  ## read the matches into an array
[[ ${#files[@]} -gt 0 ]] && tar czf archive-log.tar.gz "${files[@]}"  ## create an archive only if a file was found
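If the filenames might contain newlines, a hedged variant of the same idea (requires bash 4.4 or later for mapfile -d ''; the temporary folder is a demo stand-in for $folder) reads NUL-delimited find output instead:

```shell
#!/bin/bash
# Demo setup: two logs, both backdated so they match -mtime +30.
folder=$(mktemp -d)
touch "$folder/a.log" "$folder/b.log"
touch -d "40 days ago" "$folder/a.log" "$folder/b.log"

# NUL-delimited read: safe even for names with newlines or spaces.
mapfile -d '' -t files < <(find "$folder" -name "*.log" -mtime +30 -print0)
((${#files[@]} > 0)) && tar czf "$folder/archive-log.tar.gz" "${files[@]}"

tar tzf "$folder/archive-log.tar.gz"
```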

Upvotes: 0

anubhava

Reputation: 785068

It is because of your tar command:

tar czf archive-log.tar.gz $folder/*.log

which archives all the *.log files regardless of their timestamps.

GNU tar has a switch:

--newer-mtime=date

for filtering by modification time. Note that it selects files newer than the given date, so to archive files older than 30 days you would still need to invert the selection (for example, by keeping the find-based filter).
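A sketch of the switch's behaviour (GNU tar; the date string and file names are demo assumptions): only the file modified after the cutoff ends up in the archive.

```shell
#!/bin/sh
# Demo setup: one fresh log and one log backdated 60 days.
d=$(mktemp -d)
touch "$d/new.log" "$d/old.log"
touch -d "60 days ago" "$d/old.log"

# --newer-mtime keeps only files modified AFTER the given date,
# so old.log is skipped even though it is named explicitly.
tar czf "$d/recent.tar.gz" --newer-mtime="30 days ago" -C "$d" new.log old.log

tar tzf "$d/recent.tar.gz"
```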

Upvotes: 1

Srini V

Reputation: 11355

Try this:

for folder in $(find . -name log -type d ) # For each "log" directory found.
do :
    for file in $(find $folder -name "*.log" -mtime +30) # For each ".log" file older than 30 days.
    do :
        tar czf archive-log.tar.gz $folder/*.log && rm -f $folder/*.log  # Archive the ".log" files and remove the originals.
    done

    if [ -e $folder/*.tar.gz ] # If an archive exists.
         then mv $folder/*.tar.gz $ARCHIVE # Move the archive.
    fi

done
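As written, tar czf archive-log.tar.gz $folder/*.log still sweeps up every log, and rm -f $folder/*.log then deletes fresh files that were archived by accident. A hedged reworking (assumes GNU find and GNU tar; the sandbox layout and the ARCHIVE location here are demo stand-ins) that archives and removes only the files find matched:

```shell
#!/bin/sh
# Demo sandbox: one "log" folder with a stale file and a fresh one,
# plus a destination directory standing in for the question's $ARCHIVE.
top=$(mktemp -d)
ARCHIVE="$top/archive"
mkdir -p "$top/logs/log" "$ARCHIVE"
touch "$top/logs/log/old.log" "$top/logs/log/new.log"
touch -d "31 days ago" "$top/logs/log/old.log"   # make it match -mtime +30

cd "$top/logs"
for folder in $(find . -name log -type d); do
    list="$folder/stale.list"
    # Record only the files find matched, NUL-delimited for safety.
    find "$folder" -name "*.log" -mtime +30 -print0 > "$list"
    if [ -s "$list" ]; then
        # Archive exactly those files; delete originals only on success.
        tar czf "$folder/archive-log.tar.gz" --null -T "$list" &&
            xargs -0 rm -f < "$list"
        mv "$folder/archive-log.tar.gz" "$ARCHIVE"
    fi
    rm -f "$list"
done
```

The fresh log survives untouched, and the archive moved into $ARCHIVE contains only the stale one.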

Upvotes: 0
