I have this bash script running as a crontab job every hour. I want to keep the latest 1,000 images in a folder, deleting the oldest files. I don't want to delete by mtime, because if no new files are being uploaded I want to keep the existing ones; it's fine whether an image is 1 day or 50 days old. I just want that when image 1,001 (the newest) is uploaded, image 1 (the oldest) is deleted, cycling through the folder to keep a constant count of 1,000 images.
This works. However, by the time the hourly job executes there could already be 1,200 files, and running the crontab every minute seems like overkill. Can I make it execute automatically once the folder hits 1,001 images? Basically I want the folder to be self-scanning: keep the newest 1,000 images and delete the oldest ones.
#!/bin/sh
# -r stops xargs from running rm with no arguments when there is
# nothing past the first 1,000 entries
cd /folder/to/execute || exit 1
ls -t | sed -e '1,1000d' | xargs -d '\n' -r rm --
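One way to get the "self-scanning" behavior without polling is an event-driven watcher. The sketch below assumes the inotify-tools package (which provides `inotifywait`) is installed; `WATCH_DIR` and `KEEP` are illustrative defaults, and the prune step is the same newest-first/drop-the-rest logic as the cron script:

```shell
#!/bin/sh
# Sketch: event-driven pruning with inotifywait (from inotify-tools --
# an assumed dependency). Instead of an hourly cron poll, this blocks
# until a file lands in the folder, then prunes immediately.
WATCH_DIR="${WATCH_DIR:-/folder/to/execute}"  # same folder as the cron script
KEEP="${KEEP:-1000}"

prune() {
    # List newest-first, skip the first $KEEP lines, delete the rest.
    # -r stops xargs from running rm when there is nothing to delete.
    ( cd -- "$WATCH_DIR" || exit 1
      ls -t | sed -e "1,${KEEP}d" | xargs -d '\n' -r rm -- )
}

watch_loop() {
    # -m keeps watching forever; prune after every create/move-in event.
    inotifywait -m -e create -e moved_to --format '%f' -- "$WATCH_DIR" |
    while read -r _; do
        prune
    done
}

# Guarded so the prune logic can be exercised on its own.
if [ "${RUN_WATCHER:-0}" = "1" ]; then
    watch_loop
fi
```

Run it under a process supervisor (or from `@reboot` in the crontab) with `RUN_WATCHER=1`; cron is then no longer needed for the pruning itself.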
Upvotes: 2
Views: 1547
Reputation: 1511
keep=10  # set this to how many files you want to keep
discard=$(expr $keep - $(ls | wc -l))
if [ $discard -lt 0 ]; then
    # -b makes ls escape newlines in names; printf %b undoes that after
    # the newline-to-NUL conversion, so rm sees the literal file names
    ls -bt | tail $discard | tr '\n' '\0' | xargs -0 printf "%b\0" | xargs -0 rm --
fi
This first calculates the number of files to delete, then safely passes them to rm. It uses negative numbers intentionally, since a negative count conveniently works as the argument to tail.

The use of tr and xargs -0 ensures that this works even if file names contain spaces. The printf bit handles file names containing newlines.

EDIT: added -- to the rm arguments to be safe in case any of the files to be deleted start with a hyphen.
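The escape round trip can be seen in isolation: with ls -b style output, a newline inside a file name arrives as the two characters backslash and n, and printf %b turns it back into the real byte (the file name below is just an example):

```shell
# A name as ls -b would print it: backslash-n, not a real newline.
escaped='img\n001.jpg'
# printf %b interprets the escape, restoring the literal newline byte;
# od -c makes the restored newline visible.
printf '%b' "$escaped" | od -An -c
```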
Upvotes: 0
Reputation: 5482
Try the following script. It first checks the file count in the current directory and then, if the count is greater than 1000, computes the difference and fetches that many of the oldest files.
#!/bin/bash
count=$(ls -1 | wc -l)
if [ "$count" -gt 1000 ]
then
    difference=$((count - 1000))
    # the oldest $difference files; replace echo with rm -- to delete them
    dirnames=$(ls -t | tail -n "$difference")
    arr=($dirnames)
    for i in "${arr[@]}"
    do
        echo "$i"
    done
fi
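One subtlety worth flagging when computing the difference: in shell, ${count-1000} is parameter expansion ("use $count, or 1000 if count is unset"), not subtraction; actual arithmetic needs $((...)). A quick comparison:

```shell
count=1200
# Parameter expansion with a default: count is set, so this is just its value.
echo "${count-1000}"     # prints 1200
# Arithmetic expansion: actual subtraction.
echo "$((count - 1000))" # prints 200
```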
Upvotes: 0