Reputation: 147
I have a file containing a list of files in different directories, and I want to find the oldest one. It feels like something that should be easy with some shell scripting, but I don't know how to approach it. I'm sure it's easy in Perl and other scripting languages, but I'd really like to know whether I've missed an obvious bash solution.
Example of the contents of the source file:
/home/user2/file1
/home/user14/tmp/file3
/home/user9/documents/file9
Upvotes: 2
Views: 577
Reputation: 45576
#!/bin/sh
while IFS= read -r file; do
    # ${oldest=$file} initializes oldest to the first file read,
    # then -ot ("older than") compares modification times
    [ "$file" -ot "${oldest=$file}" ] && oldest=$file
done < filelist.txt
echo "the oldest file is '$oldest'"
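A minimal sketch of the `-ot` test this script relies on, using illustrative file names and `touch -t` to set known modification times:

```shell
# Demonstrate the -ot ("older than") test, which compares two files by mtime.
touch -t 202001010000 old_file   # mtime: Jan 1 2020
touch -t 202401010000 new_file   # mtime: Jan 1 2024
if [ old_file -ot new_file ]; then
    echo "old_file is older"    # prints "old_file is older"
fi
rm -f old_file new_file
```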
Upvotes: 4
Reputation: 182
Use find to locate the oldest file:
find /home/ -type f -printf '%T+ %p\n' | sort | head -1 | cut -d' ' -f2-
And driven by the source file:
find $(cat /path/to/source/file) -type f -printf '%T+ %p\n' | sort | head -1 | cut -d' ' -f2-
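One caveat worth noting: the unquoted `$(cat …)` substitution undergoes word splitting, so any listed path containing spaces breaks. A sketch of a whitespace-safe variant, assuming GNU xargs and one path per line (the list path is the same placeholder as above):

```shell
# Whitespace-safe variant: xargs -I reads one line per replacement,
# so spaces in paths survive; find is run once per listed path.
xargs -I{} find {} -printf '%T+ %p\n' < /path/to/source/file \
    | sort | head -n 1 | cut -d' ' -f2-
```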
Upvotes: 1
Reputation: 11786
You can use stat to find the last modification time of each file, looping over your source file:
# Start with a far-future epoch (5555555555 is in the year 2146),
# so the first file read always replaces it.
oldest=5555555555
while IFS= read -r file; do
    modtime=$(stat -c %Y "$file")   # mtime as seconds since the epoch
    [[ $modtime -lt $oldest ]] && oldest=$modtime && oldestf="$file"
done < sourcefile.txt
echo "Oldest file: $oldestf"
This uses the %Y format of stat, which is the last modification time as seconds since the epoch. You could also use %X for last access time, or %Z for last status change time.
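A minimal sketch of the epoch-seconds comparison this loop performs, assuming GNU stat and illustrative file names:

```shell
# Compare two files by mtime expressed as seconds since the epoch.
touch -t 202001010000 demo_old   # mtime set to Jan 1 2020
touch demo_new                   # mtime is "now"
old=$(stat -c %Y demo_old)       # %Y: mtime, epoch seconds
new=$(stat -c %Y demo_new)
[ "$old" -lt "$new" ] && echo "demo_old is older"
rm -f demo_old demo_new
```

Epoch seconds are what make the numeric `-lt` comparison in the loop valid.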
Upvotes: 2