Reputation: 55
There is a folder where users place files at random times. These files are all read-only. Every hour we need to check the folder and, for any file placed or modified in the last hour, copy it (not move or change it) to another directory. The files normally follow the pattern Bank_YYYYMMDD.csv, where YYYYMMDD varies, e.g. Bank_20200927.csv.
/outbound is where the files are placed and /daily is where they need to be copied. I have tried the command below and it works fine.
find /outbound -type f -mmin -60 -name "Bank*" | xargs -ILIST cp LIST /daily/
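As an aside, the same copy can be expressed without xargs by letting find call cp directly; this is only a sketch of an equivalent form and assumes GNU cp for the -t option:

# Copy every Bank_* file modified in the last 60 minutes to /daily,
# passing the paths straight from find to cp (no word-splitting step).
find /outbound -type f -mmin -60 -name "Bank*" -exec cp -t /daily/ {} +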
The problem part: once files are placed in /daily, a separate business process picks them up, processes them, and moves them to /processed. There is an additional requirement: before copying, check the /processed folder, and if the file we are about to place already exists there, do not copy it to /daily.
Example: 3 files were placed or modified in the last hour in /outbound, say Bank_20200927.csv, Bank_20200926.csv, and Bank_20200925.csv. Of these, an earlier run already picked up Bank_20200926.csv and Bank_20200925.csv and placed them in /daily. The business process then ran, so /processed already contains Bank_20200926.csv and Bank_20200925.csv.
So this time, even though Bank_20200926.csv and Bank_20200925.csv still satisfy the modified-in-the-last-hour check, the only file that should be copied to /daily is Bank_20200927.csv.
I am stuck on the part that checks whether /processed already has a file and copies it to /daily only if it does not.
Kindly help.
Thanks in advance.
Upvotes: 0
Views: 193
Reputation: 912
You could use something like this:
for i in /outbound/*.csv; do [[ $(($(date +%s) - $(stat -c %Y "$i"))) -lt 3600 && ! -e /processed/"${i##*/}" ]] && cp "$i" /daily/; done
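If you would rather stay close to your original find command, the same /processed check can be added in a small loop over find's output. This is a sketch under the assumption that the file names contain no newlines, which holds for the Bank_YYYYMMDD.csv pattern:

# Re-use the original find, then skip any file whose name already exists in /processed.
find /outbound -type f -mmin -60 -name "Bank*" | while read -r f; do
    [[ -e /processed/"${f##*/}" ]] || cp "$f" /daily/
done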
Upvotes: 1