Reputation:
I have a script which runs hourly, and I am storing failure data as follows:
2017/10/09/00/RetryFailure.txt
2017/10/09/01/RetryFailure.txt
2017/10/09/02/RetryFailure.txt ...
where 10 is the month, 09 is the day, and 00, 01, 02 are hours. At the end of the day I want to concatenate all 24 RetryFailure.txt files into one file, say RetryFailure10.txt.
Can anyone tell me the command to do so?
Upvotes: 0
Views: 66
Reputation: 92854
Short find + cat approach:
find . -type f -name RetryFailure.txt -exec cat {} + > RetryFailure_merged.txt
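Note that this merges every RetryFailure.txt under the current directory. If the tree spans several days and you only want one day's 24 files, a sketch (assuming the directory layout from the question) is to point find at that day's directory:

```shell
# Merge only the files under 2017/10/09 (layout assumed from the question)
find 2017/10/09 -type f -name RetryFailure.txt -exec cat {} + > RetryFailure_merged.txt
```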
Upvotes: 0
Reputation: 785156
You can use this find command to aggregate all the files sharing the same month (the third path component, matching the RetryFailure10.txt naming in the question):
find . -name 'RetryFailure.txt' -exec bash -c \
'IFS=/ read -ra arr <<< "$1"; cat "$1" >> "RetryFailure${arr[2]}.txt"' - {} \;
For better performance use a loop with process substitution:
while IFS= read -rd '' file; do
IFS=/ read -ra arr <<< "$file"
cat "$file" >> "RetryFailure${arr[2]}.txt"
done < <(find . -name 'RetryFailure.txt' -print0)
- with find we locate each RetryFailure.txt file
- with IFS=/ read -ra we split each path on / and populate a shell array
- with the cat ... >> command we redirect each file into a new file named using ${arr[2]}
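To illustrate the split on a sample path (a standalone sketch; the path is hypothetical):

```shell
# find prints paths like ./2017/10/09/00/RetryFailure.txt
# After splitting on "/": arr[0]=".", arr[1]="2017", arr[2]="10" (the month)
IFS=/ read -ra arr <<< "./2017/10/09/00/RetryFailure.txt"
echo "${arr[2]}"
```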
Upvotes: 1
Reputation: 19315
cat 2017/10/*/RetryFailure.txt > concat_file
or, more restrictively (this requires all 24 hour directories to exist, since the brace expansion generates every path whether or not it is present, whereas the * glob only matches existing directories):
cat 2017/10/{00..23}/RetryFailure.txt > concat_file
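{00..23} is bash brace expansion with zero-padding; it expands before any globbing, so the shell hands cat all 24 explicit paths. A quick way to see what gets generated:

```shell
# Brace expansion produces every hour directory, zero-padded: 00 01 ... 23
echo 2017/10/{00..23}/RetryFailure.txt
```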
Upvotes: 0