Vinay

Reputation: 470

bash continue execution on command failure

#! /bin/bash

while :
do
    filenames=$(ls -rt *.log | tail -n 2)
    echo $filenames
    cat $filenames > jive_log.txt
    sleep 0.1
done

I am trying to read the latest two files from a directory and join them using bash. However, when no files with a .log extension are present in the current directory, the command ls -rt *.log fails with the error "ls: cannot access *.log: No such file or directory". After the error it looks like the while loop does not execute. What do I do so that the infinite loop continues even if one command fails?
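
Something along these lines is what I imagine, though I am not sure it is the right way; the 2>/dev/null and the emptiness check are my own guesses:

#! /bin/bash

while :
do
    filenames=$(ls -rt -- *.log 2>/dev/null | tail -n 2)   # hide the "cannot access" error
    if [ -n "$filenames" ]; then                           # skip this round when nothing matched
        cat $filenames > jive_log.txt
    fi
    sleep 0.1
done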

Upvotes: 0

Views: 608

Answers (2)

ceving

Reputation: 23871

If you want to sort the output of find, you have to prepend a sort key, which can be stripped off again afterwards.

find . -name \*.log -printf '%T+\t%p\n' |   # prepend the modification time as the sort key
sort -r |                                   # newest first
head -2 |                                   # keep the two newest files
cut -f 2-                                   # strip the sort key again, leaving only the paths

Using head instead of tail is a bit cheaper.
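
For example, dropped into the original loop it could look something like this (an untested sketch; note that find simply prints nothing when no .log file exists, so the loop keeps running, and that the two newest files come out newest first here, unlike with ls -rt):

#! /bin/bash

while :
do
    filenames=$(find . -name \*.log -printf '%T+\t%p\n' | sort -r | head -2 | cut -f 2-)
    if [ -n "$filenames" ]; then     # only write when at least one .log was found
        cat $filenames > jive_log.txt
    fi
    sleep 0.1
done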

Upvotes: 1

konsolebox

Reputation: 75588

I'm not sure what you mean but perhaps:

for (( ;; )); do
    while IFS= read -r FILE; do
        cat "$FILE"
    done < <(exec ls -rt1 *.log | tail -n 2) >> jive_log.txt
    sleep 1
done

Note the ls option -1, which prints files one per line.

Anyhow, you can join the last two files into jive_log.txt with:

while IFS= read -r FILE; do
    cat "$FILE"
done < <(exec ls -rt1 *.log | tail -n 2) >> jive_log.txt

Another way is to save the filenames to an array (e.g. with readarray) and then pass the last two elements to cat.

readarray -t FILES < <(exec ls -rt1 *.log)
cat "${FILES[@]:(-2)}" > jive_log.txt  ## Or perhaps you mean to append it? (>>)

Upvotes: 1
