hash

Reputation: 103

Count the number of lines in several text files and store every count using a for loop

I want to count the number of lines in many text files and then store the counts in a variable so I can find the lowest number. I am trying to do this in a for loop, but it only stores the result from the last text file in the loop.

 for txt in home/data/*.txt
 do
       count_txt=$(wc -l < "$txt")   # overwritten on every iteration
 done
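For illustration, one way to keep all of the counts rather than overwriting a single variable (a minimal bash sketch, not the original attempt; `home/data` is the directory from the question) is to append each count to an array and then scan the array for the minimum:

```shell
#!/bin/bash
# Sketch: collect each file's line count into an array, then take the minimum.
shopt -s nullglob            # make the glob expand to nothing if no .txt files exist
counts=()
for txt in home/data/*.txt; do
    counts+=("$(wc -l < "$txt")")   # wc -l < file prints the count without the filename
done
min=${counts[0]}
for c in "${counts[@]}"; do
    (( c < min )) && min=$c
done
echo "lowest line count: $min"
```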

Thanks

Upvotes: 1

Views: 108

Answers (4)

bachN

Reputation: 612

Update: as Ed Morton commented, awk is the right tool for this kind of problem. The approach below is not a final implementation and fails for some filenames (such as filenames with spaces); awk is more performant and reliable.

If you want to use a for loop, you can do something like this :

#!/bin/bash
MAX="0"
MIN="INIT"
for F in home/data/*.txt
do
    NBLINE=$(cat $F | wc -l)
    if [[ "$NBLINE" -gt "$MAX"  ]] ; then
            MAX="$NBLINE"
            BIG_FILE="$F"
    fi
    if [[ "$MIN" == "INIT" ]] ; then
            MIN="$NBLINE"
            SMA_FILE="$F"
    fi
    if [[ "$NBLINE" -lt "$MIN" ]] ; then
            MIN="$NBLINE"
            SMA_FILE="$F"
    fi
done
echo "File = $BIG_FILE  -- Lines = $MAX"
echo "File = $SMA_FILE  -- Lines = $MIN"
exit

Upvotes: 0

Ed Morton

Reputation: 204229

You just need something like this (using GNU awk for ENDFILE):

awk 'ENDFILE{if (min == "" || FNR < min) min = FNR} END{print min}' home/data/*.txt
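`ENDFILE` is a gawk extension. With a POSIX awk the same idea can be sketched (a hypothetical portable variant, not part of the original answer) by using `FNR == 1` to detect file boundaries, since `FNR` resets at the start of each new file:

```shell
# Portable sketch: when FNR resets to 1 on a new file, the last saved
# value of FNR is the line count of the file just finished.
# (Empty files are not seen by this idiom.)
awk '
    FNR == 1 && NR > 1 { if (min == "" || n < min) min = n }  # close out the previous file
    { n = FNR }                                               # track the current line count
    END { if (min == "" || n < min) min = n; print min }      # close out the last file
' home/data/*.txt
```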

Upvotes: 1

konsolebox

Reputation: 75568

shopt -s nullglob
FILES=(home/data/*.txt) LOWEST_COUNT='N/A' FILE=''
[[ ${#FILES[@]} -gt 0 ]] && read -r LOWEST_COUNT FILE < <(exec wc -l "${FILES[@]}" | sort -n)
echo "$LOWEST_COUNT | $FILE"

Upvotes: 1

Kent

Reputation: 195209

give this one-liner a try:

wc -l /path/*.txt|awk 'NR==1{m=$1}{m=($1*1)<m?($1*1):m}END{print m}'

Upvotes: 1
