Reputation: 4353
If I have a folder of text files, how can I get the average words per file using Bash commands?
I know that I can use wc -w
to get the word count per file, but I'm unsure how to get the total number of words across all files and then divide that total by the number of text files.
Upvotes: 1
Views: 1063
Reputation: 212684
Huang's solution is very good, but it will emit errors for any subdirectories. And division is a bit of a pain, since all arithmetic in the shell is integer-only. Here's a script that does what you want:
#!/bin/sh
for file in *; do
    test -f "$file" || continue
    c=$( wc -w "$file" | awk '{print $1}' )
    : $(( total += $c ))
    : $(( count += 1 ))
done
echo $total $count 10k / p | dc | sed 's/0*$//'
But bunting's awk solution is the way to go.
Upvotes: 0
Reputation: 22850
This recursively traverses the filesystem and counts all words and files. In the end it divides the total number of words by the number of files:
find . -type f -exec wc -w {} \; | awk '{numfiles=numfiles+1;total += $1} END{print total/numfiles}'
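As a quick sanity check, the pipeline can be run against a scratch directory with known word counts (the mktemp directory and file names below are just for illustration):

```shell
# Scratch directory with two files of known word counts.
dir=$(mktemp -d)
printf 'one two three\n' > "$dir/a.txt"   # 3 words
printf 'four five\n'     > "$dir/b.txt"   # 2 words

# Average words per file: (3 + 2) / 2 = 2.5
cd "$dir"
find . -type f -exec wc -w {} \; |
    awk '{numfiles=numfiles+1; total += $1} END{print total/numfiles}'
# prints 2.5
```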
Upvotes: 5
Reputation:
Just a piece of advice: you might use a loop and variable assignment.
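The answer above gives no code, but the loop-and-variables idea might be sketched like this (the mktemp setup and file names are only for demonstration):

```shell
# Demo setup with known word counts, just for illustration.
dir=$(mktemp -d)
printf 'a b c d\n' > "$dir/one.txt"   # 4 words
printf 'e f\n'     > "$dir/two.txt"   # 2 words
cd "$dir"

total=0
count=0
for f in *.txt; do
    words=$(wc -w < "$f")        # word count for this file
    total=$((total + words))
    count=$((count + 1))
done
# Shell arithmetic truncates, so hand off to awk for a fractional average.
awk -v t="$total" -v c="$count" 'BEGIN { print t / c }'
# prints 3
```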
Upvotes: 0
Reputation: 1865
You can get the total word count with:
cat *.txt | wc -w
and the number of files with:
ls *.txt | wc -l
Then you can divide them.
Upvotes: 1