John

Reputation: 11

Bash script to search csv file column and count how many times a value shows up

I am really new to bash and I was trying to search a CSV file column for a value and then add a counter. I found the command below online, but it prints the whole column; I am trying to count how many times an R shows up, not print everything.

awk -F "\"*,\"*" '{print $2}' "$file"

The CSV file looks like this:

12345,R,N,N,Y,N,N,N,Bob Builder

I am looking for R in column 2. Can anybody point me in the right direction?

Upvotes: 1

Views: 3717

Answers (3)

clt60

Reputation: 63962

Just for fun - perl only - this counts everything:

perl -F, -anle 'map{$cnt{$_}{$F[$_]}++}0..$#F;END{print $cnt{1}{R}}'
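Briefly: -a autosplits each comma-separated line into @F, the map builds a tally of every value seen in every column, and the END block prints the tally for R in the second column (index 1). Assuming the data is in file.csv, as in the other answers, you would run it like this:

perl -F, -anle 'map{$cnt{$_}{$F[$_]}++}0..$#F;END{print $cnt{1}{R}}' file.csv

and it prints 1 for the single sample line from the question.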

Upvotes: 1

Lawrence Woodman

Reputation: 1444

The following should do what you want (where file.csv is your csv file):

Case sensitive version:

cut -f 2 -d , file.csv | grep -c R

Case insensitive version:

cut -f 2 -d , file.csv | grep -ic R

Explanation

cut -f 2 -d , file.csv This takes each line of file.csv and extracts the specified fields. The -f 2 option means extract field 2 and the -d , means use a ',' as the field delimiter. The output of this is then piped to grep.
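For example, with the sample line from the question saved in file.csv, the cut stage on its own prints just the second field:

cut -f 2 -d , file.csv

which outputs a single R.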

grep -c R This looks for lines containing 'R'. Since it is passed the contents of the previous cut command, it is looking for an 'R' in field two. The -c option means count the number of matching lines.
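As a quick sanity check, here is the full pipeline against a two-line test file (the second row is made up purely for illustration and has N in field two, so only the first line should be counted):

printf '12345,R,N,N,Y,N,N,N,Bob Builder\n12346,N,N,N,Y,N,N,N,Wendy Welder\n' > file.csv
cut -f 2 -d , file.csv | grep -c R

This prints 1.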

Upvotes: 5

anubhava

Reputation: 785711

Using awk only:

awk -F "\"*,\"*" '{if ($2 == "R") cnt++} END{print cnt+0}' file
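A quick check, first writing the sample line from the question to a file named file: the "\"*,\"*" separator matches the one used in the question (so it works with or without quoted fields), and the +0 ensures a 0 is printed when nothing matches.

echo '12345,R,N,N,Y,N,N,N,Bob Builder' > file
awk -F "\"*,\"*" '{if ($2 == "R") cnt++} END{print cnt+0}' file

This prints 1.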

Upvotes: 3
