AlwaysLearning

Reputation: 41

Read command output into an array and parse it into .csv

I essentially execute a command once per file and iterate over the results, reading the output into an array. I want to do this only one time: 1. for efficiency, and 2. because the directory structure is constantly changing, with new files being added hourly.

#Get file list
file_list=(`ls -lrt *.bin | awk '{print $9}'`)

#Get file output
output=$(for i in "${file_list[@]}"; do script.bash $i; done)

Now, the way $output is written it is a single string rather than an array, so all of the data effectively resides in one element, ${output[0]}. To extract and read it line by line we can simply use read, which seems to work great.

# Read $output line by line to search for specific keywords and here 
# is where my problem lies
while read -r line
do
        var1=$(echo "${line}" | grep keyword1)
        var2=$(echo "${line}" | grep keyword2)
        echo "$var1,$var2"
done <<< "$output"

Unfortunately, the above is not working how I want it: the terminal prints blank lines because, on most lines, var1 and var2 have no match. I'm really just trying to search each line for a specific keyword, parse it, store the result in a variable, and finally print it in comma-delimited format.

Desired Output:
Line1: $var1,$var2
Line2: $var1,$var2

Output for a single .bin file. These are the values I'm grepping for on each line.

UD  :   JJ533
ID :   117
Ver :   8973
Time:   15545
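As a concrete illustration of the goal, here is a minimal runnable sketch for one file, with UD and ID standing in for keyword1 and keyword2 (the sample values are copied from the output above):

```shell
#!/bin/bash
# Sample input, copied from the single-file output above
printf 'UD  :   JJ533\nID :   117\nVer :   8973\nTime:   15545\n' > f1.bin

# Search the whole file once per keyword instead of grepping every line,
# so non-matching lines never produce an empty "," pair.
# -F' *: *' splits on the colon and swallows the surrounding spaces.
var1=$(awk -F' *: *' '$1 == "UD" {print $2}' f1.bin)
var2=$(awk -F' *: *' '$1 == "ID" {print $2}' f1.bin)
echo "$var1,$var2"   # JJ533,117
```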

Upvotes: 0

Views: 75

Answers (2)

vintnes

Reputation: 2030

This is much cleaner if you stick to Awk. Replace /.*/ with any /regex/ you want to match, otherwise it prints all lines in each file.

awk -F ' *: *' '
    FNR==1{print "\n---", "file", ++f ":", FILENAME, "---"}
    /.*/{print "Line" FNR ":", $1 "," $2}
' *.bin

Outputs:


--- file 1: f1.bin ---
Line1: UD,JJ533
Line2: ID,117
Line3: Ver,8973
Line4: Time,15545

--- file 2: f2.bin ---
Line1: UD,ZZ533
Line2: ID,118
Line3: Ver,9324
Line4: Time,15548

etc.
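For example, narrowing the match to just the UD and Time lines might look like this (f1.bin holds the question's sample data):

```shell
# Sample file, contents taken from the question's single-file output
printf 'UD  :   JJ533\nID :   117\nVer :   8973\nTime:   15545\n' > f1.bin

# Only lines starting with UD or Time are printed; FNR numbers lines per file
awk -F ' *: *' '/^(UD|Time)/{print "Line" FNR ":", $1 "," $2}' f1.bin
```

This prints `Line1: UD,JJ533` and `Line4: Time,15545`, skipping the other lines entirely.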

Upvotes: 0

lw0v0wl

Reputation: 674

First of all, I know this is not the exact solution you're looking for, and the keyword lookup is also missing. I'm still not sure exactly what you'd like to achieve; that may be down to my English.

I only hope this code helps you achieve your goal.

# awk is not required here if you drop the long listing (-l)
file_list=$(ls -rt *.bin)

let "x=1"
for filename in ${file_list[@]}; do

        echo '--- file #'"${x}"' ---'
        let "y=1"

        # awk outputs "col1 col2", so read can split them into separate variables.
        while read col1 col2;do

                # col1 contain first column (ID UD Ver Time)
                # col2 contain the value
                echo "Line${y}: ${col1},${col2}"
                let y++

        # cat is used instead of script.bash, since only its output was provided.
        # awk strips every space and replaces the : separator on each line.
        done <<< "$(cat "${filename}" | awk -F ":" '{gsub(/ /,"");print $1 " " $2}')"

        let x++
done
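If ordering by modification time is not required, a plain glob avoids parsing ls output entirely and copes with spaces in filenames. A sketch along the same lines, again with cat standing in for script.bash:

```shell
#!/bin/bash
cd "$(mktemp -d)" || exit 1        # work in an empty scratch directory
# Sample files, contents from the question's f1.bin / f2.bin
printf 'UD  :   JJ533\nID :   117\n' > f1.bin
printf 'UD  :   ZZ533\nID :   118\n' > f2.bin

x=1
for filename in *.bin; do          # glob instead of ls: safe with spaces
    [ -e "$filename" ] || continue # skip the literal *.bin when nothing matches
    echo "--- file #${x} ---"
    # strip spaces, split on :, print "LineN: key,value"
    awk -F ':' '{gsub(/ /, ""); print "Line" FNR ": " $1 "," $2}' "$filename"
    let x++
done
```

Note the glob expands alphabetically, so this visits f1.bin before f2.bin rather than oldest-first.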

Files:

f1.bin <-- newer

UD  :   JJ533
ID :   117
Ver :   8973
Time:   15545

f2.bin <-- older

UD  :   ZZ533
ID :   118
Ver :   9324
Time:   15548

Output:

--- file #1 ---
Line1: UD,ZZ533
Line2: ID,118
Line3: Ver,9324
Line4: Time,15548
--- file #2 ---
Line1: UD,JJ533
Line2: ID,117
Line3: Ver,8973
Line4: Time,15545

Upvotes: 1
