Lyndz

Reputation: 423

Read multiple files using bash script/shell script

I am new to shell/bash scripting. I want to extract data from multiple NetCDF files using a bash or shell script. Each file contains a time series of temperature values, for example:

FileA.nc contains 20 20 21 22 23 24
FileB.nc contains 23 24 25 26 27 24
FileC.nc contains 21 20 19 18 22 23

I want to extract the values from each file and merge the results of the three files. The output, saved as a CSV file, should look like this:

A 20 20 21 22 23 24
B 23 24 25 26 27 24
C 21 20 19 18 22 23

What is the easiest way to do this? Many thanks in advance for the help.

Upvotes: 0

Views: 3256

Answers (3)

Chem-man17

Reputation: 1770

You can proceed as follows:

for i in {A..C}
do
    echo -n "$i " >> master_file      # write the file letter, no newline
    cat File"$i".nc >> master_file    # append that file's values on the same line
done

You'll now have a single file with all the individual files appended. To turn the spaces into commas (if you want a .csv format):

sed -i 's/ /,/g' master_file
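
If you prefer to build the CSV in one pass, here is a minimal sketch, assuming each .nc file really is a single line of space-separated values as in your example (the output name master_file.csv is just a placeholder):

for i in {A..C}
do
    printf '%s,' "$i"                 # row label followed by a comma
    tr ' ' ',' < "File$i.nc"          # convert the spaces in that file to commas
done > master_file.csv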

Upvotes: 3

James Brown

Reputation: 37464

In AWK:

$ awk '{gsub(/^File|\.nc$/,"",FILENAME); print FILENAME,$0}' File*.nc
A 20 20 21 22 23 24
B 23 24 25 26 27 24
C 21 20 19 18 22 23
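
Since the goal is a CSV file, a small variation of the same idea (a sketch, assuming space-separated values; merged.csv is just an example output name) sets the output separator to a comma:

$ awk -v OFS=',' '{gsub(/^File|\.nc$/,"",FILENAME); $1=$1; print FILENAME,$0}' File*.nc > merged.csv

The $1=$1 forces awk to rebuild the record with the comma separator before printing.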

Upvotes: 1

sjsam

Reputation: 21965

awk is your friend:

$ arry=( File{A..C}.nc ) # store all the filenames in an array
$ # Then feed all the files to awk like below
$ awk '{printf "%s %s\n",gensub(/File(.)\.nc/,"\\1","1",FILENAME),$0}' "${arry[@]}" >newfile
$ cat newfile 
A 20 20 21 22 23 24
B 23 24 25 26 27 24
C 21 20 19 18 22 23

Note

This requires GNU awk (gawk) for gensub(), which I suppose you already have.
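
If gawk isn't available, a portable sketch with plain awk's substr() does the same job, assuming the FileX.nc naming from the question (the letter is the fifth character of the filename):

$ awk '{printf "%s %s\n", substr(FILENAME,5,1), $0}' File{A..C}.nc >newfile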

Upvotes: 2
