motam79

Reputation: 3824

Running a shell script on several files as inputs

I have a shell command with the following format:

my_cmd -I file1.inp -O file1.out

where some processing is done on file1.inp and the results are stored in file1.out.

In my main directory, I have many files matching *.inp, and I would like to run this command on all of them, storing the results in the corresponding *.out files. Can I achieve this using only a shell script?

Upvotes: 2

Views: 81

Answers (3)

chepner

Reputation: 530940

Using GNU parallel

parallel my_cmd -I {} -O {.}.out ::: *.inp

By default, this runs the jobs in parallel, one job per core. {} is the argument unchanged, {.} is the same argument with its extension removed. The arguments are taken from the words that follow :::.
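
If you want to see the exact commands parallel would generate before running anything, its --dry-run option prints them instead of executing them (a quick sketch, assuming the directory contains file1.inp and file2.inp):

parallel --dry-run my_cmd -I {} -O {.}.out ::: *.inp
# prints:
# my_cmd -I file1.inp -O file1.out
# my_cmd -I file2.inp -O file2.out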

Upvotes: 4

hek2mgl

Reputation: 157947

You can use a simple loop:

for file in *.inp ; do
    my_cmd -I "${file}" -O "${file%%.inp}.out"
done

${file%%.inp} is a so-called parameter expansion; it removes the .inp suffix from the input filename.
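
You can check the expansion on its own before wiring it into the loop (a minimal sketch using the filename from the question):

file=file1.inp
echo "${file%%.inp}.out"   # prints: file1.out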


One more thing (thanks, Jean-François Fabre): if the folder does not contain any .inp files, the loop above would run once with $file set to the literal string *.inp. To avoid that, set the nullglob option:

shopt -s nullglob # set the nullglob option
for file in *.inp ; do
    my_cmd -I "${file}" -O "${file%%.inp}.out"
done
shopt -u nullglob # unset the nullglob option

Upvotes: 5

Ipor Sircer

Reputation: 3141

ls *.inp | sed 's/\.inp$//' | xargs -I % my_cmd -I %.inp -O %.out

Upvotes: -3
