Prophet60091

Reputation: 609

How to pass variable (list of folders) to for-loop?

I'm new-ish to Linux and a beginner to programming, but I've been able to patch together a few lines of code for passing a list of files to a for-loop, and for doing the same with folders.

# Run search.sh from base folder; 
# Only interested in folders under base folder (e.g., baseFolder/FolderA)
# list all folders in base directory that match search parameters;
# cut out just the folder name; feed to variable DIR

DIR=$(find . -name *MTL.txt | cut -d '/' -f2)

# echo $DIR = "FolderA FolderB FolderC"
# place that information in a for-loop

for i in $DIR; do

  cd $DIR # step into folder

  # find specific file in folder for processing
  FILE=$(find -name *MTL | cut -d '/' -f2)

  # copy in a static file from another folder;
  # rename file based on file name found in previous step
  cp /baseFolder/staticfile.txt $FILE.new
  do more stuff

  cd .. # step out of directory

done

The code completes fine for the first directory, but fails to move into subsequent directories. I'm guessing that one of my (many) problems is that I just can't pass a list of folder names to $DIR like I am. This should be pretty simple, but my foo is weak.

Please sensei, show me the way.

EDIT:

Changing "cd $DIR" to "cd $i" had the desired effect. Code now loops through all directories and performs operations correctly within each. -Thx to core1024 for flagging the above.
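For reference, the fixed loop can be sketched end to end. This sandbox builds a throwaway folder layout (the folder and file names are invented for the example) and runs the loop with `cd "$i"` in place of `cd $DIR`:

```shell
#!/bin/bash
# Sketch of the corrected loop: iterate over the folder names in $DIR
# and step into each one via $i (the loop variable), not $DIR.
# All folder/file names below are hypothetical.
set -e
base=$(mktemp -d)
cd "$base"
mkdir -p FolderA FolderB
touch FolderA/a_MTL.txt FolderB/b_MTL.txt
touch staticfile.txt

DIR=$(find . -name '*MTL.txt' | cut -d '/' -f2)

for i in $DIR; do
  cd "$i"                                # step into folder (the fix: $i, not $DIR)
  FILE=$(find . -name '*MTL*' | cut -d '/' -f2)
  cp ../staticfile.txt "$FILE.new"       # copy in static file, renamed to match
  cd ..                                  # step out of folder
done
```

After the loop, each folder holds a copy of the static file named after the MTL file found inside it (e.g. `FolderA/a_MTL.txt.new`).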

Upvotes: 4

Views: 4835

Answers (5)

pizza

Reputation: 7630

Your original code will break if a directory name has embedded spaces, and if multiple files in a directory satisfy the find condition, you will repeat the action for that directory. An alternative:

IFS='
'
declare -a DIR
declare -a FILE
DIR=($(find . -name '*MTL.txt' | cut -f2 -d '/' | uniq))
for x in "${DIR[@]}"; do
  if [ -d "$x" ]; then
    pushd "$x"
    FILE=($(ls *MTL))
    for y in "${FILE[@]}"; do
      cp /baseFolder/staticfile.txt "$y"
      echo do more stuff
    done
    popd
  fi
done

Upvotes: 0

core1024

Reputation: 1882

I am not an expert, but I think that cd $DIR # step into folder should be cd $i # step into folder

Upvotes: 1

chepner

Reputation: 531055

The following is a little simpler, since it doesn't need to parse find output:

for subdir in *; do
  if [[ ! -d $subdir ]]; then
    continue
  fi

  cd $subdir
  for f in *MTL.txt; do
    cp /baseFolder/staticFile.txt $f.new
    # And whatever else
  done

  cd ..
done

Note that I only descend into each subdirectory once, where your original code would do it multiple times; I point this out in case repeating the visit was your intention. Also, since your cut command prevented you from going more than one directory deep, I assumed staying at that depth was your intention.

Depending on the other stuff you do, you may be able to avoid changing directories altogether. For example:

for f in $subdir/*MTL.txt; do
  cp /baseFolder/staticFile.txt $f.new
  # More stuff
done

Upvotes: 0

Frank Jackson

Reputation: 39

There is no need for a for loop, though. The find command will already loop through all the items it finds.

There is a -exec option on the find command that passes every item "found" to a specific command.

e.g.

find . -name '*MTL.txt' -exec cp {} /basefolder \;

This will copy all files found to /basefolder.
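A minimal sketch of that -exec form, run in a throwaway sandbox (the paths here are made up; quoting the pattern keeps the shell from expanding the glob before find sees it):

```shell
# Demonstrate find -exec: copy every matching file into one target folder.
tmp=$(mktemp -d)
mkdir -p "$tmp/FolderA" "$tmp/dest"
touch "$tmp/FolderA/scene_MTL.txt"

# For each file matching '*MTL.txt', run: cp <file> "$tmp/dest"
find "$tmp" -name '*MTL.txt' -exec cp {} "$tmp/dest" \;
```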

If you feel strongly about using a for loop, you can also use the output of a command as the list for the loop.

e.g.

for MYVAR in `ls *.txt* `
do
    echo $MYVAR
done

Using cut is generally not a good idea unless you are formatting output or have a command that does not send its output to stdout.

Upvotes: 0

Skippy Fastol

Reputation: 1775

cd .. # step out of directory

just steps up ONE LEVEL.

You need, before the loop, to store your "base directory" in a variable:

BASEDIR=`pwd`

Then, you'll perform

cd $BASEDIR # step out of directory

instead of your current

cd ..
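A sketch of that pattern, in a throwaway sandbox with invented folder names; however deep the work inside each folder wanders, the absolute cd always lands back at the base:

```shell
# Remember the starting directory once, then return to it absolutely
# after each iteration instead of relying on a relative 'cd ..'.
BASEDIR=$(mktemp -d)
cd "$BASEDIR"
mkdir -p FolderA/deeper FolderB

for d in FolderA FolderB; do
  cd "$d" || continue
  cd deeper 2>/dev/null || true   # work might wander deeper into the tree
  cd "$BASEDIR"                   # absolute return: always back at the base
done
```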

Upvotes: 4
