Shnick

Reputation: 1391

Loop through files in directory specified using argument

I'm trying to loop through files in a directory, where the directory is passed through as an argument. I currently have the following script saved in test.sh:

#!/bin/bash
for filename in "$1"/*; do
    echo "File:"
    echo $filename
done

And I am running the above using:

sh test.sh path/to/loop/over

However, the above doesn't output the files at the directory path/to/loop/over, it instead outputs:

File:
path/to/loop/over/*

I'm guessing it's interpreting path/to/loop/over/* as a string and not a directory. My expected output is the following:

File:
foo.txt
File:
bar.txt

Where foo.txt and bar.txt are files in the path/to/loop/over/ directory. I found this answer, which suggested adding /* after the $1; however, this doesn't seem to help (and neither do these suggestions).

Upvotes: 2

Views: 7320

Answers (2)

F. Hauri - Give Up GitHub

Reputation: 70792

Iterate over the contents of a directory

Compatible answer (not only bash)

As this question is not limited to bash, there is a POSIX-compatible way:

#!/bin/sh

for file in "$1"/* ;do
    [ -f "$file" ] && echo "Process '$file'."
done

This will be enough (it works with filenames containing spaces):

$ myscript.sh /path/to/dir
Process '/path/to/dir/foo'.
Process '/path/to/dir/bar'.
Process '/path/to/dir/foo bar'.

This works well with any POSIX shell (tested with bash, ksh, dash, zsh and busybox sh). Alternatively, you could cd into the directory first:

#!/bin/sh

cd "$1" || exit 1
for file in * ;do
    [ -f "$file" ] && echo "Process '$file'."
done

This version won't print the path:

$ myscript.sh /path/to/dir
Process 'foo'.
Process 'bar'.
Process 'foo bar'.
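
As a small side note (not in the original answer), here is a minimal sketch of a POSIX variant that keeps the current working directory and strips the leading path with parameter expansion instead of cd:

#!/bin/sh

# Hypothetical variant: ${file##*/} removes everything up to the last '/'
for file in "$1"/* ;do
    [ -f "$file" ] && echo "Process '${file##*/}'."
done

This prints the same bare filenames as the cd version above.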

Some ways

Introduction

I don't like to use shopt when it's not needed... (it changes standard bash behaviour and makes scripts less readable).

There is an elegant way of doing this in standard bash, without requiring shopt.

Of course, the previous answer works fine under bash, but there are some interesting ways of making your script more powerful, flexible, pretty and detailed...

Sample

#!/bin/bash

die() { echo >&2 "$0 ERROR: $@";exit 1;}            # Emergency exit function

[ "$1" ] || die "Argument missing."                 # Exit unless argument submitted

[ -d "$1" ] || die "Arg '$1' is not a directory."   # Exit if argument is not dir

cd "$1" || die "Can't access '$1'."                 # Exit unless access dir.

files=(*)                                           # All files names in array $files

[ -f "$files" ] || die "No files found."            # Exit if no files found

for file in "${files[@]}";do                        # foreach file:
    echo Process "$file"                            #   Process file
done
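
For illustration, a hypothetical run against the same /path/to/dir used earlier (glob expansion sorts the names, so bar comes first):

$ ./myscript.sh /path/to/dir
Process bar
Process foo
Process foo bar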

Explanation: considering globbing vs real files

When doing:

files=(/path/to/dir/*)

the $files variable becomes an array containing every name matched under /path/to/dir/:

declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")

But if nothing matches the glob pattern, the star won't be replaced and the array becomes:

declare -p files
declare -a files=([0]="/path/to/dir/*")

From there, looking at $files is the same as looking at ${files[0]}, i.e. the first field of the array. So

[ -f "$files" ] || die "No files found."

will execute the die function unless the first field of the files array is a regular file ([ -e "$files" ] to check for an existing entry, [ -d "$files" ] to check for an existing directory, and so on... see man bash or help test).

But you could replace this filesystem test with a string-based test, like:

[ "$files" = "/path/to/dir/*" ] && die "No files found."

or, using array length:

((${#files[@]}==1)) && [ "${files##*/}" = "*" ] && die "No files found."

Dropping paths by using Parameter expansion:

To strip the path from the filenames, instead of cd "$path" you could do:

targetPath=/path/to/dir
files=($targetPath/*)
[ -f "$files" ] || die "No files found."

Then:

declare -p files
declare -a files=([0]="/path/to/dir/bar" [1]="/path/to/dir/baz" [2]="/path/to/dir/foo")

You could then print the filenames without their paths:

$ printf 'File: %s\n' "${files[@]#$targetPath/}"
File: bar
File: baz
File: foo

Upvotes: 4

tripleee

Reputation: 189387

This would happen if the directory is empty, or misspelled. The shell (in its default configuration) simply doesn't expand a wildcard if it has no matches. (You can control this in Bash with shopt -s nullglob; with this option, wildcards which don't match anything are simply removed.)

You can verify this easily for yourself. In a directory with four files,

sh$ echo *
a       file    or      two

sh$ echo [ot]*
or      two

sh$ echo n*
n*

And in Bash,

bash$ echo n*
n*

bash$ shopt -s nullglob

bash$ echo n*

I'm guessing you are confused about how the current working directory affects the resolution of directory names; maybe read Difference between ./ and ~/

Upvotes: 2
