Reputation: 56292
As far as I know, find . -maxdepth 1 -type f
is the only reliable way to get the list of files in a folder. However, naively putting the output of this command into a bash array with ($(find . -maxdepth 1 -type f)) will fail if some files have spaces in their names.
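For concreteness, a minimal reproduction of the problem (the demo directory and filename are made up for illustration):
mkdir -p demo
touch "demo/a file with spaces.txt"
arr=($(find demo -maxdepth 1 -type f))
echo "${#arr[@]}"    # prints 4: word splitting broke the single path into four elements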
What is the correct way to do this?
Upvotes: 0
Views: 300
Reputation: 57418
You could do this in a loop:
ARR=()
find . -maxdepth 1 -type f | while read filename; do
    ARR+=("$filename")
done
UPDATE: unfortunately, this version doesn't work, because the pipe runs the while loop in a subshell, so the additions to ARR are lost when the loop ends.
So it seemed this would have to be done in a for loop, with no way to avoid manipulating IFS. The correct answer is dogbane's, which sidesteps both problems by feeding the while loop through process substitution instead of a pipe. The above incorrect method is kept as a warning :-(
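For completeness, another way to keep the pipe and still populate the array is bash's lastpipe option (a sketch, assuming bash 4.2 or newer in a non-interactive script, since lastpipe only takes effect when job control is off):
#!/bin/bash
# lastpipe runs the last command of a pipeline in the current shell,
# so the ARR+=() additions survive the loop
shopt -s lastpipe
ARR=()
find . -maxdepth 1 -type f | while IFS= read -r filename; do
    ARR+=("$filename")
done
printf '%s\n' "${ARR[@]}"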
Upvotes: 1
Reputation: 274680
The safest way is to use find with -print0, which will also handle filenames with newlines in them correctly. Loop over the files and store them in an array:
declare -a arr=()
while IFS= read -r -d $'\0' f
do
    arr+=("$f")
done < <(find . -maxdepth 1 -type f -print0)
Test:
for i in "${arr[@]}"
do
    echo "[$i]"
done
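On bash 4.4 or newer, mapfile with a NUL delimiter gives the same result more compactly (a sketch, assuming bash 4.4+):
# read NUL-delimited filenames straight into the array; -t strips the delimiter
mapfile -t -d '' arr < <(find . -maxdepth 1 -type f -print0)
for i in "${arr[@]}"
do
    echo "[$i]"
done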
Upvotes: 4
Reputation: 19189
This is how I did it the last time I needed to allow spaces:
IFS=$'\n' # set the internal field separator to newline only (the default is space, tab and newline)
array=($(find . -maxdepth 1 -type f))
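A few caveats if you go this route: IFS should be restored afterwards, glob characters in filenames (e.g. *.bak) would still be expanded, and it still breaks on filenames that contain a newline (which the -print0 approach above handles). A sketch addressing the first two, using plain bash with nothing beyond the answer's approach assumed:
oldIFS=$IFS
IFS=$'\n'
set -f                   # disable globbing so names containing * or ? are not expanded
array=($(find . -maxdepth 1 -type f))
set +f                   # re-enable globbing
IFS=$oldIFS              # restore the default field separator
printf '%s\n' "${array[@]}"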
Upvotes: 2