Reputation: 22460
I am trying to write a bash script to list the size of each file/subdir of the current directory, as follows:
for f in $(ls -A)
do
    du -sh $f
done
I used ls -A because I need to include hidden files/dirs starting with a dot, like .ssh. However, the script above cannot handle file names that contain spaces.
e.g. I have a file called:
books to borrow.doc
and the above script will return:
du: cannot access `books': No such file or directory
du: cannot access `to': No such file or directory
du: cannot access `borrow.doc': No such file or directory
There is a similar question, Shell script issue with filenames containing spaces, but there the list of names to process comes from expanding * (instead of ls -A). The answer to that question was to add double quotes around $f. I tried the same, i.e., changing
du -sh $f
to
du -sh "$f"
but the result is the same. My question is: how do I write the script so that it handles such spaces?
Thanks.
Upvotes: 0
Views: 3095
Reputation: 2376
Time to summarize. Assuming you are using Linux, this should work in most (if not all) cases.
find -maxdepth 1 -mindepth 1 -print0 | xargs -r -0 du -sh
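Omitting the start directory is a GNU find extension (it defaults to .), hence the Linux assumption. If you also want the entries ordered by size, one variation (again assuming GNU tools, here for sort -h, which sorts human-readable sizes) is:
find -maxdepth 1 -mindepth 1 -print0 | xargs -r -0 du -sh | sort -h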
Upvotes: 0
Reputation:
Try this:
ls -A |
while read -r line
do
    du -sh "$line"
done
Instead of iterating over the ls -A output word by word, the while loop reads it line by line. This way, you don't need to change the IFS variable.
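One caveat (my note, not part of the original answer): read without IFS= trims leading and trailing whitespace from each name, and any line-based loop breaks on file names that contain newlines. A NUL-delimited variant with find covers those cases as well:
while IFS= read -r -d '' f
do
    du -sh "$f"
done < <(find . -maxdepth 1 -mindepth 1 -print0)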
Upvotes: 0
Reputation: 25409
I generally prefer using the program find if a for loop would cause headaches. In your case, it is really simple:
$ find . -maxdepth 1 -exec du -sh '{}' \;
There are a number of security issues with using -exec, which is why GNU find supports the safer -execdir that should be preferred if available. Since we are not recursing into directories here, it doesn't make a real difference, though.
The GNU version of find also has an option (-print0) to print matched file names separated by NUL bytes, but I find the above solution much simpler than first outputting a list of all file names, splitting it at NUL bytes, and then iterating over it.
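One nuance worth adding: -exec du -sh '{}' \; starts a separate du process for every name, and the command above also reports . itself. If that matters, the POSIX + terminator batches names into as few du invocations as possible (much like xargs does), and -mindepth 1 skips the . entry:
$ find . -mindepth 1 -maxdepth 1 -exec du -sh '{}' +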
Upvotes: 1
Reputation: 753475
Unless the directory is so big that the expanded list of file names exceeds the command-line length limit:
du -sh * .*
Be aware that this will include . and .., though. If you want to eliminate .. (probably a good idea), you can use:
for file in * .*
do
    [ "$file" = ".." ] && continue
    du -sh "$file"   # Double quotes important
done
You can consider assigning the names to an array and then working on the array:
files=( * .* )
for file in "${files[@]}"
do
    ...
done
You might use variations on that to run du on groups of names, but you could also consider using:
printf "%s\0" "${files[@]}" | xargs -0 du -sh
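For completeness, here is one way to put those pieces together. Note that skipping . as well as .. is my addition, and this assumes the directory is not empty (with an empty directory, the unmatched * glob would stay as a literal *):
files=()
for file in * .*
do
    case $file in (.|..) continue ;; esac
    files+=( "$file" )
done
printf "%s\0" "${files[@]}" | xargs -0 du -sh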
Upvotes: 2
Reputation: 63892
Don't parse the output from ls. When a file name contains a space, the $(ls -A) expansion is split on that space, so $f only ever holds a fragment of the name and the double quotes can't restore it.
The following will work, and it does the same as your script:
GLOBIGNORE=".:.."   # ignore . and ..
shopt -s dotglob    # make * match files starting with . as well
for f in *
do
    #echo "==$f=="
    du -sh "$f"     # double quoted (!!!)
done
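As a side note, assigning any non-null value to GLOBIGNORE already enables dotglob implicitly in bash, so the shopt line mainly documents the intent. If this runs inside a larger script, one way (a sketch, not part of the original answer) to keep the option changes from leaking out is to scope them in a subshell:
(
    GLOBIGNORE=".:.."   # ignore . and ..; implicitly enables dotglob as well
    shopt -s dotglob    # kept explicit for clarity
    for f in *
    do
        du -sh "$f"
    done
)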
Upvotes: 3