Reputation: 23517
#! /bin/sh
if [ $# -ne 2 ]
then
echo "Invalid argument count."
echo "Usage: $0 <dir1> <dir2>"
exit
fi
ls $1 >> dir1
for file in $2/*
do
grep $file $dir1
done
rm $dir1
I wrote the script above intending to print all files that exist in both of the directories whose names are passed to the script as arguments.
But when I ran the script, it took forever. (Never finished, actually!)
Does anyone happen to know what I did wrong here?
Thanks
Upvotes: 0
Views: 109
Reputation: 8802
$dir1 is not defined, so grep gets only one argument and searches for $file in its standard input. grep is therefore waiting for input, and it will wait forever.
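For illustration, here is roughly what the loop body expands to once $dir1 expands to nothing (this snippet is mine, not from the question, and somefile.txt is just a placeholder filename):
# $dir1 is empty, so its expansion disappears after word splitting and
# grep is left with a single argument, the pattern:
grep somefile.txt
# With no file operand, grep reads its standard input (here, the
# terminal) and blocks until it gets input or EOF, so the loop hangs.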
Upvotes: 1
Reputation: 37298
I'd have to set up a test to be sure, but you never set a value for $dir1; you only redirect the output of ls into a file named dir1.
So how about
#! /bin/sh
if [ $# -ne 2 ]
then
echo "Invalid argument count."
echo "Usage: $0 <dir1> <dir2>"
exit
fi
ls $1 >> dir1
for file in $2/*
do
grep $file dir1
done
rm dir1
When I get a script that's just hanging with no obvious, expected output, I kill it and then add the shell debugging features to the script.
Either at the top, like
#!/bin/sh -vx
Or at any arbitrary place in the script where I suspect a problem, OR on the second line, i.e.
set -vx
OR, if I'm pretty sure where the problem is and the above produces a lot of stuff to scroll through, then I isolate the suspect lines, turning debugging on and off, i.e.
set -vx
suspect_code
set +vx
Finally, I think your problem can be reduced to
ls $1 >> dir1
ls $2/* | fgrep -f dir1
This has limitations if your files/dirs have spaces, CRs or other nasty stuff embedded. If you have nice clean filenames, it should work.
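If that matters, a more whitespace-tolerant sketch (my own variation, not tested against your directories; it compares sorted listings with comm, and the /tmp file names are arbitrary) could look like
#!/bin/sh
# Sketch only: list each directory, sort the listings, and print the
# names common to both.  Quoting handles spaces in names, but this
# still breaks on filenames that contain embedded newlines.
ls "$1" | sort > /tmp/common.$$.1
ls "$2" | sort > /tmp/common.$$.2
# comm -12 suppresses lines unique to either file, leaving only the
# lines present in both sorted listings.
comm -12 /tmp/common.$$.1 /tmp/common.$$.2
rm -f /tmp/common.$$.1 /tmp/common.$$.2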
I hope this helps.
Upvotes: 0