Strapakowsky

Reputation: 2343

Apply a script to subdirectories

I have read many times that if I want to execute something over all subdirectories I should run something like one of these:

find . -name '*' -exec command arguments {} \;

find . -type f -print0 | xargs -0 command arguments

find . -type f | xargs -I {} command arguments {} arguments

The problem is that these work well with core utilities, but not as expected when the command is a user-defined function or a script. How can I fix that?

So what I am looking for is a line of code or a script in which I can replace command with myfunction or myscript.sh, and which goes into every single subdirectory of the current directory and executes that function or script there, with whatever arguments I supply.

To put it another way, I want something that works over all subdirectories as nicely as for file in *; do command_myfunction_or_script.sh arguments "$file"; done works over the current directory.

Upvotes: 1

Views: 314

Answers (4)

John Lawrence

Reputation: 2923

The examples that you give, such as:

find . -name '*' -exec command arguments {} \;

don't actually go into every subdirectory of the current directory and execute command there; rather, they execute command from the current directory, passing the path of each file that find lists as an argument.

If what you want is to actually change directory and execute a script, you could try something like this:

STDIR=$PWD; IFS=$'\n'; for dir in $(find . -type d); do cd "$dir"; /path/to/command; cd "$STDIR"; done; unset IFS

Here the current directory is saved to STDIR and bash's Internal Field Separator is set to a newline so that names containing spaces won't be split. Then, for each directory (-type d) that find returns, we cd into that directory, execute the command (using its full path here, since changing directories would break a relative path) and then cd back to the starting directory.
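A variation on the same idea, shown here as a sketch rather than something from the answer: let find spawn one subshell per directory, so there is no need to save and restore $PWD or to change IFS at all. The scratch directories created below are purely for demonstration.

```shell
#!/usr/bin/env bash
# Sketch: run a command inside every subdirectory via a per-directory
# subshell. The cd happens in the child bash, so the parent shell's
# working directory is never touched, and names with spaces are safe.

# Demo setup with hypothetical directory names:
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b c"   # note the space in "b c"

# Inside the child shell, "$1" is the directory that find found.
find "$tmp" -type d -exec bash -c 'cd "$1" && pwd' _ {} \;
```

Because each bash -c runs in its own process, there is also no cleanup needed if the command fails partway through.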

Upvotes: 1

Dennis Williamson

Reputation: 360535

Instead of -exec, try -execdir.

It may be that in some cases you need to use bash:

foo () { echo "$1"; }
export -f foo
find . -type f -name '*.txt' -exec bash -c 'foo arg arg' \;

The last line could be:

find . -type f -name '*.txt' -exec bash -c 'foo "$@"' _ arg arg \;

depending on which arguments might need expanding, and when. The underscore becomes $0 in the child shell.

You could use -execdir where I have -exec if that's needed.
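Putting those pieces together, here is a self-contained sketch; process is a hypothetical function name standing in for your own, and the scratch files exist only for the demo:

```shell
#!/usr/bin/env bash
# Sketch: export a function so the bash that find spawns can call it.
process () { printf 'got: %s\n' "$@"; }
export -f process

# Demo files in a scratch directory:
tmp=$(mktemp -d)
touch "$tmp/one.txt" "$tmp/two.txt"

# The _ becomes $0 in the child shell; the found paths land in "$@".
# With +, find batches many paths into a single bash invocation.
find "$tmp" -type f -name '*.txt' -exec bash -c 'process "$@"' _ {} +
```

Note that export -f is a bashism: it works here because find execs bash itself, but a POSIX sh child would not see the function.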

Upvotes: 2

pizza

Reputation: 7640

If you want to use a bash function, this is one way.

work ()
{
  local file="$1"
  local dir
  dir=$(dirname -- "$file")
  pushd "$dir" > /dev/null
  echo "in directory $(pwd) working with file $(basename -- "$file")"
  popd > /dev/null
}
find . -type f | while IFS= read -r line
do
   work "$line"
done
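A null-delimited variant of the loop above (my own sketch, not part of the original answer) that also survives filenames containing newlines:

```shell
#!/usr/bin/env bash
# Sketch: same idea, but with -print0 / read -d '' so that no
# filename, however strange, can break the loop.
work () {
  local file=$1
  local dir
  dir=$(dirname -- "$file")
  # Subshell instead of pushd/popd: the cd dies with the subshell.
  ( cd "$dir" && printf 'in %s: %s\n' "$PWD" "$(basename -- "$file")" )
}

find . -type f -print0 |
while IFS= read -r -d '' line; do
  work "$line"
done
```

The pipe still means work runs in a subshell of its own, so any variables it sets will not survive the loop; that matches the original answer's behavior.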

Upvotes: 1

kojiro

Reputation: 77157

There may be some way to use find with a function, but it won't be terribly elegant. If you have bash 4, what you probably want to do is use globstar:

shopt -s globstar
for file in **/*; do
    myfunction "$file"
done

If you're looking for compatibility with POSIX or older versions of bash, you will be forced to source the file defining your function when you invoke bash. So something like this:

find <args> -exec bash -c '. funcfile;
    for file; do
        myfunction "$file"
    done' _ {} +

But that's just ugly. When I get to this point, I usually just put my function in a script on my PATH and live with it.
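For that last approach, the wrapper script can be as small as this; the name myfunction.sh and the function body are illustrative, not from the answer:

```shell
#!/usr/bin/env bash
# myfunction.sh (hypothetical): a thin script wrapper around the
# function, so that find can -exec it like any other command.
myfunction () { printf 'processing %s\n' "$1"; }

# Handle every path find hands us (one or many, with -exec ... +).
for file in "$@"; do
  myfunction "$file"
done
```

Dropped somewhere on your PATH and made executable, it can then be invoked as find . -type f -exec myfunction.sh {} +.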

Upvotes: 1
