roger

Reputation: 9893

How to use substitution in xargs?

What I want to do is:

  1. find all files with the .txt extension
  2. cp each of them to a corresponding .dat file

I could do it like this:

for f in `find . -type f -name "*.txt"`; do cp $f ${f%.txt}.dat; done

I want to do this with xargs instead, so I tried:

find . -type f -name "*.txt" | xargs -i cp {} ${{}%.txt}.dat

but I got an error:

bad substitution

About this, I have two questions:

  1. How do I do the substitution correctly?
  2. Will xargs do things in parallel, while the for loop does them one by one?

Upvotes: 10

Views: 11355

Answers (4)

syntagma

Reputation: 24324

  1. How do I do the substitution correctly?

You cannot use substitution this way, because {} is not a bash variable (it is only part of the xargs syntax), so bash cannot perform parameter expansion on it.

A better way is to build a full bash command and pass it as an argument to xargs, e.g. xargs -0 -i bash -c 'echo cp "$1" "${1%.txt}.dat"' - '{}'. This way the substitution happens inside bash, where it is valid.
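A self-contained sketch of this approach, with echo removed so the copy actually runs (the /tmp/xargs-demo directory and filenames are just illustrative; -I is the modern spelling of the deprecated -i):

```shell
# Create some sample .txt files, including one with a space in its name.
mkdir -p /tmp/xargs-demo && cd /tmp/xargs-demo
touch a.txt b.txt "c d.txt"

# For each file, run a small bash script that does the ${1%.txt} expansion.
# -print0 / -0 keep filenames with spaces intact.
find . -type f -name "*.txt" -print0 |
  xargs -0 -I{} bash -c 'cp "$1" "${1%.txt}.dat"' - '{}'

ls    # now every .txt has a matching .dat
```

The lone `-` becomes `$0` of the inline script, so the filename arrives as `$1`, where bash's `${1%.txt}` expansion can strip the suffix.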

  2. Will xargs do things in parallel, while the for loop does them one by one?

The for loop runs commands sequentially, and by default so does xargs. However, you can use the -P option of xargs to parallelize it. From the xargs man page:

   -P max-procs, --max-procs=max-procs
          Run up to max-procs processes at a time; the default is 1.  If
          max-procs is 0, xargs will run as many processes as possible at
          a time.  Use the -n option or the -L option with -P; otherwise
          chances are that only one exec will be done.  While xargs is
          running, you can send its process a SIGUSR1 signal to increase
          the number of commands to run simultaneously, or a SIGUSR2 to
          decrease the number.  You cannot increase it above an
          implementation-defined limit (which is shown with
          --show-limits).  You cannot decrease it below 1.  xargs never
          terminates its commands; when asked to decrease, it merely
          waits for more than one existing command to terminate before
          starting another.

          Please note that it is up to the called processes to properly
          manage parallel access to shared resources.  For example, if
          more than one of them tries to print to stdout, the output
          will be produced in an indeterminate order (and very likely
          mixed up) unless the processes collaborate in some way to
          prevent this.  Using some kind of locking scheme is one way to
          prevent such problems.  In general, using a locking scheme
          will help ensure correct output but reduce performance.  If
          you don't want to tolerate the performance difference, simply
          arrange for each process to produce a separate output file (or
          otherwise use separate resources).
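Putting the two parts together, a hedged sketch of the parallel version (the /tmp/xargs-par directory is illustrative; with GNU xargs, -I runs one command per input item, so -P can spread those commands across processes):

```shell
# Sample files to copy.
mkdir -p /tmp/xargs-par && cd /tmp/xargs-par
touch 1.txt 2.txt 3.txt 4.txt

# Up to 4 cp commands run concurrently; each invocation handles one file.
find . -type f -name "*.txt" -print0 |
  xargs -0 -P 4 -I{} bash -c 'cp "$1" "${1%.txt}.dat"' - '{}'
```

For a plain cp the parallel speedup is negligible, but the same pattern pays off when the per-file command is expensive.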

Upvotes: 15

Shakiba Moshiri

Reputation: 23784

xargs and other tools are not as flexible as Perl for this kind of stuff.

~ ❱ find . | perl -lne '-f && ($old=$_) && s/\.txt/.dat/g && print "$old => $_"'
./dir/00.file.txt => ./dir/00.file.dat
./dir/06.file.txt => ./dir/06.file.dat
./dir/05.file.txt => ./dir/05.file.dat
./dir/02.file.txt => ./dir/02.file.dat
./dir/08.file.txt => ./dir/08.file.dat
./dir/07.file.txt => ./dir/07.file.dat
./dir/01.file.txt => ./dir/01.file.dat
./dir/04.file.txt => ./dir/04.file.dat
./dir/03.file.txt => ./dir/03.file.dat
./dir/09.file.txt => ./dir/09.file.dat

Then, instead of the print function, use: rename $old, $_

With this one-liner you can rename anything you like.


To force xargs into parallel mode, use -P, like:

ls *.mp4 | xargs -I xxx -P 0 ffmpeg -i xxx xxx.mp3

This converts all the .mp4 files to .mp3 in parallel; if you have 10 .mp4 files, 10 ffmpeg processes run simultaneously. (Note that with this pattern the output names end in .mp4.mp3, since xxx is replaced by the full input name.)

Upvotes: 0

Ole Tange

Reputation: 33685

If you are unhappy with the bash -c '...' - construct, you can instead use GNU Parallel, whose {.} replacement string is the input with its extension removed:

find . -type f -name "*.txt" -print0 | parallel -0 cp {} {.}.dat

Upvotes: 2

anubhava

Reputation: 785058

You can use (the echo makes this a dry run that only prints each cp command; remove it to actually copy):

find . -type f -name "*.txt" -print0 |
xargs -0 -i bash -c 'echo cp "$1" "${1%.txt}.dat"' - '{}'

Upvotes: 3
