Reputation: 7424
I'm trying to optimize a compression service (my own) on a 104-CPU machine.
To do this, I'm splitting up video files with the following command:
ffmpeg -i test.MOV -threads 3 \
-vcodec copy -f segment -segment_time 00:05 \
-reset_timestamps 1 \
out%02d.MOV
Then I'm compressing each segment:
for f in ./*MOV; do ffmpeg -i "$f" "./compressed/${f##*/}"; done
But for this to actually be an optimization I need to process the files concurrently, since FFmpeg seems to cap out at 2-3 threads per process.
I tried the following, but it doesn't work:
for f in ./*MOV; do (trap 'kill 0' SIGINT; ffmpeg -i "$f" "./compressed/${f##*/}"); done
How can I do this in bash?
Upvotes: 0
Views: 140
Reputation: 14491
Using xargs parallel execution, it's possible to achieve the above without having to build job control (e.g., [[ $(jobs -p | wc -l) -ge $parallel_processes ]] checks, sketched below for comparison) into the bash script:
ls ./*MOV | xargs -P4 -L1 sh -c 'ffmpeg -i "$0" "./compressed/${0##*/}"'
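If any of the filenames contain whitespace, a null-delimited variant is safer. This is a sketch assuming GNU xargs (for -0 and -I); ffmpeg's -nostdin keeps it from trying to read the terminal:

# Null-delimited list survives spaces in names; -P4 runs up to
# four ffmpeg processes at a time, each file passed as $0 to sh.
printf '%s\0' ./*MOV | xargs -0 -P4 -I{} sh -c 'ffmpeg -nostdin -i "$0" "./compressed/${0##*/}"' {}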
Also, xargs will take care of properly cancelling outstanding jobs (e.g., on Ctrl-C or similar).
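For comparison, the manual job-control loop mentioned above would look roughly like this in bash (a sketch; the limit of 4 jobs is arbitrary, and wait -n needs bash 4.3+):

# Background each ffmpeg; when the number of running jobs reaches
# the limit, wait for any one of them to finish before continuing.
parallel_processes=4
for f in ./*MOV; do
  while [[ $(jobs -p | wc -l) -ge $parallel_processes ]]; do
    wait -n
  done
  ffmpeg -nostdin -i "$f" "./compressed/${f##*/}" &
done
wait   # let the last batch finish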
You can get fancier behavior with GNU parallel, such as limiting concurrency based on actual system load.
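A minimal sketch, assuming GNU parallel is installed; {} expands to each input file and {/} to its basename:

# -j4 caps concurrency at four jobs; add --load 80% to hold back
# new jobs while system load exceeds 80% of the core count.
parallel -j4 ffmpeg -nostdin -i {} ./compressed/{/} ::: ./*MOV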
Upvotes: 3