Craden

Reputation: 145

Split up text and process in parallel

I have a program that generates lots (terabytes) of output and sends it to stdout.

I want to split that output and process it in parallel with a bunch of instances of another program. It can be distributed in any way, as long as the lines are left intact.

GNU parallel can do this, but it takes a fixed number of lines and restarts the filter process after each chunk:

./relgen | parallel -l 100000 -j 32 --spreadstdin ./filter

Is there a way to keep a constant number of processes running and distribute data among them?
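For concreteness, here is a minimal hand-rolled sketch of that behavior, assuming ./filter reads lines on stdin; the FIFO paths, the out.* names, and the pool size of 4 (instead of 32, for brevity) are placeholders. It starts a fixed pool of filters once and deals lines to them round robin with awk:

# Start a fixed pool of 4 filter processes, each reading its own FIFO.
mkfifo /tmp/fifo0 /tmp/fifo1 /tmp/fifo2 /tmp/fifo3
for i in 0 1 2 3; do
  ./filter < /tmp/fifo$i > out.$i &
done
# Deal complete lines to the FIFOs round robin; awk keeps each
# output file open across writes, so lines stay intact.
./relgen | awk '{ print > ("/tmp/fifo" NR % 4) }'
wait
rm -f /tmp/fifo0 /tmp/fifo1 /tmp/fifo2 /tmp/fifo3

This keeps a constant number of processes running, but the dealing is strictly round robin, so one slow filter will eventually fill its pipe buffer and stall the whole pipeline.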

Upvotes: 7

Views: 803

Answers (1)

Ole Tange

Reputation: 33685

-l is no good for performance. Use --block instead if possible.

You can have the data distributed round robin with --roundrobin.

./relgen | parallel --block 3M --roundrobin -j 32 --pipe ./filter
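With --pipe, GNU parallel splits stdin on the record separator (newline by default), so blocks always break at line boundaries and lines are left intact. --roundrobin then writes those blocks to the 32 filter instances that are already running, instead of spawning a fresh filter for every block, which is exactly the constant pool of processes asked for.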

Upvotes: 2
