Jon

Reputation: 2133

Pipe output to two different commands not interlaced

Using the technique described here (Pipe output to two different commands), we can split stdout across multiple processes:

expensive_command | tee >(proc_1) >(proc_2) | proc_3

My problem is that this interleaves the output.

Is there a way to copy the stdout but force proc_2 to block until proc_1 finishes?

I'm thinking something like

expensive_command | tee >(proc_1) | wait for EOF | tee >(proc_2) ...

Upvotes: 0

Views: 127

Answers (2)

chepner

Reputation: 531888

You can use a fifo as a cheap lock. Have proc1 write to it after it completes, and wait until a read from the fifo succeeds before running proc2.

mkfifo cheap_lock
expensive_command | tee >(proc1; echo foo > cheap_lock) \
                        >(read < cheap_lock; proc2 ) | proc3

(Of course, it's your responsibility to ensure that no other processes try to read from or write to cheap_lock.)
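A runnable sketch of the same pattern, with `seq` standing in for `expensive_command` and `sed` commands as hypothetical stand-ins for `proc1` and `proc2` (a second fifo, `done`, is added here only so the script can safely read the result after both stages finish):

```shell
#!/usr/bin/env bash
# Fifo-as-lock sketch: proc2 is held back until proc1 signals completion.
# seq and the sed commands are illustrative stand-ins.
tmp=$(mktemp -d)
mkfifo "$tmp/cheap_lock" "$tmp/done"

seq 3 | tee >(sed 's/^/proc1: /' >> "$tmp/out"; echo foo > "$tmp/cheap_lock") \
            >(read < "$tmp/cheap_lock"; sed 's/^/proc2: /' >> "$tmp/out"
              echo > "$tmp/done") > /dev/null

read < "$tmp/done"   # block until proc2 has finished, so "$tmp/out" is complete
result=$(cat "$tmp/out")
echo "$result"
rm -rf "$tmp"
```

One caveat: tee still writes into proc2's pipe while proc2 is blocked on the lock, so for outputs larger than the kernel's pipe buffer (commonly 64 KB) tee can stall and the pipeline can deadlock.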

Upvotes: 2

konsolebox

Reputation: 75558

You can create a buffering stage that releases the output only once its input reaches EOF, like:

expensive_command | awk '{ a[i++] = $0 }END{for (i = 0; i in a; ++i) { print a[i] | "tee temp.txt" } }'

The catch is that awk itself does not support process substitution.

In bash you can do:

readarray -t lines < <(expensive_command | tee >(proc_1))
printf '%s\n' "${lines[@]}" | tee >(proc_2)

Depending on the peak size of the output from expensive_command, or on your version of Bash, the command may require adjustments (the entire stream is held in memory). You could also consider using another language.
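A self-contained version of this approach, with `seq` and `sed` as hypothetical stand-ins for `expensive_command`, `proc_1`, and `proc_2`. Note that `proc_1`'s own stdout must be redirected somewhere, or it would be captured into the array alongside tee's copy of the stream:

```shell
#!/usr/bin/env bash
# Buffer the whole stream into an array, then replay it to proc_2 only
# after expensive_command has finished producing output.
readarray -t lines < <(seq 3 | tee >(sed 's/^/proc_1: /' > /dev/null))
p2=$(printf '%s\n' "${lines[@]}" | sed 's/^/proc_2: /')
echo "$p2"
```

Strictly speaking, this only delays proc_2 until expensive_command reaches EOF; proc_1 runs asynchronously and may still be working when the replay starts.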

Addendum: you can also look at stdbuf, which runs a command with modified buffering for its standard streams.
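To be clear, stdbuf does not serialize proc_1 before proc_2; it only changes how output is chunked. Forcing line buffering with `-oL` makes a filter flush each line as it is produced, so interleaving happens at line rather than block granularity. A minimal illustration (`sed` stands in for a filter that would otherwise block-buffer when its stdout is a pipe):

```shell
# stdbuf -oL forces line-buffered stdout on the wrapped command; without
# it, sed block-buffers its output when writing into a pipe.
out=$(seq 3 | stdbuf -oL sed 's/^/line /')
echo "$out"
```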

Upvotes: 2
