Reputation: 163232
Suppose I have a readable stream, sourceStream, that generates ~32KB of data every second. I want to pipe this data to the writable streams of multiple HTTP clients.
Client A has a great internet connection and can keep up with my stream data. Client B is connected on his/her trusty 14.4kbps US Robotics modem.
sourceStream.pipe(resClientA); // Pipe to Client A's writable stream
sourceStream.pipe(resClientB); // Pipe to Client B's writable stream
These are TCP connections, so Node.js is able to tell when a client is falling behind. I know there is a small amount of buffering built into streams, but it won't be long before that buffer is full and Node.js will have to do something with all the data being produced by sourceStream.
Will it buffer indefinitely for Client B while Client A continues to get data as expected? Or will Node pause the stream until Client B can catch up, meaning that the stream is paused for Client A as well? Or will something else happen?
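A minimal sketch (not part of the original question, using the modern streams API) of the built-in buffering referred to above: a Writable buffers up to highWaterMark bytes (16KB by default), and write() starts returning false once that buffer fills, which is the signal Node uses for backpressure. The slowClient stream is a made-up stand-in for a slow socket.

const { Writable } = require('stream');

// Stand-in for a slow client: each write to the underlying "socket" takes 1s.
const slowClient = new Writable({
  highWaterMark: 16 * 1024, // default-sized internal buffer (16KB)
  write(chunk, encoding, callback) {
    setTimeout(callback, 1000);
  },
});

// A single 32KB chunk already exceeds the 16KB high-water mark,
// so write() returns false, signalling "stop writing until 'drain'".
console.log(slowClient.write(Buffer.alloc(32 * 1024))); // false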
Upvotes: 4
Views: 822
Reputation: 20315
It goes as slow as the slowest writer. Source: https://github.com/isaacs/stream-multiplexer#the-problem
Most of the time, you'd prefer that the reader goes no faster than the slowest writer can accommodate. In Node v0.10, this is how it works.
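A minimal sketch (not from this answer, using the modern Node API) of one way to keep a slow client from throttling fast ones: skip pipe() and fan the data out yourself, checking write()'s return value and dropping any client whose socket buffer is full. The port, the 32KB/s generator, and the drop-on-first-backpressure policy are assumptions for illustration.

const http = require('http');

const clients = new Set();

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/octet-stream' });
  clients.add(res);
  res.on('close', () => clients.delete(res)); // forget clients that disconnect
}).listen(8080);

// Produce ~32KB every second and fan it out manually.
setInterval(() => {
  const chunk = Buffer.alloc(32 * 1024);
  for (const res of clients) {
    // write() returns false when this client's buffer is full (backpressure).
    // Instead of pausing the source for everyone, drop the slow client.
    if (!res.write(chunk)) {
      res.destroy();
      clients.delete(res);
    }
  }
}, 1000);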
Upvotes: 5