Avba

Reputation: 15266

node.js how to handle fast producer and slow consumer with backpressure

I'm very new to node.js and don't understand the documentation about streams. Hoping to get some tips.

I'm reading a very large file line by line, and then for each line I'm calling an async network API.

Obviously the local file is read much faster than the async calls are completed:

var lineReader = require('readline').createInterface({
  input: require('fs').createReadStream(program.input)
});

lineReader.on('line', function (line) {
    client.execute(query, [line], function(err, result) {
        // need to apply back pressure to the line reader here
        var myJSON = JSON.stringify(result);
        console.log("line=%s json=%s", line, myJSON);
    });
});

What is the way to apply back pressure around the "execute" call?

Upvotes: 1

Views: 1428

Answers (1)

Avba

Reputation: 15266

The solution is to wrap the async call in a writable stream and throttle the reader from inside the writer by delaying the next() callback:

var fs = require('fs');
var stream = require('stream');
var byline = require('byline');

var concurrent = 10;   // maximum number of in-flight async calls, tune as needed
var count = 0;

var writable = new stream.Writable({
    write: function (line, encoding, next) {
        count++;
        // Below the limit: ask for the next line immediately.
        // At the limit: hold on to next() so back pressure builds up.
        var released = false;
        if (count < concurrent) {
            released = true;
            next();
        }

        asyncFunctionToCall(line, function (err, result) {
            // completion callback: reduce the count and release back pressure
            count--;
            if (!released) {
                next();
            }
        });
    }
});

var input = fs.createReadStream(program.input, {encoding: 'utf8'});
byline.createStream(input).pipe(writable);
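
Piping works here because pipe() will not push more data into the writable until next() has been called, so the file read stream is paused automatically while calls are in flight. If you prefer to keep the readline-based code from the question, a similar effect can be had by pausing and resuming the interface around each call. This is only a rough sketch of that alternative (readline may still deliver a few lines it has already buffered after pause(), so treat it as an approximation rather than a strict limit):

var readline = require('readline');
var fs = require('fs');

var lineReader = readline.createInterface({
    input: fs.createReadStream(program.input)
});

lineReader.on('line', function (line) {
    lineReader.pause();                      // stop reading further lines
    client.execute(query, [line], function (err, result) {
        console.log("line=%s json=%s", line, JSON.stringify(result));
        lineReader.resume();                 // release back pressure
    });
});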

Upvotes: 2
