user3791980

Reputation: 445

pipe node streams in parallel with highland js

I have an array of async Node streams that I want to consume in parallel. Is there an easy way to do this in Highland.js? I'm trying to have all of these streams go through an aggregation function.

Upvotes: 0

Views: 1794

Answers (1)

Stefano

Reputation: 2186

Are you looking to do something like this:

const hl = require('highland');
const request = require('request');

const nodeStreams = [request('http://www.example.com'),
                     request('http://www.example1.com')];

hl(nodeStreams)
.map(hl)
.merge()
.map(JSON.parse)
.each(console.log);

I'll explain what's going on here. You start with an array of Node streams and use map to convert each one into a Highland stream, so you now have a stream of streams. The merge function then merges all of their values into a single stream. The source streams are all read in parallel, but map(JSON.parse) is only applied to one value at a time. If you want to parallelise more of the computation, do as much work as possible on each stream before you call merge:

hl(nodeStreams)
.map(stream => hl(stream).map(JSON.parse))
.merge().each(console.log);

If the order of the requests matters then you should use parallel instead of merge, but this incurs a performance overhead because the function has to keep track of all the different streams.

Upvotes: 3
