tldr

Reputation: 12112

using async.js with node.js streams

I want to use async.js to limit the number of parallel I/O operations. I have come across the following example:

async.forEachLimit items, 5, ((item, next) ->
  request item.url, (error, response, body) ->
    console.log body
    next error
), (err) ->
  throw err if err
  console.log "All requests processed!"

But I want to use it with streams, like this:

async.forEachLimit items, 5, ((item, next) ->
  stream = fs.createWriteStream file
  request.get(item.url).pipe(stream)
), (err) ->
  throw err if err
  console.log "All requests processed!"

Where do I place the 'next' call so that it fires once the write stream has finished writing to the file?

Upvotes: 3

Views: 2537

Answers (1)

Jonathan Lonowski

Reputation: 123463

You'll need to bind to the Readable Stream's 'end' event separately from the .pipe() call.

res = request.get(item.url)
res.pipe(stream)
res.on 'end', next

This also lets you bind to its 'error' event:

res.on 'error', next
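
Plugged into the iterator from the question, that might look like the sketch below (items, file, and the limit of 5 come from the question; the requires are assumed, and in practice you may want to guard against next firing twice if both events occur):

async = require 'async'
fs = require 'fs'
request = require 'request'

async.forEachLimit items, 5, ((item, next) ->
  stream = fs.createWriteStream file
  res = request.get item.url
  res.pipe stream
  # Signal async.js once the response has been fully read,
  # and forward any request error to the final callback.
  res.on 'end', next
  res.on 'error', next
), (err) ->
  throw err if err
  console.log "All requests processed!"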

But you could also listen to the Writable Stream's 'finish' event; since .pipe() returns the destination stream, you can chain the handler onto it:

request.get(item.url)
    .pipe(stream)
    .on 'finish', next
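
Wired into the same iterator, a minimal sketch of this variant (same assumptions as above) would be:

async.forEachLimit items, 5, ((item, next) ->
  stream = fs.createWriteStream file
  request.get(item.url)
    .on 'error', next     # request error, passed to the final callback
    .pipe stream
    .on 'finish', next    # file fully written; next is called with no error
), (err) ->
  throw err if err
  console.log "All requests processed!"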

Upvotes: 3
