Reputation: 25
I have a Node.js project which has an async::queue that is filled by App::add().
This is the code I use to create the server:
app = new App config
server = http.createServer (req, res) ->
  app.add req, res, (err) ->
    # setHeader actually sets the response header; assigning to res.headers has no effect
    res.setHeader 'Content-Type', 'application/json'
    res.end JSON.stringify error: err
.listen config.port
App::add() takes the parameters from req and adds a task to the queue, along with the callback. The queue is processed by a worker function which ends the response (res). If an error occurs, it is returned to the user.
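A simplified sketch of what such a setup might look like (the processRequest worker and the task shape here are hypothetical placeholders; the actual queue code isn't shown in the question):

async = require 'async'

class App
  constructor: (@config) ->
    # Hypothetical worker: processRequest stands in for the real processing code.
    # The second argument is the queue's concurrency (tasks processed at once).
    @queue = async.queue (task, done) ->
      processRequest task.req, task.res, done
    , 1

  add: (req, res, callback) ->
    # callback fires once the worker calls done(err) for this task
    @queue.push {req, res}, callback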
This works fine so far. But when I make multiple requests in parallel, only one request at a time is processed.
I tried:

- a console.log at the end of the http.createServer callback: it was called immediately, but the next request is only processed after the first one ended.
- a console.log at the beginning of App::add(): same as above.

How do I handle concurrent requests?
Thanks in advance.
Edit:
To clear some things up, this is what I want: n requests are processed at a time.
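If the processing itself is asynchronous, async.queue supports exactly this: its second argument is the concurrency, i.e. how many tasks the queue works on in parallel. A minimal sketch, assuming a config.concurrency setting (the worker body is a placeholder):

async = require 'async'

# Process up to config.concurrency tasks at a time instead of one.
queue = async.queue (task, done) ->
  # ...do the (asynchronous) work, then signal completion:
  done null
, config.concurrency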
Edit 2:
I tried to use the cluster module, but it doesn't help much, because I still have to end the res anyway.
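For reference, a minimal cluster setup looks roughly like this (startServer stands in for the server code above); each forked worker is a separate process with its own queue, which is why it doesn't help when a single shared queue is the bottleneck:

cluster = require 'cluster'
os = require 'os'

if cluster.isMaster
  # Fork one worker per CPU core.
  cluster.fork() for cpu in os.cpus()
else
  # Each worker runs its own server (and therefore its own, separate queue).
  startServer()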
Edit 3:
I tried to return a stream without closing it, but no success either. I'm a bit desperate; maybe I'll just forward people to another server where they can download their file.
Upvotes: 2
Views: 247
Reputation: 289
Node.js, like JavaScript in your browser, is single-threaded. That means it can process only one continuous block of code at a time. In your example (you haven't supplied the code for your queue processing, so I assume the function you pass in is the actual processing code), since that function apparently doesn't do any I/O (that is, it doesn't block waiting for something from an external source like the file system or a database), its logic must run to completion before Node.js can start processing another event.
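If that's the case, one way to let other requests through is to split the synchronous work into chunks and yield back to the event loop between chunks, for example with setImmediate. A sketch (doChunkOfWork is a made-up placeholder for one bounded slice of the real work):

# Process a long job in slices so the event loop can handle
# other requests between slices.
processJob = (job, done) ->
  step = ->
    finished = doChunkOfWork job  # returns true when the job is complete
    if finished
      done null
    else
      setImmediate step           # yield to the event loop, then continue
  step()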
If you need more details, here's the first link I found about how Node.js works in a concurrent environment: http://blog.mixu.net/2011/02/01/understanding-the-node-js-event-loop/
Upvotes: 1