Reputation: 24886
I have built a JSON service that uses two other services, each of which has some latency, and as a result of this dependency each request can take up to 800 ms to finish.
One of the major reasons I went with Node.js was the evented IO. I want new requests to be handled smoothly, and not hang, while the slow requests are being finished.
How can I test the service and get some reliable metrics on how it will perform on Heroku? This is hard to figure out because I don't want to hit the query limits of the other services I depend on. How many simultaneous requests will my service accept? What can I do to maximize this number?
In addition, I'd like some feedback on whether Node.js is going to give me better performance in this scenario than, say, Rails.
(Oh, and... is Heroku a good choice for deploying a service? Sorry, I'm new to Node.)
Thanks a lot.
Upvotes: 1
Views: 145
Reputation: 10864
First, don't use an in-memory session store; use a database-, Memcached-, or Redis-backed session store so state can be shared across processes.
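A rough sketch of what a Redis-backed session store looks like, assuming an Express app with the express-session and connect-redis middleware (the connect-redis v6-style API; the environment variable names are placeholders):

```js
const express = require('express');
const session = require('express-session');
const RedisStore = require('connect-redis')(session); // connect-redis <= v6 style
const redis = require('redis');

const app = express();
// node_redis v3-style client; REDIS_URL is a placeholder for your connection string
const client = redis.createClient(process.env.REDIS_URL);

app.use(session({
  store: new RedisStore({ client }),                        // session data lives in Redis, not in process memory
  secret: process.env.SESSION_SECRET || 'dev-only-secret',  // placeholder secret
  resave: false,
  saveUninitialized: false,
}));

app.get('/', (req, res) => {
  // state survives restarts and is visible to every dyno/process
  req.session.hits = (req.session.hits || 0) + 1;
  res.json({ hits: req.session.hits });
});

app.listen(process.env.PORT || 3000);
```

Because the state lives in Redis, you can run as many web processes as you need without sticky sessions.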
Second, move delayed work into a separate worker process and use a job queue to add and fetch those delayed jobs. A Redis-backed queue such as kue works well; other messaging queues are good alternatives.
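Something along these lines, using kue as suggested (the `aggregate` job type and the upstream-call helper are made up for illustration):

```js
const kue = require('kue');
const queue = kue.createQueue(); // connects to Redis (localhost by default)

// Hypothetical stand-in for the two slow upstream calls (~800 ms total).
function callUpstreamServices(data, cb) {
  setTimeout(() => cb(null, { ok: true, input: data }), 800);
}

// In the web process: enqueue the slow work instead of doing it inline.
function enqueueAggregate(params, cb) {
  queue.create('aggregate', params)
    .attempts(3)   // retry if an upstream call fails
    .save(cb);
}

// In a separate worker process: pull jobs and make the slow calls there.
queue.process('aggregate', 5, (job, done) => {  // up to 5 jobs at a time
  callUpstreamServices(job.data, (err) => done(err));
});
```

The web dyno then responds immediately (for example with a job id the client can poll), and the worker dyno absorbs the 800 ms latency.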
Third, make sure your Node code stays non-blocking; process.nextTick is key.
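On that point, here is a small sketch of splitting a CPU-heavy step into chunks so the event loop can keep serving other requests in between. Note that setImmediate is used for the repeated re-scheduling, since a tight loop of process.nextTick calls would run before pending I/O; process.nextTick is still the right tool for deferring the final callback.

```js
// Sum a large array without blocking the event loop for the whole duration.
function sumLargeArray(items, cb) {
  let total = 0;
  let i = 0;

  (function chunk() {
    const end = Math.min(i + 1000, items.length); // handle 1000 items per pass
    for (; i < end; i++) total += items[i];

    if (i < items.length) {
      setImmediate(chunk);                        // yield so other requests can run
    } else {
      process.nextTick(() => cb(null, total));    // deliver the result asynchronously
    }
  })();
}
```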
You can test many simultaneous requests with Blitz, which is one of the add-ons available on Heroku.
Upvotes: 2