Reputation: 961
We all know that Node.js can handle far more concurrent requests than a typical Java server, but how does one maintain a FIFO kind of ordering for incoming requests?
Let's say there's a flash sale, where thousands of requests arrive in a short amount of time to buy something that is available in limited quantity.
How do we decide which request gets the product first and changes its status from available to sold out (just an example)?
Thanks
Upvotes: 3
Views: 2139
Reputation: 707328
In node.js, only one request actually runs at a time (the Javascript interpreter is single threaded) until you hit some sort of async operation in native code; then another request gets to start running while the first one is waiting for I/O. If there is some limited resource that all the requests are after, it is just a race to see which request gets far enough through your code to claim the resource. If you cluster your server for added scalability, then each clustered process runs its own single Javascript thread.
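To make that race concrete, here is a minimal sketch (my own illustration, assuming Express and an in-memory stand-in for a data store) of a naive check-then-write handler. Because the handler yields at each await, a second request can read "available" before the first one has written "sold_out":

```js
const express = require('express');
const app = express();

// Stand-in for a real data store; the async helpers simulate I/O.
const items = { widget: { status: 'available' } };
const getItem = async (id) => items[id];
const setStatus = async (id, status) => { items[id].status = status; };

app.post('/buy/:id', async (req, res) => {
  const item = await getItem(req.params.id);     // requests A and B can both read "available" here...
  if (item && item.status === 'available') {
    await setStatus(req.params.id, 'sold_out');  // ...so both can reach this line and think they won
    return res.json({ sold: true });
  }
  res.status(409).json({ sold: false, reason: 'sold out' });
});

app.listen(3000);
```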
If you made each incoming request wait in a queue until every request that came before it was completely done (something that could be done), you'd seriously wreck the scalability of your node.js server. Most of the time it would be sitting idle waiting for some I/O operation to finish, so that seems unlikely to be the correct design.
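For illustration only (this is the design being warned against, not a recommendation), serializing requests usually looks like a promise chain that runs one task at a time, so every request waits on all prior I/O:

```js
// Naive serial queue: each task starts only after every earlier task settles.
let queueTail = Promise.resolve();

function enqueue(task) {
  const result = queueTail.then(task);   // run task after everything already queued
  queueTail = result.catch(() => {});    // keep the chain alive even if a task rejects
  return result;                         // caller can await its own task's outcome
}

// e.g. inside a handler: const outcome = await enqueue(() => handlePurchase(req.params.id));
```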
Let's say there's a flash sale, where thousands of requests arrive in a short amount of time to buy something that is available in limited quantity.
How do we decide which request gets the product first and changes its status from available to sold out (just an example)?
The usual scheme here is to just let the first request that gets far enough to claim the resource have it (even though multiple requests may be in flight at the same time). Whether this is always the request that arrived at your server first is not going to be known, but it will be close, and the user community is unlikely to notice if it happened to be off by a few milliseconds just due to the variance in processing speed of two requests.
You will have to make sure your code that accesses shared resources (like databases) is safe for concurrency and does not make any troublesome assumptions about shared data.
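One common way to do that is to push the decision into the database with a single conditional update, so only one request can flip the status. A sketch, assuming PostgreSQL via node-postgres (the products table and its columns are made up for illustration):

```js
const { Pool } = require('pg');
const pool = new Pool(); // connection settings come from the usual PG* environment variables

async function tryToClaim(productId) {
  // The database applies this UPDATE atomically, so at most one request matches the row.
  const result = await pool.query(
    `UPDATE products
        SET status = 'sold_out'
      WHERE id = $1
        AND status = 'available'`,
    [productId]
  );
  return result.rowCount === 1; // true only for the single winning request
}

// In the route handler: if (await tryToClaim(id)) { /* success */ } else { /* already sold out */ }
```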
Upvotes: 6