Reputation: 43884
I have an "online now" feature that requires me to set a field in my database, and I have integrated it into fetching notification updates. As such, this is being done via long polling (short polling isn't much better, and long polling results in fewer connections to the server).
I used to do this in PHP, but as those of you familiar with PHP will understand, PHP exhausts its available connections quite quickly, even under FPM.
So I turned to Node.js, which is supposed to be able to handle thousands, if not millions, of concurrent connections. But the more I look into it, the more it seems Node.js handles these via event-based programming, which of course has massive benefits.
That is fine for chat apps and the like, but what about an "online now" feature that I have integrated into long polling to mark that a user is still online?
Would Node.js still get saturated quickly, or can it actually handle all of these open connections?
Upvotes: 0
Views: 576
Reputation: 3137
Long polling will eat up some of your connection pool, so be sure to set your ulimit high if you're on a Linux or Unix variety.
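For example, you can check and raise the per-process open-file limit for the current shell session (each open socket consumes one descriptor; the exact value you need depends on your traffic, and raising the soft limit to the hard limit is a simple starting point):

```shell
# Show the current soft limit on open file descriptors
ulimit -n

# Raise the soft limit to the hard limit for this shell session;
# persistent changes belong in /etc/security/limits.conf or your
# service manager's configuration instead
ulimit -n "$(ulimit -Hn)"
```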
Ideally you'll maintain state in something like Memcached or Redis; Redis would be the preferred approach. With Redis you subscribe to a pub/sub channel, and every time the user's state updates you publish an event. That triggers a handler, which causes your long poll to respond with the updated status(es). This is typically preferred to scheduling and much cleaner, and as long as you're not looping or otherwise blocking Node's thread of execution, you shouldn't see any problems.
As you're already using a PHP stack, it might be preferable not to move away from it. PHP's paradigm (more so php-fpm's) starts a process per connection, and those processes are set to time out, so long polling isn't really an option there.
Short polling on an interval can update the state on the front-end. Since you specified that you're using a cronjob, it might be cleaner to just hold the state in memory on the front-end and update it periodically.
This should work, although it may force you to scale earlier, as each user will be sending n more requests. Still, it's probably the easiest approach, and you're not adding unnecessary complexity to your stack.
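A short-polling loop on the front-end is only a few lines. In this sketch, `fetchStatus` and `onUpdate` are placeholders for your own request function and UI update, and the 30-second interval is an assumption:

```javascript
// Minimal short-polling sketch: call `fetchStatus` every `intervalMs`
// and hand the result to `onUpdate`. Returns a function that stops
// the polling (for page teardown).
function startPolling(fetchStatus, onUpdate, intervalMs = 30000) {
  const timer = setInterval(async () => {
    try {
      onUpdate(await fetchStatus());
    } catch (err) {
      // Swallow transient network errors; the next tick simply retries.
    }
  }, intervalMs);
  return () => clearInterval(timer);
}
```

On a page you might call something like `startPolling(() => fetch('/status').then(r => r.json()), render)` (endpoint name hypothetical) and invoke the returned stop function when the user navigates away.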
Adding WebSockets for such a simple feature is likely overkill. WebSockets themselves can only hold a limited number of connections (depending on your host and configuration), so you're not really solving any of the issues that long polling presents. If you don't plan to use WebSockets for more than maintaining user state, you're adding another technology to your stack to solve a simple problem.
Upvotes: 2