Reputation: 1067
I've been planning out a Rails RSS aggregator lately, and I ran into something I could use some advice on. The part that polls and parses users' subscribed feeds needs to run constantly, so I assume a daemon is the best option. (I would use the Daemons gem and have the daemon periodically query the database for feeds in need of refreshing, then use Feedzirra to parse and save items.)
My question is: how would the daemon share the models and migrations from Rails, especially if the daemon were running on another server, should the app need that for scalability? (i.e. a database server, a feed crawler server, and multiple front-end instances) I'm probably falling victim to "premature scaling," but as a Ruby newbie I'm interested in the best way to handle this, for the sake of doing it the right way the first time.
Or am I going about this the wrong way?
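For reference, here's a rough sketch of the daemon I have in mind (the Feed/Item models, their columns, and the 15-minute refresh window are all placeholders I made up). It boots the full Rails environment so the models are available, which works while the script lives inside the app; I'm not sure what the equivalent would be on a separate crawler server:

```ruby
#!/usr/bin/env ruby
# script/feed_poller.rb -- rough sketch; Feed/Item models and columns are assumed
require 'daemons'

Daemons.run_proc('feed_poller') do
  # Load the full Rails environment so the ActiveRecord models are available
  # (path assumes this script sits in the app's script/ directory).
  require File.expand_path('../../config/environment', __FILE__)
  require 'feedzirra'

  loop do
    # Assumed schema: feeds(url, fetched_at) with has_many :items (guid, title, url)
    Feed.where('fetched_at IS NULL OR fetched_at < ?', 15.minutes.ago).find_each do |feed|
      parsed = Feedzirra::Feed.fetch_and_parse(feed.url)
      next unless parsed.respond_to?(:entries) # fetch_and_parse returns a status code on failure

      parsed.entries.each do |entry|
        feed.items.where(guid: entry.entry_id).first_or_create do |item|
          item.title = entry.title
          item.url   = entry.url
        end
      end
      feed.update_attribute(:fetched_at, Time.now)
    end
    sleep 300 # check for stale feeds every five minutes
  end
end
```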
Upvotes: 1
Views: 203
Reputation: 10592
As @house9 pointed out, you should use DelayedJob for this (https://github.com/collectiveidea/delayed_job).
DJ loads the whole Rails environment and can run as a separate process, even on a separate server. That's the easiest way to go.
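For example, a minimal self-rescheduling job along the lines of what you described. The RefreshFeedJob name, the Feed/Item models, and the 15-minute interval are placeholders, and the enqueue options assume DJ 3.x:

```ruby
# app/jobs/refresh_feed_job.rb -- minimal sketch; names and schema are assumed
class RefreshFeedJob < Struct.new(:feed_id)
  def perform
    feed = Feed.find(feed_id)
    parsed = Feedzirra::Feed.fetch_and_parse(feed.url)

    if parsed.respond_to?(:entries) # fetch_and_parse returns a status code on failure
      parsed.entries.each do |entry|
        feed.items.where(guid: entry.entry_id).first_or_create do |item|
          item.title = entry.title
          item.url   = entry.url
        end
      end
      feed.update_attribute(:fetched_at, Time.now)
    end

    # Re-enqueue so this feed keeps refreshing itself periodically.
    Delayed::Job.enqueue RefreshFeedJob.new(feed_id), run_at: 15.minutes.from_now
  end
end

# Kick off the cycle once, e.g. when a feed is created:
#   Delayed::Job.enqueue RefreshFeedJob.new(feed.id)
```

A worker started with `rake jobs:work` only needs the app code and a database connection, so it can run on a completely separate server from the front-end, which covers the scaling setup you described.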
Upvotes: 1