Neil Middleton

Reputation: 22238

Using statistical tables with Rails

I'm building an app that needs to store a large number of events that users carry out. (Think LOTS, as in millions per month.)

I need to report on these events (total of type x in the last month, etc.) and need something resilient and fast.

I've toyed with Redis and the like to store aggregates of the data, but that could just leave me with a massive store of single-figure aggregates that can't be rebuilt.

Whilst this isn't a bad solution, I'm looking at storing the raw event data in tables that I can query as needed, and potentially generating aggregate counters on a periodic basis. This would give me the ability to add counters over time, and also to carry out ad-hoc inspections of what is going on, something which aggregates alone don't allow.
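The raw-events-plus-periodic-rollup idea can be sketched framework-free; the event shape here (a type plus a timestamp) and the per-day grouping are assumptions, not anything Rails-specific:

```ruby
require "time"
require "date"

# Each raw event is just a type plus a timestamp; a periodic job would
# roll recent raw rows up into one counter per (type, day) pair.
Event = Struct.new(:event_type, :created_at)

# Roll up raw events at or after `since` into { [type, date] => count }.
def rollup(events, since)
  events
    .select { |e| e.created_at >= since }
    .group_by { |e| [e.event_type, e.created_at.to_date] }
    .transform_values(&:size)
end

events = [
  Event.new("signup", Time.parse("2013-01-10 09:00")),
  Event.new("signup", Time.parse("2013-01-10 14:00")),
  Event.new("login",  Time.parse("2013-01-09 08:00")),
]

rollup(events, Time.parse("2013-01-10 00:00"))
# the Jan 9 login falls outside the window, leaving {["signup", Jan 10] => 2}
```

Because the raw rows are kept, any new counter you dream up later can be backfilled by re-running a rollup like this over the stored events.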

The question is: what's the best way to do this? I obviously don't want to have to create a model for each table (which is what Rails would prefer), so do I just create the tables and interact with them via raw SQL as needed, or is there some other option for dealing with this sort of data?
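For context, the raw-SQL route being considered might look like this (a sketch; the `events` table and its columns are hypothetical, and the ActiveRecord calls are shown as comments since they only run inside a Rails app):

```ruby
# Two ways to hit an events table without a full-blown model per table.
#
# 1) A thin ActiveRecord model pointed at whichever table you need:
#
#   class Event < ActiveRecord::Base
#     self.table_name = "events"  # can point at a partition/archive table
#   end
#
# 2) No model at all, raw SQL through the connection:
#
#   count = ActiveRecord::Base.connection.select_value(
#     "SELECT COUNT(*) FROM events WHERE event_type = 'signup'")

# The SQL itself is framework-independent; building it is plain Ruby:
def monthly_count_sql(table)
  "SELECT event_type, COUNT(*) AS total FROM #{table} " \
  "WHERE created_at >= :since GROUP BY event_type"
end

puts monthly_count_sql("events")
```

The thin-model variant keeps ActiveRecord conveniences (scopes, connection pooling) without committing to one model class per physical table.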

Upvotes: 0

Views: 115

Answers (1)

Morg.

Reputation: 701

I've worked on an app that had that type of data flow, and the solution is the following:

-> store everything -> create aggregates -> delete everything after a short period (1 week or something) to free up resources

So you can simply store events with Rails, create the aggregates in the background with another fast script (cron + SQL), read the aggregates with Rails, and run yet another background script for raw event deletion.
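The two background steps (aggregate, then purge) could be wired up roughly like this. A sketch only: the table names, the one-week retention window, and the Postgres-flavoured `INTERVAL` syntax are all assumptions, and a cron job would feed these statements to the database outside of Rails.

```ruby
# Step 2 (cron + SQL): fold raw rows into a daily_counts table.
ROLLUP_SQL = <<~SQL
  INSERT INTO daily_counts (event_type, day, total)
  SELECT event_type, DATE(created_at), COUNT(*)
  FROM events
  WHERE created_at >= :since
  GROUP BY event_type, DATE(created_at)
SQL

# Step 3 (separate cron job): prune raw events older than the window,
# once they're safely represented in daily_counts.
def purge_sql(days_kept)
  "DELETE FROM events WHERE created_at < NOW() - INTERVAL '#{days_kept} days'"
end

puts ROLLUP_SQL
puts purge_sql(7)
```

Keeping the rollup and the purge as two separate jobs means a failed rollup never races the deletion of rows it hasn't counted yet.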

Also... Rails and performance don't usually go hand in hand ;)

Upvotes: 1
