bergy

Reputation: 1485

Do dynamic tables make sense for my use case? Database/Model architecture

My Rails app lets users set up a data feed (typically a REST API) and pulls in results at specific intervals so the user can later filter/sort/chart/export the data. An example could be pulling a stock price every 15 minutes and saving its value and a timestamp as a row in a table.
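For concreteness, a pull like that could look roughly like this as a recurring job (the job, model, and payload shape here are hypothetical, just to illustrate the setup):

    # Hypothetical recurring job (e.g. scheduled every 15 minutes):
    # fetch one value from the feed's REST endpoint and save it as a row.
    require "net/http"
    require "json"

    class PullFeedJob < ApplicationJob
      def perform(feed)
        body  = Net::HTTP.get(URI(feed.url))   # the feed's REST API
        value = JSON.parse(body)["price"]      # payload shape is feed-specific
        FeedResult.create!(feed: feed, value: value, recorded_at: Time.current)
      end
    end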

Since there could be many users with many feeds set up, I'm trying to determine the best way to handle all of this data in Rails.

I feel like I should stay away from one large mega-table with a feed_id on each row, since it could reach millions of rows very quickly: 50 users with 5 feeds each, pulling every 15 minutes, would be about 24,000 rows per day (250 feeds × 96 pulls). Will this get unwieldy too quickly, or am I underestimating the power of Rails/Postgres? What is the limit?

Another option I came up with was giving each feed its own table: create a table when the feed is added and save its data there. In the discussions I've read, dynamic table creation seems to be frowned upon except in special circumstances, and I'm wondering if my case fits the mold.

The last option would be adding a second database, potentially a NoSQL store like MongoDB. I'd rather keep everything in one DB if possible, but if that really would yield the best performance and reliability, I'd give it a go.

I would love to hear people's experience and opinions in tackling something like this with Rails.

Upvotes: 1

Views: 78

Answers (1)

Boris Schegolev

Reputation: 3701

About 24,000 rows per day works out to roughly 9 million rows per year, which keeps you well within PostgreSQL's limits for many years. Stock prices are mostly numeric, so if I were you, I'd use one simple SQL table for all of this data. Just avoid extra-wide rows (long text columns) and you should be fine.
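A minimal sketch of what that single table could look like as a Rails migration (the table and column names are my own, not from the question):

    # db/migrate/20240101000000_create_feed_results.rb
    # One row per pull, for every feed, in a single table.
    class CreateFeedResults < ActiveRecord::Migration[7.0]
      def change
        create_table :feed_results do |t|
          t.references :feed, null: false, foreign_key: true, index: false
          t.decimal :value, precision: 12, scale: 4  # numeric payload, e.g. a stock price
          t.datetime :recorded_at, null: false       # when the value was pulled
        end

        # Most queries filter by feed and time range, so index both together
        # (this also covers lookups by feed_id alone).
        add_index :feed_results, [:feed_id, :recorded_at]
      end
    end

Reads then become index range scans, e.g. FeedResult.where(feed: feed, recorded_at: from..to), which should stay fast even at tens of millions of rows.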

In the future you could extend this with partitioning (e.g. monthly or yearly partitions) or move older data into an archive.
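If you do reach that point, PostgreSQL 10+ supports declarative range partitioning; Rails has no built-in DSL for it, so a migration would drop to raw SQL. A rough sketch (names hypothetical, matching the table above):

    # Hypothetical migration: a range-partitioned variant of feed_results
    # (requires PostgreSQL 10+).
    class PartitionFeedResults < ActiveRecord::Migration[7.0]
      def up
        execute <<~SQL
          CREATE TABLE feed_results_partitioned (
            id          bigserial,
            feed_id     bigint      NOT NULL,
            value       numeric(12, 4),
            recorded_at timestamptz NOT NULL
          ) PARTITION BY RANGE (recorded_at);

          -- One partition per year; create new ones ahead of time
          -- (or from a scheduled job).
          CREATE TABLE feed_results_2024 PARTITION OF feed_results_partitioned
            FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');
        SQL
      end

      def down
        execute "DROP TABLE feed_results_partitioned;"
      end
    end

Queries constrained on recorded_at then only touch the matching partitions, and archiving a year of data becomes a cheap DETACH PARTITION or DROP TABLE on that partition.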

Upvotes: 1
