AoumiS

Reputation: 35

One Large MySQL Table Vs 20 - 50 Smaller Tables?

I am starting a project where the performance (speed) of each query result is very important.

The Situation:

Output 1: All Products, Current Day Pricing, Price Difference, Percentage Difference, Previous Price

Output 2: Date, Price, Price Difference, Percentage Difference, Previous Price

Note: This project is built on ExpressionEngine.

My main question is whether I should keep all the product information in one table with a product ID, then store the daily pricing for all products in one table, the monthly pricing for all products in a second table, and so on for each type of pricing, since only one type of pricing will be used at a time.

Or should I make 20-30 tables, each storing its own daily pricing, and then use JOINs when querying?

I have looked into a lot of similar questions and answers, but I can't seem to find a close enough situation.

Upvotes: 2

Views: 190

Answers (4)

Neville Kuyt

Reputation: 29619

Firstly, it's not entirely clear what you're asking - are you asking whether you should denormalize your data, or whether you should partition your data?

Secondly, the answer to both questions is "no". Or, more accurately "No, please, please no, in the name of all that is holy - NO!".

Your database is going to be tiny - 50 products with 365 price changes per year is still only about 18K records. You're not even close to the point where a decently designed "traditional" database schema would be slow; both denormalization and partitioning create major maintenance overhead, make the application more brittle, and involve significant amounts of development time.

I'd strongly recommend building a normalized data model, populating it with lots of data (at least double your anticipated maximum), running the queries you expect to run, and then working out whether you even have a problem by measuring the performance of the queries. If you do, optimize the queries. If it's still too slow, buy bigger/better hardware. If it's still too slow, consider denormalizing. If that's too slow, you're working for Facebook.
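The measure-first approach above can be sketched quickly. A minimal example using Python's built-in sqlite3 as a stand-in for MySQL (the table and column names here are illustrative, not from the question): build the normalized schema, populate it with roughly double the anticipated volume, index, and time the actual query.

```python
import sqlite3
import time

# In-memory database stands in for MySQL; schema names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE daily_prices (
    product_id INTEGER REFERENCES products(id),
    price_date TEXT,
    price REAL)""")

# Populate well beyond the anticipated volume:
# 100 products x ~336 days = ~34K rows (vs. 20-50 products in the question).
cur.executemany("INSERT INTO products VALUES (?, ?)",
                [(i, f"product-{i}") for i in range(100)])
rows = [(i, f"2023-{m:02d}-{d:02d}", 10.0 + i)
        for i in range(100) for m in range(1, 13) for d in range(1, 29)]
cur.executemany("INSERT INTO daily_prices VALUES (?, ?, ?)", rows)

# Index the columns the lookups will filter on.
cur.execute("CREATE INDEX idx_prices ON daily_prices (product_id, price_date)")

# Measure the query you actually intend to run.
start = time.perf_counter()
cur.execute("SELECT price FROM daily_prices "
            "WHERE product_id = ? AND price_date = ?",
            (42, "2023-06-15"))
result = cur.fetchone()
elapsed = time.perf_counter() - start
print(result, f"{elapsed * 1000:.3f} ms")
```

At this scale the indexed lookup returns effectively instantly, which is the point: measure before restructuring anything.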

Upvotes: 0

Bill

Reputation: 141

Output 1. Any time you do a full table scan to return all rows from the database, it can become time-consuming. However, 20-50 products is chump change and the database should not have a problem. If you begin to add a lot of products, you can always introduce pagination and display 100 or so products per page. A second approach is to just cache the home page for a set amount of time and refresh it periodically.

Output 2. As long as you have product IDs or titles indexed, individual product lookups should be very quick.

20-30 tables sounds like hell to maintain, and doing that many joins would be a nightmare. Don't try to optimize prematurely. Only focus on it when you identify a problem.
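The pagination idea above is a one-line change to the query. A hedged sketch, again with sqlite3 standing in for MySQL and illustrative names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO products VALUES (?, ?)",
                [(i, f"product-{i}") for i in range(250)])

PAGE_SIZE = 100

def fetch_page(page):
    """Return one page of products; `page` is 1-based."""
    # OFFSET skips earlier pages. Fine at this scale; for very large
    # tables, keyset pagination (WHERE id > last_seen) scales better.
    cur.execute("SELECT id, name FROM products ORDER BY id LIMIT ? OFFSET ?",
                (PAGE_SIZE, (page - 1) * PAGE_SIZE))
    return cur.fetchall()

print(len(fetch_page(1)))  # first 100 products
print(len(fetch_page(3)))  # the remaining 50 of 250
```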

Upvotes: 1

NeverendeR

Reputation: 47

Adopt a parent-child model and identify the proper relationships between your tables. How you handle your data will also depend on what software you use in your backend.

Upvotes: 0

Adriaan Stander

Reputation: 166356

I will always lean towards the single table approach rather than a set of tables.

You should use indexing on the table, which will greatly improve performance.

The overhead of maintaining your own hand-rolled table partitioning far outweighs that of the single-table approach.

You should have a single table per "entity".

So a table for static product data, a table for pricing data, etc.
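With a single pricing table per entity, the "previous price / difference / percentage" columns the question lists don't even need to be stored: they can be derived at query time with a window function (available in MySQL 8.0+ and SQLite 3.25+). A minimal sketch with sqlite3 standing in for MySQL, using illustrative names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE daily_prices (
    product_id INTEGER,
    price_date TEXT,
    price REAL)""")
cur.executemany("INSERT INTO daily_prices VALUES (?, ?, ?)", [
    (1, "2023-01-01", 100.0),
    (1, "2023-01-02", 110.0),
    (1, "2023-01-03", 99.0),
])

# LAG() pulls the previous row's price within each product's history,
# so difference and percentage are computed, not stored.
cur.execute("""
    SELECT price_date,
           price,
           price - LAG(price) OVER w            AS price_diff,
           (price - LAG(price) OVER w) * 100.0
               / LAG(price) OVER w              AS pct_diff
    FROM daily_prices
    WHERE product_id = 1
    WINDOW w AS (PARTITION BY product_id ORDER BY price_date)
""")
history = cur.fetchall()
for row in history:
    print(row)  # first row has NULL diffs (no previous price)
```

Keeping these values derived rather than stored avoids the update anomalies that come with duplicating computable data.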

Upvotes: 0
