SimoneG

Reputation: 43

Refactoring DB with surrogate keys

We have a database with around 800 tables and some 4,000 stored procedures and functions, with heavy use of composite keys. The question is: would you change, piece by piece, the tables and all the procedures/functions to use surrogate keys instead of composite keys? Thanks for any suggestions.

The reasons for this are slowness in retrieving data and slowness in rebuilding indexes. But I'm not sure whether these are good reasons...

Upvotes: 0

Views: 91

Answers (1)

simon at rcl

Reputation: 7344

I don't think there is an easy answer here. (I am assuming that you have unique indexes - probably the PK - on all tables.) My approach would be to take a couple of identifiably slow queries, look at their query plans, and try to speed them up without changing table structures, save adding indexes where they seem useful.
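
For example (assuming SQL Server, since the platform isn't stated, and using hypothetical table and column names), you can turn on I/O and timing statistics before running a suspect query and capture the actual execution plan alongside it:

    -- Report logical reads and elapsed time for the statements that follow
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    -- Run the slow query; in SSMS, enable "Include Actual Execution Plan"
    -- (Ctrl+M) to see where the time goes
    SELECT o.OrderID, o.OrderDate, c.CustomerName   -- hypothetical schema
    FROM dbo.Orders AS o
    JOIN dbo.Customers AS c
        ON c.CustomerID = o.CustomerID
    WHERE o.OrderDate >= '2023-01-01';

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;

The logical-read counts give you a repeatable before/after measure, which matters more than wall-clock time on a busy server.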

If this isn't producing results - i.e. your current structure is as good as you can make it but it's still not good enough - then experiment with surrogate PKs on the tables in those queries, adding the new PKs as FKs in the referring tables. Then repeat your tests and see what difference it makes.
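
A minimal sketch of that experiment, again assuming SQL Server and entirely hypothetical table and constraint names (a real migration also has to deal with any existing FKs that reference the old composite PK):

    -- Add a surrogate key to a table currently keyed on (OrderID, LineNo)
    ALTER TABLE dbo.OrderLine
        ADD OrderLineID INT IDENTITY(1,1) NOT NULL;

    -- Keep the old composite key enforced as a unique constraint,
    -- and make the new surrogate column the primary key
    ALTER TABLE dbo.OrderLine
        DROP CONSTRAINT PK_OrderLine;           -- hypothetical constraint name
    ALTER TABLE dbo.OrderLine
        ADD CONSTRAINT UQ_OrderLine_Composite UNIQUE (OrderID, LineNo),
            CONSTRAINT PK_OrderLine PRIMARY KEY (OrderLineID);

    -- Give a referring table the surrogate as its FK
    ALTER TABLE dbo.Shipment
        ADD OrderLineID INT NULL,
            CONSTRAINT FK_Shipment_OrderLine
                FOREIGN KEY (OrderLineID) REFERENCES dbo.OrderLine (OrderLineID);

    -- Backfill the new FK from the existing composite relationship
    UPDATE s
    SET s.OrderLineID = ol.OrderLineID
    FROM dbo.Shipment AS s
    JOIN dbo.OrderLine AS ol
        ON ol.OrderID = s.OrderID
       AND ol.LineNo = s.LineNo;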

Making queries faster is a bit of a black art. My approach is to start with the query plan and look at anything that involves a table scan or a full index scan (if the table only has 100 rows, ignore it), and then try to restructure the query so that it has some way to limit the number of rows it must search. This can be quite difficult: you have to understand the data and the relationships very well, and be quite knowledgeable about SQL and the different ways you can write selects, sub-selects, and joins.
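
One common rewrite of that kind (hypothetical table, but the pattern is general): get a function off the indexed column so the predicate becomes sargable and the optimizer can seek instead of scan:

    -- Non-sargable: the function on the column forces a scan
    SELECT *
    FROM dbo.Orders
    WHERE YEAR(OrderDate) = 2023;

    -- Sargable rewrite: a range predicate on the bare column
    -- can use an index on OrderDate to seek, not scan
    SELECT *
    FROM dbo.Orders
    WHERE OrderDate >= '2023-01-01'
      AND OrderDate <  '2024-01-01';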

That's all a bit waffly, but I don't think there is a single hard-and-fast answer to this situation.

Upvotes: 1
