Reputation: 754418
I've been asked to troubleshoot performance problems in a SQL Server 2005 database.
The challenge is not a huge amount of data, but the huge number of tables. There are more than 30,000 tables in a single database. The total data size is about 650 GB.
I don't have any control over the application that creates all those tables. The application uses roughly 2,500 tables per "division" at a large company with 10-15 divisions.
How do you even start to check for performance problems? All the articles you find on VLDBs (Very Large Databases) are about the amount of data, not the number of tables.
Any ideas? Pointers? Hints?
Upvotes: 2
Views: 822
Reputation: 34909
As others have noted, the number of tables is probably indicative of a bad design, but it is far from a slam dunk that it is the source of the performance problems.
The best advice I can give you for any performance optimization is to stop guessing about the source of the problem and go look for it. Above all else, don't start optimizing until you have positively identified the source of the problem.
I'd start by running some traces against the database to identify the poorly performing queries. That would also tell you which tables the application actually uses the most. In all likelihood, many of those tables are either: A) leftover temp tables; B) no longer used; or C) working tables someone didn't clean up.
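To separate the live tables from the dead ones without tracing every query, the index-usage DMV introduced in SQL Server 2005 can help. A sketch — note that sys.dm_db_index_usage_stats is cleared on every instance restart, so only trust it after the server has been up through a representative workload:

```sql
-- Tables with no recorded reads or writes since the last restart.
-- Caveat: sys.dm_db_index_usage_stats resets when SQL Server restarts.
SELECT t.name AS table_name
FROM sys.tables AS t
WHERE NOT EXISTS (
    SELECT 1
    FROM sys.dm_db_index_usage_stats AS u
    WHERE u.object_id = t.object_id
      AND u.database_id = DB_ID()
      AND (u.user_seeks > 0 OR u.user_scans > 0
           OR u.user_lookups > 0 OR u.user_updates > 0)
)
ORDER BY t.name;
```

Tables that never show up here over a long uptime are good candidates for the "leftover/abandoned" buckets above.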
Upvotes: 3
Reputation: 300539
Putting the poor DB design aside, if no users are reporting slow response times then you don't currently have a performance problem.
If you do have a performance problem:
1) Check for fragmentation (DBCC SHOWCONTIG)
2) Check the hardware specs and RAID/drive/file placement, and check the SQL Server error logs. If the hardware seems underspecified or poorly laid out, capture performance counters (see the PAL tool)
3) Gather trace data during a normal query work load and identify expensive queries (see this SO answer: How Can I Log and Find the Most Expensive Queries?)
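If running a full trace is too heavy for step 3, the plan-cache statistics DMVs (SQL Server 2005+) give a quick first cut. A sketch — keep in mind it only reflects plans still in the cache, not everything that has ever run:

```sql
-- Top 20 statements by total elapsed time currently in the plan cache.
SELECT TOP 20
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,
    qs.execution_count,
    qs.total_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;
```

Ordering by total_logical_reads instead of elapsed time is another useful cut when you suspect I/O rather than CPU.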
Upvotes: 0
Reputation: 1482
Is the software creating all these tables? If so, the same mistakes may be repeated over and over. Do all the tables have a primary key? Do they all have a clustered index? Are all the necessary non-clustered indexes present (on the columns used for filtering and joins), and so on?
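With 30,000 tables you can't eyeball those checks, but the catalog views make them mechanical. A sketch that lists heaps (tables with no clustered index) and tables missing a primary key:

```sql
-- Tables stored as heaps (index_id = 0 means no clustered index exists).
SELECT t.name AS heap_table
FROM sys.tables AS t
JOIN sys.indexes AS i
  ON i.object_id = t.object_id AND i.index_id = 0;

-- Tables with no primary key constraint.
SELECT t.name AS table_without_pk
FROM sys.tables AS t
WHERE NOT EXISTS (
    SELECT 1 FROM sys.key_constraints AS kc
    WHERE kc.parent_object_id = t.object_id
      AND kc.type = 'PK'
);
```

If the application really does repeat the same schema per division, a problem found in one division's 2,500 tables is likely repeated in all the others.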
Is upgrading to SQL Server 2008 an option? If so, you could take advantage of the new Policy-Based Management feature to enforce best practices across this large number of tables.
To start tuning now, I would use Profiler to find the statements with the longest duration, then see what you can do to improve them (adding indexes is usually the simplest fix).
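Alongside Profiler, the missing-index DMVs (new in SQL Server 2005) surface index candidates from the optimizer's own bookkeeping. A sketch — treat the output as hints to review against the workload, not as indexes to create blindly:

```sql
-- Missing-index suggestions ranked by the optimizer's estimated benefit.
SELECT TOP 20
    gs.avg_total_user_cost * gs.avg_user_impact
        * (gs.user_seeks + gs.user_scans) AS estimated_benefit,
    d.statement AS table_name,
    d.equality_columns,
    d.inequality_columns,
    d.included_columns
FROM sys.dm_db_missing_index_group_stats AS gs
JOIN sys.dm_db_missing_index_groups AS g
  ON g.index_group_handle = gs.group_handle
JOIN sys.dm_db_missing_index_details AS d
  ON d.index_handle = g.index_handle
ORDER BY estimated_benefit DESC;
```

These stats also reset on restart, so, like the trace data, collect them after a period of normal workload.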
Upvotes: -1
Reputation: 161773
Start like any other kind of performance tuning. Among other things, you should not assume that the large number of tables constitutes a performance problem. It may be a red herring.
Instead, ask the users "what's slow?" Even if you measure performance (with Profiler, perhaps), your numbers might not match the perceived performance problem.
Upvotes: 6