Reputation: 9753
We have a DB application which continually runs a complex query with different parameters. It's a compiled (C#) stored procedure that calls 4 normal stored procedures, each joining 4 to 6 tables, with the C# code merging the results between calls.
This ran fine (fast) for 6 months until the database was restored. Then performance went from almost instant to 20+ seconds. Searching made it clear that UPDATE STATISTICS had to be run on the tables, and yes, it worked: instant query again.
But then the problem came back after a month. OK, no problem: UPDATE STATISTICS again, and it was fast again. Then it took only a week to come back, with the query slow out of the blue. Since then the problem has kept returning faster and faster, and at this point updating statistics does not help at all anymore.
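For reference, the statistics refresh described above would have been along these lines (the table name is a placeholder for the actual tables involved):

```sql
-- Rebuild statistics for one table from every row instead of a sample
-- (dbo.MyTable is a placeholder name):
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;

-- Or refresh out-of-date statistics for every table in the database:
EXEC sp_updatestats;
```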
How do you analyze this kind of problem? Is there something to know about compiled stored procedures and query plans in terms of performance?
Upvotes: 1
Views: 241
Reputation: 4001
You should go "divide et impera" :).
How are those 4 normal stored procedures performing outside the compiled SP? Can you run them individually? Does the execution plan for each of them look good? Are you using indexes in each of them? (Correct use of indexes should translate to Index Seek operations in the execution plan, ideally Clustered Index Seek operations.)
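One way to check each inner procedure on its own is to run it with I/O and timing statistics enabled (the procedure name and parameter below are placeholders):

```sql
-- Report logical reads and CPU/elapsed time for each call,
-- so the slow procedure stands out:
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.usp_InnerProc1 @Param1 = 42;   -- repeat for each of the 4 procedures

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

Enabling "Include Actual Execution Plan" in SSMS for the same runs shows whether each procedure is doing seeks or scans.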
If those look good then check out the compiled stored procedure. Look for any possible bottlenecks, especially when joining large amounts of data.
Personally I also agree that this could be fixed with the proper indexes in place.
When indexes are missing, SQL Server tries to optimize the execution of your code based on statistics. The stats can run out of sync with the data, and when they are not up to date, SQL Server can run your code inefficiently.
But when you define indexes, you are essentially telling SQL Server how to execute your query optimally.
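As a hypothetical example (table and column names invented for illustration): if the plan shows a scan on an Orders table filtered by CustomerId, a covering nonclustered index turns that into an Index Seek:

```sql
-- Seek on CustomerId; INCLUDE covers the selected columns so the
-- query never touches the base table:
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
    ON dbo.Orders (CustomerId)
    INCLUDE (OrderDate, TotalAmount);
```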
Upvotes: 2
Reputation: 783
Performance tuning should be handled in steps. First, tune the query against the existing tables and indexes: use proper joins, avoid wildcards, avoid unnecessary UNIONs, etc., guided by the execution plan. Once the query itself is sound, check whether creating further indexes on the relevant columns will boost performance. Keep in mind that new indexes may increase performance for SELECT queries but can reduce performance for UPDATE, INSERT, and DELETE.
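A small example of the wildcard point above (table and column names are made up): a leading wildcard forces a scan, while a trailing wildcard can still use an index on the column.

```sql
-- Not sargable: leading '%' means every row must be examined.
-- SELECT Id, Name FROM dbo.Customers WHERE Name LIKE '%smith%';

-- Sargable: a trailing wildcard can seek an index on Name.
SELECT Id, Name FROM dbo.Customers WHERE Name LIKE 'smith%';
```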
Upvotes: 0
Reputation: 4866
Assuming MSSQL: under Database Properties > Options, check that "Auto Update Statistics" is True. It would be rare for it to be set to False, but it is a possibility.
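You can also check the setting without opening the Properties dialog (the database name is a placeholder):

```sql
-- 1 in is_auto_update_stats_on means the option is True:
SELECT name, is_auto_update_stats_on
FROM sys.databases
WHERE name = 'MyDatabase';

-- Turn it back on if it was disabled:
ALTER DATABASE MyDatabase SET AUTO_UPDATE_STATISTICS ON;
```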
Upvotes: 0
Reputation: 7042
I think the tables need indexes. Create indexes on the tables. Use SQL Server Profiler and another tool that I can't remember that will analyze the queries and tell you whether index creation or optimization is needed.
Indexes should take care of slow queries. There are also other things in the query to look at, such as keeping the number of joined tables small and joining on primary key IDs.
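Independently of any tool, SQL Server itself records indexes the optimizer wished it had. This query against the missing-index DMVs lists the top candidates; treat the output as suggestions, not commands:

```sql
-- Highest-impact missing-index suggestions recorded since the
-- last server restart:
SELECT TOP (10)
       d.statement AS table_name,
       d.equality_columns,
       d.inequality_columns,
       d.included_columns,
       s.avg_user_impact
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
    ON d.index_handle = g.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
    ON g.index_group_handle = s.group_handle
ORDER BY s.avg_user_impact DESC;
```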
Upvotes: 3