Reputation: 2518
I have a performance issue.
We have a table that runs to a couple of million records, and we serve a web page that hits this table hundreds of times every second (a SELECT). The query is covered by a non-clustered index and wrapped in a stored procedure.
My question: for a specific scenario where we want to improve performance, would it be wise to create a view that always yields 300 rows for this scenario, index the view, and query that instead? Or will it make no difference compared to querying the existing 2M+ row table with the covered query?
Upvotes: 1
Views: 1425
Reputation: 13486
As I understand it, you are creating a view for a specific scenario that always yields 300 rows. If that is the case, it is a good way to handle this. Creating a covering index on the selective columns of the view should also improve performance.
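A minimal sketch of what an indexed view could look like, assuming a hypothetical dbo.Orders(OrderId, CustomerId, Status, Total) table with a Status column as the filter (all names are placeholders, not from the question):

```sql
-- Hypothetical schema: dbo.Orders(OrderId, CustomerId, Status, Total).
-- SCHEMABINDING is required before a view can be indexed.
CREATE VIEW dbo.vw_OpenOrders
WITH SCHEMABINDING
AS
SELECT OrderId, CustomerId, Total
FROM dbo.Orders
WHERE Status = 'Open';
GO

-- The first index on a view must be a unique clustered index; once it
-- exists, the ~300 qualifying rows are materialized and kept up to date.
CREATE UNIQUE CLUSTERED INDEX IX_vw_OpenOrders
ON dbo.vw_OpenOrders (OrderId);
GO
```

Note that on non-Enterprise editions the view generally has to be queried WITH (NOEXPAND) for the optimizer to use its index.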
Upvotes: 0
Reputation: 171178
You can create a view that matches your query as closely as possible and index it. As I understand it, you only want to apply a WHERE clause. This will work: it cuts the runtime cost of the WHERE to zero and removes all I/O for rows that don't qualify. This is a good idea.
However, you can just use a filtered (covering) index, which is much simpler.
Fetching a range from a covering index is as fast as it gets; there is no further improvement possible.
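A sketch of that alternative, again assuming a hypothetical dbo.Orders(OrderId, CustomerId, Status, Total) table (all names are illustrative):

```sql
-- A filtered, covering index: the WHERE clause restricts the index to
-- the ~300 qualifying rows, and INCLUDE adds the remaining columns of
-- the SELECT list so the base table is never touched.
CREATE NONCLUSTERED INDEX IX_Orders_Open
ON dbo.Orders (CustomerId)
INCLUDE (OrderId, Total)
WHERE Status = 'Open';
```

One caveat: if the filter column is supplied as a parameter inside the stored procedure, the optimizer may not match the filtered index; OPTION (RECOMPILE) on the statement is a common workaround.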
The real problem seems to be that you are querying the database hundreds of times per second. Can't you fold those queries into one or very few bigger queries? You can use table-valued parameters to submit multiple query inputs at once, so to speak.
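A sketch of that idea, with made-up names throughout (dbo.IdList, dbo.GetOrdersBatch, and dbo.Orders are all placeholders):

```sql
-- A table type that can carry all the keys the page needs in one round trip.
CREATE TYPE dbo.IdList AS TABLE (Id INT PRIMARY KEY);
GO

-- One call to this procedure replaces hundreds of single-row lookups.
CREATE PROCEDURE dbo.GetOrdersBatch
    @Ids dbo.IdList READONLY
AS
BEGIN
    SET NOCOUNT ON;
    SELECT o.OrderId, o.CustomerId, o.Total
    FROM dbo.Orders AS o
    JOIN @Ids AS i
        ON i.Id = o.OrderId;
END
GO
```

From the client, the application fills a DataTable (or the equivalent in its data access library) with the ids and passes it as a single structured parameter, so one round trip serves the whole page.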
Upvotes: 1