Prakash

Reputation: 555

Using temp tables in separate queries

I have a scenario where I have to insert into a table for thirteen different use cases. I am primarily getting the data from a "MainTable", and I only need the records with "IsLocal" = 1 from it.

I am contemplating whether I should query the table directly with the "IsLocal" condition in all thirteen use cases, or populate a temporary table with the records from "MainTable" where "IsLocal" = 1 and use that instead. Which would be the better option for me?

This "MainTable" is expected to have around 1 million records with significant portion of them having "IsLocal=1".

Upvotes: 1

Views: 840

Answers (2)

Shushil Bohara

Reputation: 5656

It's definitely better to store frequently used data in a temporary table and reuse it. In your case, store the data from MainTable for the condition IsLocal = 1. This avoids scanning the whole table again and again for the same set of data, so you should see a noticeable performance gain. In addition, I would suggest a few things while following this approach:

1- Store the data using the INTO clause instead of INSERT INTO -- it is typically much faster, since SELECT ... INTO a temp table is minimally logged:

SELECT a, b, c, ... INTO #tmp_main_table FROM main_table WHERE IsLocal = 1

2- Index the columns in #tmp_main_table that your thirteen queries filter or join on.
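For example (the index name and key column a are placeholders):

    CREATE NONCLUSTERED INDEX IX_tmp_main_table_a
    ON #tmp_main_table (a);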

Note: the temporary table consumes tempdb storage and brings maintenance concerns of its own, so be careful about that.

Upvotes: 0

gotqn

Reputation: 43666

It basically depends on your business logic, your infrastructure and the table definition itself.

If you are storing data in a temporary table, it is stored in tempdb. So the question is: can we afford to store that amount of data in tempdb without affecting general performance?

What's the amount of data? If you are just storing one million BIGINT values, we might be OK. But what if we are storing one million rows with many nvarchar(max) values?

How big is our tempdb, and is it on a RAM disk? How often is this temporary table going to be populated -- once per day, or hundreds of times every minute?
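As a quick sanity check, the current tempdb file sizes can be read from the system catalog (size is stored in 8 KB pages):

    SELECT name, type_desc, size * 8 / 1024 AS size_mb
    FROM tempdb.sys.database_files;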

You need to think about the questions above and implement a solution. Then, after a few days or weeks, you may find that it was not a good one and change it.


Without knowing your production environment details, I can only advise that you optimize your query using indexes. You are filtering by IsLocal = 1 -- this seems to be a good match for a filtered index (even if most of the rows have this value, we still eliminate some of them on read).
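A minimal sketch of such a filtered index (the index name is a placeholder):

    CREATE NONCLUSTERED INDEX IX_MainTable_IsLocal
    ON MainTable (IsLocal)
    WHERE IsLocal = 1;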

Also, if you are reading only a few of the columns from the table, you can try to create a covering index for your query by creating an index with included columns. Having an index with the columns we need plus the filtering predicate can optimize the query a lot. But you have to test this, as creating the perfect index is not an easy task.
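For example, assuming the queries only read three hypothetical columns a, b, and c, a filtered covering index could look like this:

    CREATE NONCLUSTERED INDEX IX_MainTable_IsLocal_Covering
    ON MainTable (IsLocal)
    INCLUDE (a, b, c)
    WHERE IsLocal = 1;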

Upvotes: 1
