Hellvetia

Reputation: 345

Performance Dynamic SQL vs Temporary Tables

I'm wondering whether copying an existing table into a temporary table results in worse performance than using dynamic SQL.

Concretely, I wonder whether I should expect a performance difference between the following two SQL Server stored procedures:

    CREATE PROCEDURE UsingDynamicSQL
    (
      @ID INT ,
      @Tablename VARCHAR(100)
    )
    AS
    BEGIN
        DECLARE @SQL VARCHAR(MAX)
        -- CAST the INT parameter; concatenating an INT directly into the
        -- string would raise a conversion error
        SELECT  @SQL = 'Insert into Table2 Select Sum(ValColumn) From '
                + @Tablename + ' Where ID=' + CAST(@ID AS VARCHAR(10))
        EXEC(@SQL)
    END

    CREATE PROCEDURE UsingTempTable
    (
      @ID INT ,
      @Tablename VARCHAR(100)
    )
    AS
    BEGIN
        CREATE TABLE #TempTable (ValColumn FLOAT, ID INT)
        DECLARE @SQL VARCHAR(MAX)
        SELECT  @SQL = 'Select ValColumn, ID From ' + @Tablename
                + ' Where ID=' + CAST(@ID AS VARCHAR(10))
        INSERT  INTO #TempTable
                EXEC ( @SQL );
        INSERT  INTO Table2
                SELECT  SUM(ValColumn)
                FROM    #TempTable;
        DROP TABLE #TempTable;
    END

I'm asking because I'm currently using a procedure built in the latter style: at the start I create many temporary tables as simple extracts of existing tables, and I then work with those temporary tables. Could I improve the performance of the stored procedure by getting rid of the temporary tables and using dynamic SQL instead? In my opinion the dynamic SQL version is a lot uglier to program, which is why I used temporary tables in the first place.

Upvotes: 0

Views: 1138

Answers (3)

Mike M

Reputation: 1425

I agree with everyone else that you always need to test both... I'm putting it in an answer here so it's clearer.

If you have an index setup that is perfect for the final query, going to temp tables could be nothing but extra work.

If that's not the case, pre-filtering to a temp table may or may not be faster.
You can predict it at the extremes - if you're filtering down from a million to a dozen rows, I would bet it helps.
But otherwise it can be genuinely difficult to know without trying.
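For illustration, a minimal sketch of the pre-filtering pattern (BigTable and the filter value are made-up stand-ins):

    DECLARE @ID INT = 42;   -- hypothetical test value

    -- Pre-filter the large table into a temp table...
    SELECT ValColumn, ID
    INTO #Filtered
    FROM BigTable           -- assumed: millions of rows
    WHERE ID = @ID;         -- assumed: highly selective, leaves a handful of rows

    -- ...so the follow-up work runs against a tiny row set.
    INSERT INTO Table2
    SELECT SUM(ValColumn)
    FROM #Filtered;

    DROP TABLE #Filtered;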

I agree with you that maintenance is also an issue, and lots of dynamic SQL is a maintenance cost to consider.

Upvotes: 0

paparazzo

Reputation: 45096

In the posted query the temporary table is an extra write.
It is not going to help.

Don't just time a query; look at the query plan.
If you run the two queries together, the query plan will tell you the cost split between them.
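For example, one way to see the split (a sketch; in SSMS you would also enable "Include Actual Execution Plan"):

    -- Per-statement I/O and timing are printed to the Messages tab,
    -- so the cost split between the two approaches becomes visible.
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    EXEC UsingDynamicSQL @ID = 42, @Tablename = 'Table1';   -- assumed test values
    EXEC UsingTempTable  @ID = 42, @Tablename = 'Table1';

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;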

And there is a difference between a table variable and a temp table:
the temp table is faster, because the query optimizer does more with a temp table.
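The declarations look alike, but only the temp table gets column statistics (a minimal sketch):

    -- Table variable: no statistics; the optimizer falls back to a fixed,
    -- very low row estimate.
    DECLARE @Rows TABLE (ValColumn FLOAT, ID INT);

    -- Temp table: statistics are created, so the optimizer can estimate
    -- row counts and choose better plans for later joins and aggregates.
    CREATE TABLE #Rows (ValColumn FLOAT, ID INT);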

A temporary table can help in a few situations:

  1. The output from a select is going to be used more than once.
    You materialize the output so it is only executed once.
    Where you see this is with an expensive CTE that is evaluated many times.
    People often falsely think a CTE is just executed once - no, it is just syntax (see the sketch after this list).

  2. The query optimizer needs help.
    An example: you are doing a self join on a large table with multiple conditions, and some of the conditions eliminate most of the rows.
    A query to a #temp can filter the rows and also reduce the number of join conditions.
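A sketch of point 1, with made-up table and column names - materializing an expensive aggregation once instead of re-evaluating a CTE at every reference:

    -- Hypothetical: BigTable and ValColumn are stand-ins.
    -- Materialize the expensive result once...
    SELECT ID, SUM(ValColumn) AS Total
    INTO #Totals
    FROM BigTable
    GROUP BY ID;

    -- ...then reference it as often as needed without re-running the aggregation.
    SELECT MAX(Total) FROM #Totals;
    SELECT AVG(Total) FROM #Totals;

    DROP TABLE #Totals;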

Upvotes: 0

Dave.Gugg

Reputation: 6771

Table variables suffer performance problems because the query optimizer always assumes there will be exactly one row in them. If you have table variables holding > 100 rows, I'd switch them to temp tables.

Using dynamic sql with EXEC(@sql) instead of exec sp_executesql @sql prevents the query plan from being parameterized and reused, which will probably hurt performance.

However, you are using dynamic sql in both queries. The only difference is that the second procedure has the unnecessary step of loading into a temp table first, then loading into the final table. Go with the first stored procedure you have, but switch to sp_executesql, as sketched below.
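As a sketch, the first procedure rewritten with sp_executesql - the procedure name is hypothetical, and QUOTENAME is added as a common guard, since the table name itself cannot be passed as a parameter:

    CREATE PROCEDURE UsingDynamicSQLCached   -- hypothetical name
    (
      @ID INT ,
      @Tablename SYSNAME
    )
    AS
    BEGIN
        -- sp_executesql requires NVARCHAR; @ID is passed as a real parameter,
        -- so the cached plan can be reused across calls with different IDs.
        DECLARE @SQL NVARCHAR(MAX) = N'Insert into Table2 Select Sum(ValColumn) From '
                + QUOTENAME(@Tablename) + N' Where ID = @ID';
        EXEC sp_executesql @SQL, N'@ID INT', @ID = @ID;
    END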

Upvotes: 1
