Tom

Reputation: 1234

TSQL Merge Performance

Scenario:

I have a table with roughly 24 million records. The table holds pricing history for individual customers and is recomputed daily, with on average 6 million records per day. Every morning the price list is generated and a merge statement is run to reflect the changes in pricing.

The process begins with the previous day's data being inserted into a table variable; that table variable is then merged into the actual table. The main problem is that the merge statement takes quite a long time.
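
For reference, the nightly process follows roughly this pattern (the table and column names below are placeholders, not my real schema):

    DECLARE @DailyPrices TABLE
    (
        CustomerID INT   NOT NULL,
        PriceDate  DATE  NOT NULL,
        Price      MONEY NOT NULL
    );

    -- Load the morning price run (placeholder source name)
    INSERT INTO @DailyPrices (CustomerID, PriceDate, Price)
    SELECT CustomerID, PriceDate, Price
    FROM dbo.GeneratedPriceList;

    MERGE dbo.PriceHistory AS target
    USING @DailyPrices AS source
        ON  target.CustomerID = source.CustomerID
        AND target.PriceDate  = source.PriceDate
    WHEN MATCHED AND target.Price <> source.Price THEN
        UPDATE SET target.Price = source.Price
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, PriceDate, Price)
        VALUES (source.CustomerID, source.PriceDate, source.Price);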

My real question centers on the performance of a table variable vs. a physical table vs. a temp table. What is the best practice for large merges like this?

Upvotes: 0

Views: 1899

Answers (3)

acermate433s

Reputation: 2554

Check if you have indexes on the tables. Indexes are updated every time you add/delete records on that table.

Try removing the indexes before merging the records and then re-creating them after the merge.
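
A rough sketch of the idea, using placeholder index and table names (the real definitions would come from your existing table):

    -- Drop the nonclustered index before the large merge (placeholder names)
    DROP INDEX IX_PriceHistory_CustomerDate ON dbo.PriceHistory;

    -- ... run the MERGE here ...

    -- Re-create the index once the data is in place
    CREATE NONCLUSTERED INDEX IX_PriceHistory_CustomerDate
        ON dbo.PriceHistory (CustomerID, PriceDate);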

Upvotes: 0

HLGEM

Reputation: 96658

Temp tables often perform better than table variables for large data sets. Additionally, you can put the data into the temp table and then index it.
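
For example (object names here are placeholders for your staging data):

    CREATE TABLE #DailyPrices
    (
        CustomerID INT   NOT NULL,
        PriceDate  DATE  NOT NULL,
        Price      MONEY NOT NULL
    );

    INSERT INTO #DailyPrices (CustomerID, PriceDate, Price)
    SELECT CustomerID, PriceDate, Price
    FROM dbo.GeneratedPriceList;

    -- Index the temp table so the merge can seek on the join keys instead of scanning
    CREATE CLUSTERED INDEX IX_DailyPrices ON #DailyPrices (CustomerID, PriceDate);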

Upvotes: 0

gbn

Reputation: 432742

Thoughts

I'd consider a temp table: these have statistics, which will help. A table variable is always assumed to have one row. Also, the IO can be shunted onto separate drives (assuming you have tempdb on separate storage).

If a single transaction is not required, I'd also split the MERGE into a DELETE, UPDATE, INSERT sequence to reduce the amount of work needed in each action (which reduces the amount of rollback info needed, the amount of locking, etc.).
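
A rough sketch of the split, assuming a #DailyPrices staging table and placeholder column names:

    -- Remove rows for the loaded dates that are no longer in the new price list
    DELETE t
    FROM dbo.PriceHistory AS t
    WHERE t.PriceDate IN (SELECT PriceDate FROM #DailyPrices)
      AND NOT EXISTS (SELECT 1
                      FROM #DailyPrices AS s
                      WHERE s.CustomerID = t.CustomerID
                        AND s.PriceDate  = t.PriceDate);

    -- Update rows whose price changed
    UPDATE t
    SET    t.Price = s.Price
    FROM   dbo.PriceHistory AS t
    JOIN   #DailyPrices     AS s
      ON   s.CustomerID = t.CustomerID
     AND   s.PriceDate  = t.PriceDate
    WHERE  t.Price <> s.Price;

    -- Insert rows that are new
    INSERT INTO dbo.PriceHistory (CustomerID, PriceDate, Price)
    SELECT s.CustomerID, s.PriceDate, s.Price
    FROM   #DailyPrices AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM dbo.PriceHistory AS t
                       WHERE t.CustomerID = s.CustomerID
                         AND t.PriceDate  = s.PriceDate);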

Upvotes: 2
