Ved Prakash

Reputation: 1

Inserting large amount of data into SQL Server 2005 table

I am inserting around 2 million records into a SQL Server 2005 table. The table currently has both clustered and non-clustered indexes. I want to improve the performance of the insert into that table. Does anyone have an idea how to do this?

Upvotes: 0

Views: 1770

Answers (5)

paparazzo

Reputation: 45096

You can insert up to 1000 rows in one INSERT:

    VALUES (a, b, c), (d, f, h)

Sort the data by the primary key before the insert.

Use WITH (HOLDLOCK).
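For illustration, a minimal sketch of that multi-row syntax (table and column names are hypothetical). One caveat: the table value constructor with multiple row lists was introduced in SQL Server 2008, so it won't run on the asker's SQL Server 2005:

    -- Hypothetical names; up to 1000 row lists per statement.
    -- HOLDLOCK holds the lock for the duration of the transaction.
    INSERT INTO dbo.TargetTable WITH (HOLDLOCK) (Id, Col1, Col2)
    VALUES (1, 'a', GETDATE()),
           (2, 'b', GETDATE()),
           (3, 'c', GETDATE());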

Upvotes: -2

Alok

Reputation: 274

You should drop the indexes, insert the data, and then recreate the indexes, as sketched below.
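A minimal sketch of that approach (index, table, and column names are hypothetical):

    -- Drop each non-clustered index before the load.
    DROP INDEX IX_TargetTable_Col2 ON dbo.TargetTable;

    -- Perform the 2-million-row insert.
    INSERT INTO dbo.TargetTable (Id, Col2)
    SELECT Id, Col2 FROM dbo.SourceTable;

    -- Recreate the index once the data is in.
    CREATE NONCLUSTERED INDEX IX_TargetTable_Col2 ON dbo.TargetTable (Col2);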

Upvotes: 0

user806549

Reputation:

Unless there is a good reason you aren't using bulk insert, I'd say that is your best option, i.e. select the rows out into a format you can then bulk-insert.
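A minimal sketch of that round trip (server, file path, and table names are hypothetical):

    -- Export with bcp from a command prompt (hypothetical names):
    --   bcp "SELECT Id, Col2 FROM MyDb.dbo.SourceTable" queryout C:\load\rows.dat -n -T -S MyServer

    -- Re-import with BULK INSERT:
    BULK INSERT dbo.TargetTable
    FROM 'C:\load\rows.dat'
    WITH (DATAFILETYPE = 'native',  -- matches bcp's -n (native) export
          TABLOCK,                  -- helps qualify for minimal logging
          BATCHSIZE = 50000);       -- commit in batches to limit log growth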

By doing ordinary inserts at this volume, you are putting a huge strain on your transaction log.

If bulk insert is not an option, you might win a little by splitting the inserts into chunks, so that you don't go row by row, but don't try to insert it all in one fell swoop either; a batched loop like the one sketched below is a middle ground.
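For example, a batched loop along these lines (a sketch only; it assumes a hypothetical staging table dbo.SourceTable with a unique key column Id):

    -- Move rows in 50,000-row batches so each transaction, and its
    -- transaction log usage, stays bounded.
    DECLARE @rows INT;
    SET @rows = 1;

    WHILE @rows > 0
    BEGIN
        INSERT INTO dbo.TargetTable (Id, Col2)
        SELECT TOP (50000) s.Id, s.Col2
        FROM dbo.SourceTable AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Id = s.Id)
        ORDER BY s.Id;

        SET @rows = @@ROWCOUNT;
    END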

I've experimented a bit with this myself, but haven't had the time to get close to a conclusive answer. (I've started the question Performance for RBAR vs. set-based processing with varying transactional sizes for the same reason.)

Upvotes: 1

Oleg Dok

Reputation: 21766

  1. Drop all the indexes (including the primary key if the data to insert is not pre-sorted on the same key)
  2. Insert the data
  3. Recreate all the dropped indexes (see the sketch after this list)
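A minimal sketch of these steps, assuming (hypothetically) that the primary key is the clustered index:

    -- 1. Drop the indexes; dropping the PK constraint also drops its
    --    clustered index (all names here are hypothetical).
    ALTER TABLE dbo.TargetTable DROP CONSTRAINT PK_TargetTable;
    DROP INDEX IX_TargetTable_Col2 ON dbo.TargetTable;

    -- 2. Insert the data into what is now a heap.
    INSERT INTO dbo.TargetTable (Id, Col2)
    SELECT Id, Col2 FROM dbo.SourceTable;

    -- 3. Recreate all the dropped indexes.
    ALTER TABLE dbo.TargetTable
        ADD CONSTRAINT PK_TargetTable PRIMARY KEY CLUSTERED (Id);
    CREATE NONCLUSTERED INDEX IX_TargetTable_Col2 ON dbo.TargetTable (Col2);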

Upvotes: 2

Willem Meints

Reputation: 1182

You can try disabling the indexes on the table before inserting and re-enabling them afterwards. It can be a huge time saver when you're inserting large amounts of data into a table.

Check out this article for SQL server on how to do such a thing: http://msdn.microsoft.com/en-us/library/ms177406.aspx
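For illustration, a minimal sketch of that technique (index and table names are hypothetical). One caveat: disabling the clustered index makes the table inaccessible, so disable only the non-clustered indexes:

    -- Disable each non-clustered index before the load; repeat per index.
    -- Do NOT disable the clustered index, or the table can't be used at all.
    ALTER INDEX IX_TargetTable_Col2 ON dbo.TargetTable DISABLE;

    -- ...perform the 2-million-row insert here...

    -- Rebuilding the index re-enables it.
    ALTER INDEX IX_TargetTable_Col2 ON dbo.TargetTable REBUILD;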

Upvotes: 1
