Alex D

Reputation: 739

Speed up update of 185k rows in SQL Server 2008?

I have a binary file with about 185k rows in it. My C# code parses the file in seconds. What would be the best way to update an MSSQL table with that data?

What I've tried:

  1. Easiest way - read a binary row, parse it, update the table. The whole process takes around 2 days to update all the data.
  2. Combine 200 update queries and send them at once to SQL Server. In this case, the update takes 8 to 10 hrs.
  3. Combine 500+ queries into a single batch. Works faster, but throws timeout exceptions from time to time, so some updates don't go through.

Any advice on how to speed up the update process?

Upvotes: 4

Views: 3810

Answers (3)

user166390

Reputation:

Use SqlBulkCopy (to a temporary table) followed by MERGE.

The bulk-copy method can efficiently transfer data using a "push" from a .NET client. Using BULK INSERT instead requires the server to "pull" the data from a file it can access locally.

Then the MERGE command (in SQL Server 2008+) can be used to insert/update/upsert the data from the temporary table into the target table according to the desired rules. Since the data is entirely in the database at this point, this operation will be as fast as possible. A single MERGE may also offer performance advantages over many individual commands, even those within the same transaction.
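A minimal sketch of the two-step approach, assuming an illustrative target table `dbo.Items(Id INT PRIMARY KEY, Value NVARCHAR(100))` and a `DataTable` called `rows` already filled by the C# parser (none of these names are from the question):

```csharp
using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1. Create a staging table with the same shape as the target.
    using (var cmd = new SqlCommand(
        "CREATE TABLE #Staging (Id INT PRIMARY KEY, Value NVARCHAR(100));", conn))
    {
        cmd.ExecuteNonQuery();
    }

    // 2. Push all parsed rows in one streamed bulk copy instead of 185k round trips.
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "#Staging";
        bulk.BatchSize = 10000;
        bulk.WriteToServer(rows);
    }

    // 3. Upsert from staging into the target in a single set-based statement.
    using (var cmd = new SqlCommand(@"
        MERGE dbo.Items AS target
        USING #Staging AS source ON target.Id = source.Id
        WHEN MATCHED THEN UPDATE SET target.Value = source.Value
        WHEN NOT MATCHED THEN INSERT (Id, Value) VALUES (source.Id, source.Value);", conn))
    {
        cmd.CommandTimeout = 300; // a MERGE over 185k rows can exceed the 30 s default
        cmd.ExecuteNonQuery();
    }
}
```

Note the temp table (`#Staging`) lives on the same connection as the bulk copy and the MERGE, which is why one `SqlConnection` is reused for all three steps.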

Upvotes: 1

pkExec

Reputation: 2076

I would try a table-valued parameter:

http://www.codeproject.com/Articles/22392/SQL-Server-2008-Table-Valued-Parameters
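A rough sketch of the TVP approach, assuming an illustrative target table `dbo.Items(Id INT PRIMARY KEY, Value NVARCHAR(100))`, a matching server-side table type created once with `CREATE TYPE dbo.ItemRow AS TABLE (Id INT PRIMARY KEY, Value NVARCHAR(100));`, and a `DataTable` called `rows` shaped like that type (all names are hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(@"
    UPDATE t
    SET    t.Value = r.Value
    FROM   dbo.Items AS t
    JOIN   @rows AS r ON r.Id = t.Id;", conn))
{
    // All 185k rows travel to the server in one structured parameter,
    // so the update is a single round trip and a single set-based statement.
    var p = cmd.Parameters.AddWithValue("@rows", rows);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.ItemRow";

    conn.Open();
    cmd.ExecuteNonQuery();
}
```

Compared with bulk copy plus MERGE, this avoids the explicit staging table at the cost of a one-time `CREATE TYPE` on the server.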

Upvotes: 1

Femi

Reputation: 64700

I'm not sure you really want to do it via C#: you probably want to use BULK INSERT and give it a file with your data properly formatted.
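A minimal sketch of that approach, assuming the C# parser first dumps the rows to a delimited file and that an illustrative staging table `dbo.ItemsStaging` matches its columns; the path, table, and terminators are hypothetical, and the file must be readable by the SQL Server service account (a server-side path or UNC share):

```sql
BULK INSERT dbo.ItemsStaging
FROM 'C:\data\items.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK,           -- table lock enables minimally logged inserts
    BATCHSIZE = 50000  -- commit in chunks to keep the log manageable
);
```

Since BULK INSERT only inserts, you would still update the real table from the staging table afterwards with a set-based UPDATE or MERGE.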

Upvotes: 2
