Imran Rizvi

Reputation: 7438

Should I use Table Lock while inserting data using Multithreaded SqlBulkCopy

I am inserting data into a table using SqlBulkCopy from parallel threads. Various links on the internet say TableLock is a good option to use with SqlBulkCopy.

The data is huge and arrives continuously.

Does RowLock give better performance with multithreading? I am confused: if the table lock is enabled, won't the time taken by one thread to complete its insert make the other threads wait, whereas row locking should not block the inserts at all?
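For reference, each thread runs roughly the following (the connection string and table name are placeholders); the question is whether to pass SqlBulkCopyOptions.TableLock or keep the default row-level locking:

    using System.Data;
    using System.Data.SqlClient;

    class Worker
    {
        // Runs on each of the parallel threads; chunk is one slice of the data.
        static void InsertChunk(DataTable chunk, string connectionString)
        {
            using (var bulk = new SqlBulkCopy(connectionString,
                                              SqlBulkCopyOptions.TableLock)) // or SqlBulkCopyOptions.Default?
            {
                bulk.DestinationTableName = "dbo.TargetTable"; // placeholder name
                bulk.WriteToServer(chunk);
            }
        }
    }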

Upvotes: 2

Views: 9244

Answers (1)

plyawn

Reputation: 412

You should use a table lock; SqlBulkCopy takes its own table lock that allows multiple concurrent bulk copy insert operations (the Bulk Update (BU) lock). Parallel threads will help up until you saturate network I/O. A rule of thumb (from my personal testing) is to use n consumers, where n is 2x the number of cores on the server, but experiment yourself. You most definitely do not want a row lock, as that defeats a lot of the optimizations that the bulk operation provides.
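A minimal sketch of that pattern, assuming the data is already split into DataTable chunks (the connection string, table name, and ProduceChunks helper are placeholders):

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Threading.Tasks;

    class BulkLoader
    {
        // Placeholder connection string for illustration.
        const string ConnectionString =
            "Server=.;Database=Staging;Integrated Security=true";

        static void LoadChunk(DataTable chunk)
        {
            // TableLock requests a BU lock; SQL Server grants BU locks to
            // several bulk load sessions at once on a heap.
            using (var bulk = new SqlBulkCopy(ConnectionString,
                                              SqlBulkCopyOptions.TableLock))
            {
                bulk.DestinationTableName = "dbo.TargetTable";
                bulk.WriteToServer(chunk);
            }
        }

        static void Main()
        {
            DataTable[] chunks = ProduceChunks(); // hypothetical partitioning step

            // Roughly 2x the core count, per the rule of thumb above.
            var options = new ParallelOptions
            {
                MaxDegreeOfParallelism = Environment.ProcessorCount * 2
            };
            Parallel.ForEach(chunks, options, LoadChunk);
        }

        static DataTable[] ProduceChunks()
        {
            // Stand-in for however the source data gets sliced per thread.
            return new DataTable[0];
        }
    }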

Your best bets for improving performance are:

  1. keep your batch size as large as possible - you may need to increase the operation's timeout to prevent failures on long runs (set the timeout to zero for no limit). To send the entire set in one go, set the batch size to zero. If you have a very large data set, look at streaming directly from the source into the bulk copy. All three settings appear in the sketch after this list.

  2. drop any indexes on the target table, then defragment and recreate the index once the load is done (also shown in the sketch below). Combined with a table lock, this should minimize logging overhead and improve performance. Note that you can't use TABLOCK with concurrent operations on a table that has a clustered index.
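Putting those together, a hedged sketch (the index name, key column, and source reader are hypothetical; EnableStreaming requires .NET 4.5 or later):

    using System.Data;
    using System.Data.SqlClient;

    class IndexAwareLoader
    {
        // Placeholder connection string for illustration.
        const string ConnectionString =
            "Server=.;Database=Staging;Integrated Security=true";

        static void LoadAll(IDataReader source)
        {
            using (var conn = new SqlConnection(ConnectionString))
            {
                conn.Open();

                // Hypothetical index name: drop it before the load...
                using (var drop = new SqlCommand(
                    "DROP INDEX IX_TargetTable_Key ON dbo.TargetTable", conn))
                {
                    drop.ExecuteNonQuery();
                }

                using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
                {
                    bulk.DestinationTableName = "dbo.TargetTable";
                    bulk.BatchSize = 0;          // whole set in one batch
                    bulk.BulkCopyTimeout = 0;    // no timeout on long runs
                    bulk.EnableStreaming = true; // stream from the reader rather than buffering
                    bulk.WriteToServer(source);
                }

                // ...and recreate it once the load completes.
                using (var create = new SqlCommand(
                    "CREATE INDEX IX_TargetTable_Key ON dbo.TargetTable(KeyColumn)", conn))
                {
                    create.ExecuteNonQuery();
                }
            }
        }
    }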

Upvotes: 1
