Reputation: 364
I have the following code, which inserts data into a destination table using SqlBulkCopy. This code frequently fails on the source SQL Server because of deadlocks. FYI: the table being copied can be in use (i.e., some inserts/selects may be running) while we perform the bulk copy.
Is that causing the issue, or does the "TABLOCK" hint have anything to do with it? As per my understanding, TABLOCK acquires only shared locks and shouldn't be a problem.
    using (var reader = srcConnection.ExecuteReader($"select * from [{DatabaseName}].[{schemaName}].[{tableName}]"))
    {
        const SqlBulkCopyOptions bulkCopyOptions =
            SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.FireTriggers |
            SqlBulkCopyOptions.KeepNulls |   // Do not replace nulls with defaults in the destination
            SqlBulkCopyOptions.KeepIdentity; // Use identity values from the source; do not generate identities in the destination

        using (var bcp = new SqlBulkCopy(dstConnection.ConnectionString, bulkCopyOptions))
        {
            const int threeMinutes = 60 * 3;
            bcp.BulkCopyTimeout = threeMinutes; // Timeout applies to a single batch
            bcp.BatchSize = 5000;
            bcp.DestinationTableName = $"[{DestinationDatabaseName}].[{schemaName}].[{tableName}]";
            bcp.EnableStreaming = true;

            foreach (var col in table.Columns.Cast<Column>().Where(c => !c.Computed))
            {
                bcp.ColumnMappings.Add(col.Name, col.Name);
            }

            bcp.WriteToServer(reader);
        }
    }
Upvotes: 3
Views: 8254
Reputation: 17020
The bulk insert needs to insert rows into the table, and inserting rows requires exclusive locks. The exact locks acquired will depend on the concurrency model.
If you specify the TableLock option, your process will attempt to acquire an exclusive table lock. This can definitely lead to deadlocks: for example, if your process first acquires a shared table lock, another process holds shared row locks, and both processes then try to upgrade their locks to exclusive locks.
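Beyond diagnosing the deadlocks, a common mitigation (not part of the original answer, just a sketch) is to retry the bulk copy when SQL Server chooses it as the deadlock victim. A deadlock victim surfaces in .NET as a `SqlException` with `Number == 1205`, so an idempotent copy can be wrapped in a retry loop; the wrapper name and retry counts below are hypothetical:

    using System;
    using System.Threading;
    using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older frameworks

    static class BulkCopyRetry
    {
        // Hypothetical retry wrapper: error 1205 means "chosen as deadlock victim",
        // which is safe to retry as long as the copy itself is idempotent
        // (e.g. the destination batch is rolled back on failure).
        public static void RunWithDeadlockRetry(Action bulkCopy, int maxAttempts = 3)
        {
            for (var attempt = 1; ; attempt++)
            {
                try
                {
                    bulkCopy();
                    return;
                }
                catch (SqlException ex) when (ex.Number == 1205 && attempt < maxAttempts)
                {
                    // Back off briefly so the competing transaction can finish.
                    Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
                }
            }
        }
    }

If the filter does not match (a different error, or attempts exhausted), the exception propagates unchanged.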
There are several ways to get more information about the deadlocks:
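For instance, assuming SQL Server 2008 or later, the default `system_health` Extended Events session records each deadlock, and recent deadlock graphs can be pulled from its ring buffer with a query like this:

    -- Extract recent deadlock graphs captured by the built-in system_health session.
    SELECT XEvent.query('(event/data/value/deadlock)[1]') AS DeadlockGraph
    FROM (
        SELECT CAST(st.target_data AS xml) AS TargetData
        FROM sys.dm_xe_session_targets AS st
        JOIN sys.dm_xe_sessions AS s
            ON s.address = st.event_session_address
        WHERE s.name = N'system_health'
          AND st.target_name = N'ring_buffer'
    ) AS Data
    CROSS APPLY TargetData.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]')
        AS XEventData(XEvent);

The resulting XML names the two processes, the resources they were waiting on, and which one was chosen as the victim, which should tell you whether the shared-to-exclusive lock upgrade described above is what is happening.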
Upvotes: 7