Reputation: 301
I want to copy data between two tables with SqlBulkCopy. First I defined a SELECT statement with a WHERE clause (WHERE ID IN (...)) that determines which entries should be copied.
(In this scenario, there are approximately 800,000 entries in the result set.)
My problem is that this copy process takes a long time to complete, and I want to optimize it. I suspect that the large amount of string data, stored in an nvarchar(max) column, is responsible for the long runtime.
Is there anything I can improve to optimize the execution time?
My code looks like this:
using (SqlConnection source = source_connection)
{
    source.Open();
    SqlCommand source_command = source.CreateCommand();
    source_command.CommandText = source_command_select_text;
    source_command.CommandTimeout = 1200;
    SqlDataReader reader = source_command.ExecuteReader();
    using (SqlConnection destination = destination_connection)
    {
        destination.Open();
        using (SqlBulkCopy bulkCopy = new(destination))
        {
            bulkCopy.DestinationTableName = destination_table;
            bulkCopy.BulkCopyTimeout = 1200;
            try
            {
                // Stream the rows from the source reader into the destination table.
                bulkCopy.WriteToServer(reader);
            }
            catch (Exception ex)
            {
                Log.Logger.Error(ex.ToString());
                throw;
            }
            finally
            {
                reader.Close();
            }
        }
    }
}
Upvotes: 0
Views: 1310
Reputation: 11347
Using the SqlBulkCopyOptions.TableLock option will increase performance:
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection, SqlBulkCopyOptions.TableLock))
{
    // ...code...
}
Setting a BatchSize will also increase performance. You currently try to insert all 800,000 entries at once, which is far too much. Setting it to something like 10,000 should make it faster.
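A minimal sketch combining both suggestions, reusing the reader, destination_connection, and destination_table variables from your question (the exact batch size is an assumption worth benchmarking against your own data):
// destination_connection must already be open at this point
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destination_connection, SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = destination_table;
    bulkCopy.BulkCopyTimeout = 1200;
    bulkCopy.BatchSize = 10000; // commit every 10k rows instead of all 800k in one batch

    bulkCopy.WriteToServer(reader);
}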
Upvotes: 0