I'm trying to use SqlBulkCopy to insert larger amounts of data quickly. To do that, I populate a DataTable with the data I want to insert (up to about 100,000 rows) and call the WriteToServer method. Doing so either causes a timeout or, with the BulkCopyTimeout property set to 0, simply does nothing.
This is the content of the method that prepares the data and sends it off to the server:
using var bulkcopy = new SqlBulkCopy(ConnectionString, SqlBulkCopyOptions.TableLock)
{
    DestinationTableName = $"[{databaseConfiguration.Schema}].[{DatabaseUtil.Convert(source.Name)}]",
    BulkCopyTimeout = 0, // 0 = wait indefinitely
    BatchSize = 100,
    NotifyAfter = 50
};
bulkcopy.SqlRowsCopied += (_, _) => Log.Info("Inserted 50 rows");

// Build the DataTable and the column mappings from the source definition.
var dataTable = new DataTable(bulkcopy.DestinationTableName);
foreach (var column in source.Columns)
{
    var propertyName = DatabaseUtil.ConvertPropertyName(column.Name);
    bulkcopy.ColumnMappings.Add(propertyName, propertyName);
    // ...
    // add column to dataTable with proper type
    // ...
}

// Copy the source rows into the DataTable.
foreach (var content in contents)
{
    var values = content.Items.Select(x => x.Data);
    var dataRow = dataTable.NewRow();
    dataRow.ItemArray = values.ToArray();
    dataTable.Rows.Add(dataRow);
}

bulkcopy.WriteToServer(dataTable); // times out or hangs here
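As a sanity check, a stripped-down repro along these lines isolates the copy from the rest of the pipeline (the table name and single-column layout are placeholders, not the real schema):

using var connection = new SqlConnection(ConnectionString);
connection.Open(); // fails fast here if connectivity itself is the problem

using var copy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null)
{
    DestinationTableName = "[dbo].[TestTable]", // hypothetical single-column table
    BulkCopyTimeout = 30 // finite timeout so a hang surfaces as an exception
};

var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Rows.Add(1);
copy.WriteToServer(table); // copies a single row, nothing else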
The target server is SQL Server 2019 running in a Docker container on a Linux machine. Testing this with SQL Server 2022 on Windows resulted in the same outcome.
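While WriteToServer hangs, a quick way to check whether the session is blocked on the server side is a sketch like this (run over the same ConnectionString; purely illustrative):

using var conn = new SqlConnection(ConnectionString);
conn.Open();
using var cmd = new SqlCommand(
    "SELECT session_id, blocking_session_id, wait_type " +
    "FROM sys.dm_exec_requests WHERE blocking_session_id <> 0", conn);
using var reader = cmd.ExecuteReader();
while (reader.Read())
{
    var waitType = reader.IsDBNull(2) ? "none" : reader.GetString(2);
    Log.Info($"Session {reader.GetInt16(0)} blocked by {reader.GetInt16(1)}, wait type {waitType}");
}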
Attaching a handler to the SqlRowsCopied event of the SqlBulkCopy object and setting the NotifyAfter property to 1 reveals that it never copies even a single row of the given table.
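Concretely, that diagnostic wiring looks like this (reusing the bulkcopy instance from above):

bulkcopy.NotifyAfter = 1;
bulkcopy.SqlRowsCopied += (_, e) => Log.Info($"Copied {e.RowsCopied} rows so far"); // never fires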
Experimenting with different values for BatchSize, as well as shrinking the source data to fewer than 10 rows, changed nothing.
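For completeness, switching to the asynchronous overload with a CancellationToken should at least turn the silent hang into an observable cancellation (a sketch; the two-minute budget is an arbitrary assumption):

using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(2)); // assumed time budget
try
{
    await bulkcopy.WriteToServerAsync(dataTable, cts.Token);
}
catch (OperationCanceledException)
{
    Log.Info("Bulk copy made no progress within two minutes");
}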