LordRofticus

Reputation: 227

.NET SqlBulkCopy - can nullable fields be excluded?

I have a small app that uploads data from a CSV file to the cloud. This app runs on several servers, and each of these servers produces such a file.

My app incrementally imports data from this CSV file into a DataTable and uses SqlBulkCopy to bulk load it into Azure SQL.

Currently the table in Azure SQL matches the CSV file layout.

Azure SQL Table

CREATE TABLE stg.EventLog(
EventLogID BIGINT PRIMARY KEY,
EventLogDate DATETIME NOT NULL,
EventName VARCHAR(255) NOT NULL,
EventType VARCHAR(50) NOT NULL    
)

However, one server wants to modify their CSV file to contain an additional field. So I have to create the additional column and ensure it can accept a NULL value.

The update to the table will then be:

ALTER TABLE stg.EventLog ADD Comments VARCHAR(1024) NULL

My app's original sync code looks as follows:

bc = New SqlBulkCopy(cnn)

bc.DestinationTableName = "stg.EventLog"
bc.BatchSize = 10000

bc.ColumnMappings.Add("LogID", "EventLogID")
bc.ColumnMappings.Add("LogDate", "EventLogDate")
bc.ColumnMappings.Add("EventName", "EventName")
bc.ColumnMappings.Add("EventType", "EventType")

bc.WriteToServer(dt)

Now here is the tricky part. I can't update the app on the other servers until the end of February 2021 due to corporate red tape, but I have to update the one server whose CSV file structure is changing ASAP. This means I need to add the additional column mapping to the app, then build and deploy it to that server.
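For the updated server, the only code change should be one extra mapping for the new column (a sketch; I'm assuming the DataTable column is also named "Comments"):

    bc.ColumnMappings.Add("Comments", "Comments") ' new nullable column, only present in this server's CSV

The other servers would keep running the original build without this line.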

So my question is: based on the .NET code above, will the app still work on the other servers if the destination table contains a nullable column that is not mapped in SqlBulkCopy?

Please let me know if more info is needed.

Upvotes: 0

Views: 344

Answers (1)

Charlieface

Reputation: 71298

Yes, it will work. But you need to make sure you don't use SqlBulkCopyOptions.KeepNulls, and the unmapped column must have a default. In this case the column is nullable, so NULL is the default. For NOT NULL columns, you need an explicit default.

From MSDN docs for SqlBulkCopyColumnMapping class:

If the ColumnMappings collection is not empty, not every column present in the data source has to be specified. Those not mapped by the collection are ignored.


Incidentally, some tips for SqlBulkCopy:

  1. You must dispose of it with Using.
  2. Test different batch sizes; they can make a difference to performance.
  3. SqlDataRecord is faster and uses less client RAM, especially if you're streaming from an IEnumerable, but it can be more complex. If you already have a DataTable, don't bother.
  4. Consider using SqlBulkCopyOptions.TableLock.
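Putting tips 1, 2 and 4 together, a sketch of the load (assuming the question's table and a connection string variable named connectionString):

    ' Using blocks ensure the connection and bulk copy are disposed of
    Using cnn As New SqlConnection(connectionString)
        cnn.Open()
        ' TableLock takes a bulk update lock instead of row locks
        Using bc As New SqlBulkCopy(cnn, SqlBulkCopyOptions.TableLock, Nothing)
            bc.DestinationTableName = "stg.EventLog"
            bc.BatchSize = 10000 ' benchmark different sizes for your data
            bc.ColumnMappings.Add("LogID", "EventLogID")
            bc.ColumnMappings.Add("LogDate", "EventLogDate")
            bc.ColumnMappings.Add("EventName", "EventName")
            bc.ColumnMappings.Add("EventType", "EventType")
            bc.WriteToServer(dt)
        End Using
    End Using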

Upvotes: 0
