user4432964


Increase SQL Bulk Copy speed on Azure

I am working on a project where I have to move an on-premises application to Azure. We have an upload utility that transfers about 150,000 records to the web app (an MVC app). Unfortunately, I was getting timeout issues after I migrated to Azure. I made several changes, including using SqlBulkCopy and stored procedures instead of SqlCommand. The timeout issue has now been resolved, but the upload takes about 5 minutes to insert the 150,000 records into a table on Azure.

I am using a trial version of Azure, and my database is at 20 DTUs. I would love to keep it at 20 because of the cost; I have a small budget that I am working with. Note that database size isn't a problem. I am well below the quota.

Any suggestions on how I can decrease the time it takes to insert those 150,000 records?

Code Sample

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
{
    bulkCopy.BulkCopyTimeout = 0;   // no timeout; rely on BatchSize to keep batches bounded
    bulkCopy.BatchSize = 10000;
    bulkCopy.ColumnMappings.Add("Barcode", "Barcode");
    bulkCopy.ColumnMappings.Add("SubCategory", "SubCategory");
    bulkCopy.ColumnMappings.Add("ItemDescription", "ItemDescription");
    bulkCopy.ColumnMappings.Add("CreateDate", "CreateDate");
    bulkCopy.ColumnMappings.Add("RevisedDate", "RevisedDate");

    bulkCopy.DestinationTableName = "Items";
    try
    {
        bulkCopy.WriteToServer(dtTblData);
        destinationConnection.Close();
    }
    catch (Exception ex)
    {
        this.Logs.Add(DateTime.Now.ToString() + ": " + ex.Message);
    }
}

FYI: during the insert operation, the DTU usage for my database reaches 100%.

Upvotes: 0

Views: 1532

Answers (2)

Scott Chamberlain

Reputation: 127563

Besides the table locking Jonathan mentioned, the only real way to increase performance is to increase the DTUs for the service.

However, you don't need to leave the database at the higher setting forever. If this bulk load is an infrequent operation, you could temporarily raise the DTUs of the database, do your load, then lower the DTUs back down. You would only be billed at the higher rate for the time you were actually uploading.

You can change the database tier in code using the Azure SDK: call the functions in the Microsoft.Azure.Management.Sql.DatabasesOperationsExtensions class and set the RequestedServiceObjectiveId value to a higher tier objective on the Database object you pass to the update function. (The 20 DTUs you are on now is the S1 objective; you could move up to S2 (50 DTUs) during the bulk load.)
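If you'd rather not pull in the management SDK, the same scale operation can also be issued with plain T-SQL against the server's master database. Below is a minimal sketch of that alternative (not code from the answer above); the connection string, database name, and the "S1"/"S2" tier names are placeholders you would swap for your own:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    // Hypothetical helper: scale an Azure SQL database to a new service
    // objective via T-SQL, then wait for the change to take effect.
    static void ScaleDatabase(string masterConnectionString, string dbName, string objective)
    {
        using (var conn = new SqlConnection(masterConnectionString))
        {
            conn.Open();

            // DDL cannot be parameterized, so dbName must come from a
            // trusted source. ALTER DATABASE returns immediately; the
            // scale operation completes asynchronously in the background.
            using (var cmd = new SqlCommand(
                $"ALTER DATABASE [{dbName}] MODIFY (SERVICE_OBJECTIVE = '{objective}');", conn))
            {
                cmd.ExecuteNonQuery();
            }

            // Poll until the new objective is reported as active.
            using (var check = new SqlCommand(
                $"SELECT DATABASEPROPERTYEX('{dbName}', 'ServiceObjective');", conn))
            {
                while (!objective.Equals(check.ExecuteScalar() as string,
                                         StringComparison.OrdinalIgnoreCase))
                {
                    Thread.Sleep(5000);
                }
            }
        }
    }

Usage would be: ScaleDatabase(master, "MyDb", "S2"), run the bulk load, then ScaleDatabase(master, "MyDb", "S1"). Note that scaling drops existing connections when the new tier comes online, so scale up before opening the bulk-copy connection and scale back down only after the load finishes.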

Upvotes: 0

Jonathan Magnan

Reputation: 11337

Using the SqlBulkCopyOptions.TableLock option will improve performance.

So if you can afford to lock the table during the load, you should without a doubt use it.

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection, SqlBulkCopyOptions.TableLock))
{
    // ...code...
}

Beyond this option, there is not much more you can do, since you are already using SqlBulkCopy. The bottleneck is your database performance, which you cannot upgrade because of the budget. A full sketch of the option applied to the question's code is shown below.
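For concreteness, here is how the option slots into the existing block (a sketch reusing the question's table name, mappings, and dtTblData DataTable, not new API surface):

    // TableLock holds a table-level lock for the duration of the load,
    // avoiding per-row locking overhead but blocking other writers.
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection, SqlBulkCopyOptions.TableLock))
    {
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.BatchSize = 10000;
        bulkCopy.DestinationTableName = "Items";
        bulkCopy.ColumnMappings.Add("Barcode", "Barcode");
        bulkCopy.ColumnMappings.Add("SubCategory", "SubCategory");
        bulkCopy.ColumnMappings.Add("ItemDescription", "ItemDescription");
        bulkCopy.ColumnMappings.Add("CreateDate", "CreateDate");
        bulkCopy.ColumnMappings.Add("RevisedDate", "RevisedDate");

        bulkCopy.WriteToServer(dtTblData);
    }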

Upvotes: 1
