Reputation: 1434
I have a tool which inserts/updates thousands of records per hour in a database. It reads the input into a DataTable
and runs a SQL command row by row:
for (int i = 0; i < dataTable.Rows.Count; i++)
{
    string sqlConnectionString = "server, db, user, pass, etc.";
    SqlConnection sqlDBConnection = new SqlConnection(sqlConnectionString);
    string query = @"INSERT INTO table
                     (col1, col2, col3, etc.)
                     VALUES
                     (@col1, @col2, @col3, etc.)";
    SqlCommand queryCommand = new SqlCommand(query, sqlDBConnection);
    queryCommand.Parameters.Add("@col1", SqlDbType.Int);
    queryCommand.Parameters["@col1"].Value = Convert.ToInt32(dataTable.Rows[i][0]);
    queryCommand.Parameters.Add("@col2", SqlDbType.VarChar);
    queryCommand.Parameters["@col2"].Value = dataTable.Rows[i][1].ToString();
    queryCommand.Parameters.Add("@col3", SqlDbType.VarChar);
    queryCommand.Parameters["@col3"].Value = dataTable.Rows[i][2].ToString();
    sqlDBConnection.Open();
    queryCommand.ExecuteNonQuery();
    sqlDBConnection.Close();
}
It's working perfectly, but it is slow. Do you have a quicker solution?
Upvotes: 0
Views: 104
Reputation: 69759
Since your data is already in a DataTable, I think the best way is to use SqlBulkCopy, e.g.
string sqlConnectionString = "server, db, user, pass, etc.";
using (var bulkCopy = new SqlBulkCopy(sqlConnectionString))
{
    bulkCopy.DestinationTableName = "table";
    bulkCopy.ColumnMappings.Add("Col1", "Col1");
    bulkCopy.ColumnMappings.Add("Col2", "Col2");
    bulkCopy.ColumnMappings.Add("Col3", "Col3");
    bulkCopy.WriteToServer(dataTable);
}
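If you need more control over the transfer, SqlBulkCopy also exposes tuning knobs; a minimal sketch, assuming the same connection string and destination table as above (the batch size and timeout values are arbitrary examples):

```csharp
string sqlConnectionString = "server, db, user, pass, etc.";
// TableLock takes a bulk update lock for the duration of the copy,
// which usually improves throughput for large loads.
using (var bulkCopy = new SqlBulkCopy(sqlConnectionString, SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = "table";
    bulkCopy.BatchSize = 5000;       // rows sent to the server per batch
    bulkCopy.BulkCopyTimeout = 60;   // seconds before the operation times out
    bulkCopy.ColumnMappings.Add("Col1", "Col1");
    bulkCopy.ColumnMappings.Add("Col2", "Col2");
    bulkCopy.ColumnMappings.Add("Col3", "Col3");
    bulkCopy.WriteToServer(dataTable);
}
```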
For updating existing records: if you have filled your DataTable using an SqlDataAdapter, you can use the SqlDataAdapter.Update()
method. If not, then I would recommend handling the upsert in SQL. Since you are using SQL Server 2012 you can use a Table-Valued Parameter. The first step is to create your table type:
CREATE TYPE dbo.YourTableType AS TABLE
(
    Col1 INT,
    Col2 INT,
    Col3 INT
);
This should have the same definition as your C# DataTable.
The next step is to create a stored procedure that accepts this type as a parameter and uses MERGE
to insert or update records depending on whether or not there is a match:
CREATE PROCEDURE dbo.UpsertYourTable @Table dbo.YourTableType READONLY
AS
BEGIN
    MERGE dbo.YourTable WITH (HOLDLOCK) AS t
    USING @Table AS s
        ON s.Col1 = t.Col1 -- OR HOWEVER YOU IDENTIFY EXISTING RECORDS
    WHEN MATCHED THEN UPDATE
        SET Col2 = s.Col2,
            Col3 = s.Col3
    WHEN NOT MATCHED THEN
        INSERT (Col1, Col2, Col3)
        VALUES (s.Col1, s.Col2, s.Col3);
END;
GO
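Before wiring this into C#, you can sanity-check the procedure from T-SQL; a quick sketch with made-up sample values:

```sql
DECLARE @t dbo.YourTableType;

INSERT INTO @t (Col1, Col2, Col3)
VALUES (1, 10, 100),
       (2, 20, 200);

-- Running this twice should insert on the first call and update on the second.
EXEC dbo.UpsertYourTable @Table = @t;
```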
Finally, to call this from C# you would use:
string sqlConnectionString = "server, db, user, pass, etc.";
using (var connection = new SqlConnection(sqlConnectionString))
using (var command = new SqlCommand("dbo.UpsertYourTable", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    var tvp = new SqlParameter("@Table", SqlDbType.Structured);
    tvp.Value = dataTable;
    tvp.TypeName = "dbo.YourTableType";
    command.Parameters.Add(tvp);
    connection.Open();
    command.ExecuteNonQuery();
}
Upvotes: 1
Reputation: 2230
Have you looked into doing this with SSIS, or via BULK INSERT?
Regardless of which option you choose, you want to minimise the number of connections going to the db - opening a connection several thousand times is going to be expensive.
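For reference, a minimal BULK INSERT sketch; the table name, file path, and delimiters here are assumptions, and the file must be readable from the server:

```sql
BULK INSERT dbo.YourTable
FROM 'C:\data\input.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the file
    ROWTERMINATOR   = '\n',  -- row delimiter
    FIRSTROW        = 2      -- skip a header row, if present
);
```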
Upvotes: 0
Reputation: 619
This is slow because every insert is treated on its own, which adds transaction overhead per row. Try to do a bulk insert instead, whether through a framework (like spring-data batching) or just by changing your query to insert multiple rows in one shot:
insert into table (col1, col2, col3, ...)
values (val1, val2, val3, ...),
       (val1, val2, val3, ...),
       (val1, val2, val3, ...), ...
But don't concatenate all of them into a single statement, to avoid exceeding the buffer size of the db connection.
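In C# that batching might be sketched as follows; the table and column names are assumptions, and the batch size of 500 rows (3 parameters each) stays safely under SQL Server's limit of roughly 2100 parameters per command:

```csharp
const int batchSize = 500;
string sqlConnectionString = "server, db, user, pass, etc.";
using (var connection = new SqlConnection(sqlConnectionString))
{
    connection.Open();
    for (int start = 0; start < dataTable.Rows.Count; start += batchSize)
    {
        int count = Math.Min(batchSize, dataTable.Rows.Count - start);
        var sb = new StringBuilder("INSERT INTO table (col1, col2, col3) VALUES ");
        using (var command = new SqlCommand { Connection = connection })
        {
            for (int i = 0; i < count; i++)
            {
                // One parameterised value tuple per row: (@c1_0, @c2_0, @c3_0), ...
                sb.Append(i == 0 ? "" : ", ")
                  .AppendFormat("(@c1_{0}, @c2_{0}, @c3_{0})", i);
                DataRow row = dataTable.Rows[start + i];
                command.Parameters.AddWithValue("@c1_" + i, row[0]);
                command.Parameters.AddWithValue("@c2_" + i, row[1]);
                command.Parameters.AddWithValue("@c3_" + i, row[2]);
            }
            command.CommandText = sb.ToString();
            command.ExecuteNonQuery();
        }
    }
}
```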
Upvotes: 0
Reputation: 15677
Use either a bulk insert with the SqlBulkCopy class, or Table-Valued Parameters in C#.
That should speed things up.
Upvotes: 1