shaolinmonk

Reputation: 31

How to most efficiently insert/update a few million rows in SQL Server?

I have a program that reads through a text file and updates/inserts rows in a SQL Server database. I've tried the approaches sketched in pseudocode below; all of them are extremely slow. I'm running this code directly on the SQL Server itself, and the processes still take many hours to complete...

The number of update/insert statements is somewhere in the millions. What is the most efficient way to run that many SQL statements from .NET/C#?

// Insert/update each row while reading the text file
while (reader.Read())
{
   sql.ExecuteNonQuery();
}

Or...

// Build a list of statements to loop through and execute later
while (reader.Read())
{
   statements.Add(sqlText);
}

foreach (string s in statements)
{
   command.CommandText = s;
   command.ExecuteNonQuery();
}

Or...

// Build a list and run 1000 statements at a time
while (reader.Read())
{
   statements.Add(sqlText);
   if (statements.Count == 1000)
   {
      // execute the batch as a single command, then clear the list
      command.CommandText = string.Join(";", statements);
      command.ExecuteNonQuery();
      statements.Clear();
   }
}

Upvotes: 1

Views: 1320

Answers (1)

Habib

Reputation: 223402

Use the SqlBulkCopy class.

Lets you efficiently bulk load a SQL Server table with data from another source.

Microsoft SQL Server includes a popular command-prompt utility named bcp for moving data from one table to another, whether on a single server or between servers. The SqlBulkCopy class lets you write managed code solutions that provide similar functionality.

There are other ways to load data into a SQL Server table (INSERT statements, for example), but SqlBulkCopy offers a significant performance advantage over them. The SqlBulkCopy class can be used to write data only to SQL Server tables. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded to a DataTable instance or read with an IDataReader instance.
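For what it's worth, here's a minimal sketch of that approach: read the file into a DataTable and hand it to SqlBulkCopy. The connection string, the tab-delimited file format, and the destination table dbo.TargetTable with Id/Name columns are all assumptions; adjust them to your schema.

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    static void Main()
    {
        // Assumed connection string and tab-delimited input file; adjust for your environment.
        var connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        // Build a DataTable that mirrors the destination table's columns
        // (dbo.TargetTable with Id and Name is assumed here).
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));

        foreach (var line in File.ReadLines("input.txt"))
        {
            var parts = line.Split('\t');
            table.Rows.Add(int.Parse(parts[0]), parts[1]);
        }

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.TargetTable";
                bulkCopy.BatchSize = 10000;    // rows sent to the server per batch
                bulkCopy.BulkCopyTimeout = 0;  // disable the timeout for large loads
                bulkCopy.ColumnMappings.Add("Id", "Id");
                bulkCopy.ColumnMappings.Add("Name", "Name");
                bulkCopy.WriteToServer(table);
            }
        }
    }
}

Since your workload includes updates as well as inserts, a common pattern is to bulk copy into an empty staging table and then run a single set-based UPDATE or MERGE from staging into the target, which is still far faster than millions of individual statements.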

Upvotes: 5
