Reputation: 589
I have nearly 7 billion rows of data in memory (List&lt;T&gt; and SortedList&lt;T,T&gt;) in C#. I want to insert this data into tables in SQL Server. To do this, I define a separate SqlConnection for each collection and disable connection pooling (Pooling=false in the connection string).
First, I tried to insert the data in connected mode (ExecuteNonQuery). Even though I used Parallel.Invoke to call the insert methods for the different collections concurrently, it is too slow and has still not finished (I couldn't see any difference between sequential and concurrent inserts).
I also tried SqlBulkCopy. To fill a DataTable, I read all the data from the collections once and add it to the table. In this case I set BatchSize = 10000 and BulkCopyTimeout = 0 on the SqlBulkCopy. But this is also very slow.
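For reference, a minimal sketch of that SqlBulkCopy setup (the table name, column names, and connection string are placeholders, not from the question):

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

// Hypothetical target table dbo.Readings(Id INT, Value FLOAT).
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Value", typeof(double));
// ... fill the DataTable from the in-memory collections ...

using var connection = new SqlConnection(connectionString);
connection.Open();

using var bulk = new SqlBulkCopy(connection)
{
    DestinationTableName = "dbo.Readings",
    BatchSize = 10000,   // the value from the question
    BulkCopyTimeout = 0  // no timeout
};
bulk.WriteToServer(table);
```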
How can I insert a huge amount of data into SQL Server fast?
Upvotes: 2
Views: 520
Reputation: 171236
You already discovered SqlBulkCopy but you say it is slow. This can be because of two reasons:
- The batches are too small. A BatchSize of 10000 is not much for billions of rows; try much bigger batches.
- Materializing everything in a DataTable first adds overhead. You can stream rows into WriteToServer (or just use bigger DataTables).

You can't go faster than with bulk-import, though.
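A sketch of the chunked variant, with hypothetical table and column names: fill a large DataTable, write it, clear it, and repeat, so the full data set is never materialized at once:

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

const int chunkSize = 500_000;  // much bigger than 10000; tune empirically

var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Value", typeof(double));

using var connection = new SqlConnection(connectionString);
connection.Open();

using var bulk = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null)
{
    DestinationTableName = "dbo.Readings",
    BulkCopyTimeout = 0,
    EnableStreaming = true
};

foreach (var (id, value) in source)   // source: the in-memory collection
{
    table.Rows.Add(id, value);
    if (table.Rows.Count == chunkSize)
    {
        bulk.WriteToServer(table);
        table.Clear();                // reuse the same DataTable
    }
}
if (table.Rows.Count > 0)
    bulk.WriteToServer(table);        // flush the remainder
```

TableLock takes a bulk-update lock on the target table instead of row locks, which typically speeds up large imports.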
Upvotes: 1
Reputation: 4622
Look for 'BULK INSERT'. The technique is available in various RDBMSs. Basically, you create a text file with one line per record and tell the server to consume that file. This is the fastest approach I can think of; I import 50 million rows in a couple of seconds that way.
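A sketch of that approach against SQL Server (the file path and table name are placeholders; the file must be readable by the server process): dump the rows to a delimited text file, then hand the file to the server with a BULK INSERT statement:

```csharp
using System.IO;
using Microsoft.Data.SqlClient;

// 1. Write one line per record, tab-separated.
using (var writer = new StreamWriter(@"C:\data\readings.tsv"))
{
    foreach (var (id, value) in source)   // source: the in-memory collection
        writer.WriteLine($"{id}\t{value}");
}

// 2. Tell SQL Server to consume the file.
using var connection = new SqlConnection(connectionString);
connection.Open();

using var cmd = new SqlCommand(@"
    BULK INSERT dbo.Readings
    FROM 'C:\data\readings.tsv'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK)", connection)
{ CommandTimeout = 0 };
cmd.ExecuteNonQuery();
```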
Upvotes: 1