Reputation: 627
When I want to insert a huge amount of data into a database, which way is more efficient? 1. Insert the data row by row by calling an INSERT statement for each row. 2. Create a user-defined table type in my database, and write a stored procedure that inserts a DataTable into the database. And why?
Upvotes: 4
Views: 1539
Reputation: 1998
The most efficient approach would be a SQL bulk insert.
To improve performance further, you can run SqlBulkCopy against SQL Server in parallel.
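For illustration, a minimal SqlBulkCopy sketch; the connection string, destination table name, and batch size are assumptions you would replace with your own:

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

// Assumes a DataTable whose columns match the destination table's schema.
static void BulkInsert(DataTable table, string connectionString)
{
    using var connection = new SqlConnection(connectionString);
    connection.Open();

    using var bulkCopy = new SqlBulkCopy(connection)
    {
        DestinationTableName = "dbo.TargetTable", // hypothetical table name
        BatchSize = 10_000                        // rows per batch; tune for your workload
    };

    bulkCopy.WriteToServer(table);
}
```

To parallelize, partition the source data and run one SqlBulkCopy per partition on its own connection; measure first, since a single bulk copy is often already I/O-bound.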
Upvotes: 2
Reputation: 6574
2 is much, much faster, because anything that involves per-row operations (RBAR, "row by agonizing row") is inherently slow. I did quite a lot of testing with table-valued parameters and found them to be close to the performance of bulk inserts for batch sizes of up to 1,000,000 rows. However, that's purely anecdotal and your mileage may vary.
Also, I would advise against using a DataTable; instead, write something that streams the rows more directly. I blogged about that here.
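As a sketch of that streaming idea (the table type, stored procedure, and column names below are hypothetical), you can pass an `IEnumerable<SqlDataRecord>` as a table-valued parameter, so rows are produced lazily as the command executes instead of being materialized in a DataTable first:

```csharp
using System.Collections.Generic;
using System.Data;
using Microsoft.Data.SqlClient;
using Microsoft.Data.SqlClient.Server;

// Streams (Id, Name) pairs as a table-valued parameter without building a DataTable.
static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<(int Id, string Name)> rows)
{
    var metadata = new[]
    {
        new SqlMetaData("Id", SqlDbType.Int),
        new SqlMetaData("Name", SqlDbType.NVarChar, 100)
    };

    foreach (var row in rows)
    {
        var record = new SqlDataRecord(metadata);
        record.SetInt32(0, row.Id);
        record.SetString(1, row.Name);
        yield return record; // rows are streamed, not buffered
    }
}

static void InsertStreamed(IEnumerable<(int, string)> rows, string connectionString)
{
    using var connection = new SqlConnection(connectionString);
    connection.Open();

    // "dbo.InsertItems" and "dbo.ItemType" are made-up names for this sketch.
    using var command = new SqlCommand("dbo.InsertItems", connection)
    {
        CommandType = CommandType.StoredProcedure
    };
    var parameter = command.Parameters.AddWithValue("@Items", ToRecords(rows));
    parameter.SqlDbType = SqlDbType.Structured;
    parameter.TypeName = "dbo.ItemType";

    command.ExecuteNonQuery();
}
```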
Upvotes: 0
Reputation: 9670
This will probably vary based on the specific workload.
We can only tell you here what options are available to you - it is down to you to measure how efficient each option is in your specific case.
Options:
- SqlBulkCopy
- INSERT statements with a single row
- INSERT statements with multiple rows (experiment with different numbers of rows)

Upvotes: 0
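To illustrate the multi-row option, a sketch that batches several rows into one parameterized INSERT (the table and column names are made up):

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.Data.SqlClient;

// Inserts rows in batches of `batchSize` using one parameterized
// multi-row INSERT per batch. Table/column names are hypothetical.
static void InsertMultiRow(IReadOnlyList<(int Id, string Name)> rows,
                           string connectionString, int batchSize = 100)
{
    using var connection = new SqlConnection(connectionString);
    connection.Open();

    for (int start = 0; start < rows.Count; start += batchSize)
    {
        var batch = rows.Skip(start).Take(batchSize).ToList();
        var values = string.Join(", ",
            batch.Select((_, i) => $"(@id{i}, @name{i})"));

        using var command = new SqlCommand(
            $"INSERT INTO dbo.TargetTable (Id, Name) VALUES {values}", connection);

        for (int i = 0; i < batch.Count; i++)
        {
            command.Parameters.AddWithValue($"@id{i}", batch[i].Id);
            command.Parameters.AddWithValue($"@name{i}", batch[i].Name);
        }

        command.ExecuteNonQuery();
    }
}
```

Note that SQL Server caps a single INSERT at 1,000 row value constructors and a command at 2,100 parameters, so keep the batch size well below those limits when experimenting.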