Reputation: 24685
I need to insert millions of records, read from a file on disk, into SQL Server. I am parsing them from the file on one machine in a single-threaded process.
Ideally it should perform well whether the SQL Server instance is local or remote. It has to be done programmatically in C#.
Upvotes: 3
Views: 3427
Reputation: 41
It depends on your input file format: (1) if it is suited for bulk copy, use it; (2) if it is not suited for bulk copy, or you need extra processing/checks/etc. on the server side, then do use multi-threaded insert loops, commit every 1,000 rows or more, and possibly use array inserts.
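A minimal sketch of the batched-commit loop from option (2), assuming a hypothetical two-column target table dbo.MyTable; the record source and column names are placeholders:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

class BatchedInsertLoader
{
    // Inserts records one statement at a time, but commits only every
    // 1,000 rows so the log flush cost is amortized across each batch.
    public static void Load(string connectionString, IEnumerable<object[]> records)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlTransaction tx = conn.BeginTransaction();
            var cmd = new SqlCommand(
                "INSERT INTO dbo.MyTable (Id, Name) VALUES (@id, @name)", conn, tx);
            var pId = cmd.Parameters.Add("@id", SqlDbType.Int);
            var pName = cmd.Parameters.Add("@name", SqlDbType.NVarChar, 100);

            int count = 0;
            foreach (var r in records)
            {
                pId.Value = r[0];
                pName.Value = r[1];
                cmd.ExecuteNonQuery();

                if (++count % 1000 == 0)
                {
                    tx.Commit();                  // flush the log once per batch
                    tx = conn.BeginTransaction();
                    cmd.Transaction = tx;
                }
            }
            tx.Commit(); // commit the final partial batch
        }
    }
}
```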
Upvotes: 0
Reputation: 15325
Try bulk copy. You can use the SqlBulkCopy class to write a small app that reads your data into memory and loads it into SQL Server.
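A minimal sketch of that approach, assuming the rows have already been parsed into a DataTable; the destination table name and batch size are placeholder choices:

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    public static void Load(string connectionString, DataTable table)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.MyTable";
            bulkCopy.BatchSize = 10000;       // rows sent to the server per batch
            bulkCopy.BulkCopyTimeout = 0;     // no timeout for large loads
            bulkCopy.WriteToServer(table);    // push the in-memory rows to SQL Server
        }
    }
}
```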
Upvotes: 1
Reputation: 294177
The fastest way is SSIS with parallel reads, NUMA-affinitized clients, and partitioned writes, switching all the partitions into a single table at the end. This can load more than 2 TB per hour.
If you have a suitable text file, then the bulk copy utility (bcp) is probably the way to go.
If you want to insert from your process, you can either use SqlBulkCopy.WriteToServer, in which case you have to present the data as an IDataReader, or you can use straight SqlCommand inserts. With the latter, if you batch the insert commits, you'll achieve good throughput. The usual bottleneck is the log flush on single-statement commits.
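A minimal sketch of the IDataReader route, which streams parsed rows to SqlBulkCopy.WriteToServer without buffering the whole file in memory. The record shape is hypothetical, and only the members SqlBulkCopy needs for a positional (unmapped) copy are given real implementations:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Wraps a lazy sequence of rows (e.g. yielded by a file parser) as an
// IDataReader, so SqlBulkCopy can pull rows as it sends them.
class RecordReader : IDataReader
{
    private readonly IEnumerator<object[]> _rows;
    public RecordReader(IEnumerable<object[]> rows) { _rows = rows.GetEnumerator(); }

    // The members SqlBulkCopy actually uses for a positional copy:
    public int FieldCount => 2;                        // must match the target table
    public bool Read() => _rows.MoveNext();
    public object GetValue(int i) => _rows.Current[i];
    public bool IsDBNull(int i) => _rows.Current[i] == null;
    public void Dispose() => _rows.Dispose();

    // The rest of the interface is not needed here:
    public int Depth => 0;
    public bool IsClosed => false;
    public int RecordsAffected => -1;
    public void Close() { }
    public bool NextResult() => false;
    public object this[int i] => GetValue(i);
    public object this[string name] => throw new NotSupportedException();
    public DataTable GetSchemaTable() => throw new NotSupportedException();
    public string GetName(int i) => throw new NotSupportedException();
    public int GetOrdinal(string name) => throw new NotSupportedException();
    public string GetDataTypeName(int i) => throw new NotSupportedException();
    public Type GetFieldType(int i) => throw new NotSupportedException();
    public int GetValues(object[] values) => throw new NotSupportedException();
    public bool GetBoolean(int i) => throw new NotSupportedException();
    public byte GetByte(int i) => throw new NotSupportedException();
    public long GetBytes(int i, long fo, byte[] b, int o, int l) => throw new NotSupportedException();
    public char GetChar(int i) => throw new NotSupportedException();
    public long GetChars(int i, long fo, char[] b, int o, int l) => throw new NotSupportedException();
    public Guid GetGuid(int i) => throw new NotSupportedException();
    public short GetInt16(int i) => throw new NotSupportedException();
    public int GetInt32(int i) => throw new NotSupportedException();
    public long GetInt64(int i) => throw new NotSupportedException();
    public float GetFloat(int i) => throw new NotSupportedException();
    public double GetDouble(int i) => throw new NotSupportedException();
    public string GetString(int i) => throw new NotSupportedException();
    public decimal GetDecimal(int i) => throw new NotSupportedException();
    public DateTime GetDateTime(int i) => throw new NotSupportedException();
    public IDataReader GetData(int i) => throw new NotSupportedException();
}
```

Used like this, where ParseFile is a hypothetical iterator over the input file:

```csharp
using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.MyTable";
    bulk.WriteToServer(new RecordReader(ParseFile("data.txt")));
}
```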
Upvotes: 8
Reputation: 4794
The bcp utility is lightning fast and configurable. While it is a standalone utility that runs from the command line, it can certainly be invoked from C#.
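A minimal sketch of shelling out to bcp, with a placeholder database, table, data file, and server; -c requests character format and -T a trusted (Windows) connection:

```csharp
using System.Diagnostics;

class BcpRunner
{
    public static int Run()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "bcp",
            // "in" loads the data file into the table.
            Arguments = "MyDb.dbo.MyTable in data.txt -c -T -S myserver",
            UseShellExecute = false,
            RedirectStandardOutput = true
        };
        using (var proc = Process.Start(psi))
        {
            string output = proc.StandardOutput.ReadToEnd(); // bcp's row-count report
            proc.WaitForExit();
            return proc.ExitCode; // 0 indicates success
        }
    }
}
```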
Upvotes: 0