Reputation: 3361
Is there a performant way to load very large CSV files (several gigabytes in size) into a SQL Server 2008 database with .NET?
Upvotes: 0
Views: 3152
Reputation: 6963
Use SqlBulkCopy: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
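For context, a minimal sketch of how `SqlBulkCopy` is typically used. The connection string, table name, and columns here are placeholder assumptions, and the in-memory `DataTable` is for illustration only; a multi-gigabyte CSV should instead be streamed through an `IDataReader`, as the other answer shows:

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkLoadSketch
{
    static void Main()
    {
        // Placeholder connection string -- adjust for your environment.
        const string connectionString =
            "Server=.;Database=MyDb;Integrated Security=true";

        // Small in-memory table purely for illustration. Buffering a
        // multi-gigabyte CSV into a DataTable like this would exhaust memory.
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "example");

        using (var connection = new SqlConnection(connectionString))
        using (var bcp = new SqlBulkCopy(connection))
        {
            connection.Open();
            bcp.DestinationTableName = "dbo.TargetTable"; // hypothetical table
            bcp.WriteToServer(table); // inserts all rows via the bulk-copy API
        }
    }
}
```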
Upvotes: 0
Reputation: 1064204
I would combine this CSV reader with SqlBulkCopy; i.e.
using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true)) // true = has header row
using (var bcp = new SqlBulkCopy(connection))
{
    bcp.DestinationTableName = "TableName";
    bcp.WriteToServer(csv);
}
This uses the bulk-copy API to do the inserts, while using a fully-managed (and fast) IDataReader implementation -- crucially, one that streams the data rather than loading it all into memory at once.
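For a load of this size it can also help to tune the copy. The sketch below extends the same pattern with `BatchSize`, `BulkCopyTimeout`, `NotifyAfter`, and the `SqlRowsCopied` event, all of which are real `SqlBulkCopy` members; the specific values, the `path`/`connection` variables, and the target table name are illustrative assumptions (`CsvReader` is the LumenWorks reader linked in the answer):

```csharp
// Sketch: streaming CSV-to-SQL-Server copy with tuning options.
// Assumes `path` and `connection` are defined as in the answer above.
using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true)) // true = has header row
using (var bcp = new SqlBulkCopy(connection))
{
    bcp.DestinationTableName = "dbo.TargetTable"; // hypothetical table
    bcp.BatchSize = 10000;    // commit in batches rather than one huge transaction
    bcp.BulkCopyTimeout = 0;  // disable the timeout for a multi-gigabyte load
    bcp.NotifyAfter = 100000; // raise SqlRowsCopied every 100k rows
    bcp.SqlRowsCopied += (sender, e) =>
        Console.WriteLine("{0} rows copied", e.RowsCopied);
    bcp.WriteToServer(csv); // streams rows from the IDataReader
}
```

If the CSV column order does not match the destination table, `bcp.ColumnMappings` can map source columns to destination columns by name or ordinal.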
Upvotes: 7