Laxminarayan Charan

Reputation: 79

What is the fastest way to read data from more than 3000 .CSV files into a SQL table using .NET C#?

Please assist. What is the fastest way to read data from more than 3000 .CSV files from a location into a SQL table using .NET C#?

I am using ADO.NET.
Each file is around 120 KB and contains comma-separated data.

Please assist me if anyone has an idea about this.

Upvotes: 1

Views: 190

Answers (2)

pkuderov

Reputation: 3551

  1. Separate the problems: reading files and pushing data to the DB are different concerns.
  2. Load files into a memory buffer as batches of, e.g., no more than 200k total rows.
  3. Use the SqlBulkCopy class to insert rows from your buffer into the table (in batches of, e.g., 20k rows).
  4. Use async to run steps 2 and 3 concurrently, i.e. read .csv files while pushing rows to the DB, because they use different resources (file system and network).
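The steps above can be sketched as a simple producer/consumer pipeline. This is only a sketch, not tested against a real server; the connection string, the table name `dbo.CsvData`, and the three string columns are hypothetical placeholders for your schema:

```csharp
using System.Collections.Concurrent;
using System.Data;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older frameworks

class CsvBulkLoader
{
    // Hypothetical schema: a table dbo.CsvData with three string columns.
    static DataTable NewBuffer()
    {
        var t = new DataTable();
        t.Columns.Add("Col1", typeof(string));
        t.Columns.Add("Col2", typeof(string));
        t.Columns.Add("Col3", typeof(string));
        return t;
    }

    static async Task LoadAsync(string folder, string connectionString)
    {
        // Bounded queue so the reader stays ahead of the writer
        // without buffering every file in memory at once.
        var batches = new BlockingCollection<DataTable>(boundedCapacity: 4);

        // Producer (step 2): read .csv files and fill batches of ~200k rows.
        var reader = Task.Run(() =>
        {
            var buffer = NewBuffer();
            foreach (var file in Directory.EnumerateFiles(folder, "*.csv"))
            {
                foreach (var line in File.ReadLines(file))
                {
                    var parts = line.Split(',');
                    buffer.Rows.Add(parts[0], parts[1], parts[2]);
                    if (buffer.Rows.Count >= 200_000)
                    {
                        batches.Add(buffer);
                        buffer = NewBuffer();
                    }
                }
            }
            if (buffer.Rows.Count > 0) batches.Add(buffer);
            batches.CompleteAdding();
        });

        // Consumer (steps 3 and 4): bulk-insert each batch in 20k-row chunks
        // while the producer keeps reading files.
        var writer = Task.Run(async () =>
        {
            using var bulk = new SqlBulkCopy(connectionString)
            {
                DestinationTableName = "dbo.CsvData",
                BatchSize = 20_000
            };
            foreach (var batch in batches.GetConsumingEnumerable())
                await bulk.WriteToServerAsync(batch);
        });

        await Task.WhenAll(reader, writer);
    }
}
```

Note the real CSV parsing will need to handle quoted fields; `string.Split(',')` is only good enough for plain comma-separated data like the question describes.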

Upvotes: 0

Pedro G. Dias

Reputation: 3222

Reading the files can be done in parallel, something like this:

var files = someFolder.GetFiles("*.csv"); // Get all csv files
Parallel.ForEach(files, file => {

  // insert contents into table
});

This is, however, only half the answer. Someone also needs to tell you the optimal way to do the inserts into the SQL database. You aren't stating whether you're using ADO, Entity Framework, or some other mechanism to connect to the server, which is essential both to the speed and for us to give you a good example.
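Since the question mentions ADO.NET, one way to fill in the loop body is to have each parallel task do its own `SqlBulkCopy`. A minimal sketch, assuming a hypothetical table `dbo.CsvData` with three string columns (adjust names to your schema):

```csharp
using System.Data;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

class ParallelCsvInsert
{
    static void Load(string folder, string connectionString)
    {
        Parallel.ForEach(Directory.EnumerateFiles(folder, "*.csv"), file =>
        {
            // Parse one file into an in-memory DataTable.
            var table = new DataTable();
            table.Columns.Add("Col1", typeof(string));
            table.Columns.Add("Col2", typeof(string));
            table.Columns.Add("Col3", typeof(string));
            foreach (var line in File.ReadLines(file))
            {
                var parts = line.Split(',');
                table.Rows.Add(parts[0], parts[1], parts[2]);
            }

            // Each parallel task uses its own SqlBulkCopy instance;
            // the class is not thread-safe, so don't share one across tasks.
            using var bulk = new SqlBulkCopy(connectionString)
            {
                DestinationTableName = "dbo.CsvData"
            };
            bulk.WriteToServer(table);
        });
    }
}
```

At ~120 KB per file this keeps each bulk copy small; batching several files per insert (as the other answer suggests) would reduce the number of round trips further.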

Upvotes: 4
