intrigued_66

Reputation: 17220

Ways to increase performance of DataTable.Load()?

I currently use a custom CSV class from CodeProject to create a CSV object, which I then use to populate a DataTable. Under profiling this is taking more time than I would like, and I wonder if there is a more efficient way of doing it.

The CSV contains approximately 2,500 rows and 500 columns.

The CSV reader is from: http://www.codeproject.com/Articles/9258/A-Fast-CSV-Reader

StreamReader s = new StreamReader(confirmedFilePath);
CsvReader csv = new CsvReader(s, true);   // true = first row contains the headers
DataTable dt = new DataTable();
dt.Load(csv);                             // this is where the time goes under profiling

A Google search turned up one suggestion to use a DataAdapter, but it was only a single reference, and further searching didn't find anything to corroborate it.
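The only concrete thing I could piece together is an untested sketch that goes through the OLE DB text driver; the ACE provider string and the bracketed file name are guesses on my part, and I have no idea whether it would actually be faster:

using System.Data;
using System.Data.OleDb;
using System.IO;

string dir = Path.GetDirectoryName(confirmedFilePath);
string file = Path.GetFileName(confirmedFilePath);
string connStr = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + dir +
                 ";Extended Properties=\"text;HDR=Yes;FMT=Delimited\"";

DataTable dt = new DataTable();
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM [" + file + "]", conn))
{
    adapter.Fill(dt);   // the adapter opens the connection and fills the table
}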

Upvotes: 2

Views: 3196

Answers (3)

Nikola Bogdanović

Reputation: 3213

Always use BeginLoadData() and EndLoadData() when filling from a database, since the database already enforces constraints by itself; the only downside here is that a CSV file obviously does not, so any constraint exception is thrown only after the whole operation ends.

...
dt.BeginLoadData();                  // turns off notifications, index maintenance and constraints
dt.Load(csv, LoadOption.Upsert);
dt.EndLoadData();                    // constraints are checked again here, in one go

EDIT: Use LoadOption.Upsert only if the DataTable is empty, or if you don't want to preserve any previous changes to existing data; it's even faster that way.

Upvotes: 0

Anderson Pimentel

Reputation: 5757

Give GenericParser a try.
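Something along these lines should do it; this is a minimal sketch assuming the GenericParsing package's GenericParserAdapter, so check the library for the exact member names:

using System.Data;
using GenericParsing;

DataTable dt;
using (GenericParserAdapter parser = new GenericParserAdapter(confirmedFilePath))
{
    parser.FirstRowHasHeader = true;   // treat the first CSV row as the column names
    dt = parser.GetDataTable();        // parses the file and builds the DataTable in one pass
}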

Upvotes: 0

pabdulin

Reputation: 35219

CsvReader is fast and reliable; I'm almost sure you can't find anything faster (if anything faster exists at all) for reading CSV data.

The limitation comes from the DataTable processing the new data: 2,500 × 500 is about 1.25 million cells, which is quite an amount. I think the fastest way would be a direct CsvReader -> database (ADO.NET) chain.
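For example, SqlBulkCopy accepts any IDataReader, and the CodeProject CsvReader implements IDataReader, so you can stream the file straight into SQL Server without a DataTable in the middle. This is a rough sketch; the connection string and destination table name are placeholders, and the target table must already exist with matching columns:

using System.Data.SqlClient;
using System.IO;
using LumenWorks.Framework.IO.Csv;

using (StreamReader s = new StreamReader(confirmedFilePath))
using (CsvReader csv = new CsvReader(s, true))
using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.CsvImport";   // placeholder target table
    bulk.WriteToServer(csv);                       // streams rows directly from the reader
}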

Upvotes: 1
