Reputation: 1078
I have a data source with 1.4+ million rows in it, and growing.
We make the users add filters to cut the returned data down, but you are still looking at roughly 43,000 to 100,000 rows at a time.
Before anyone says that no one can look at that many rows anyway: they are exported to an Excel workbook for calculations based on them.
I am loading the result into the GridView from the returned CSV file as follows:
object result = URIService.data;
CSVReader csvReader = new CSVReader(result);
// Parse the CSV into a DataTable
DataTable dataTable = csvReader.CreateDataTable(true, true);
if (dataTable != null)
{
    gridView1.BeginUpdate();           // suspend redraws while binding
    gridView1.DataSource = dataTable;
    gridView1.DataBind();
    gridView1.EndUpdate();             // resume redraws
}
else
{
    return;
}
CSVReader is a CSV parser.
My question is: is this the best and most efficient way to load a large data set into a GridView?
EDIT: Would using a List for the rows, or something other than a DataTable, be better?
Upvotes: 2
Views: 11450
Reputation: 1
In the case of SQL Server, use the SqlBulkCopy class to copy large amounts of data with the highest speed.
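A minimal sketch of that approach, assuming the DataTable from the question; the connection string and the destination table name dbo.ImportedRows are placeholders, not from the original post:

using System.Data;
using System.Data.SqlClient;

static void BulkInsert(DataTable dataTable, string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.ImportedRows"; // placeholder table name
            bulkCopy.BatchSize = 5000;          // send rows to the server in batches
            bulkCopy.BulkCopyTimeout = 0;       // no timeout for large loads
            bulkCopy.WriteToServer(dataTable);  // columns must line up with the destination table
        }
    }
}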
Upvotes: 0
Reputation: 13599
I think there is only one way to load a large data set into a grid view, and it is the one you are using right now. If you want better performance, I highly recommend using pagination, so that only a chunk of data is loaded on each page and the loading time goes down. A minimal sketch follows the links below.
http://sivanandareddyg.blogspot.com/2011/11/efficient-server-side-paging-with.html
http://www.codeproject.com/Articles/125541/Effective-Paging-with-GridView-Control-in-ASP-NET
https://web.archive.org/web/20211020140032/https://www.4guysfromrolla.com/articles/031506-1.aspx
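As a minimal sketch (not taken from the articles above), this is the built-in ASP.NET GridView paging, assuming the parsed DataTable is kept around between postbacks; GetDataTable is a hypothetical helper that returns it. The linked articles go further and fetch only the requested page from the database:

<asp:GridView ID="gridView1" runat="server"
    AllowPaging="true" PageSize="100"
    OnPageIndexChanging="gridView1_PageIndexChanging" />

protected void gridView1_PageIndexChanging(object sender, GridViewPageEventArgs e)
{
    gridView1.PageIndex = e.NewPageIndex;   // switch to the requested page
    gridView1.DataSource = GetDataTable();  // hypothetical helper returning the cached DataTable
    gridView1.DataBind();                   // only PageSize rows are rendered
}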
Upvotes: 3