Reputation: 10354
I am exporting a large DataTable to an Excel (.xls) file. I call a stored procedure that returns the DataTable; it returns about 750,000 rows with 93 columns.
On the database side, running the stored procedure with the same parameters takes about 8 minutes to complete, since it is a long process.
But when I call the procedure from the front end, a "System Out Of Memory" exception is thrown while the records are being assigned to the DataTable.
When I googled it, I learned that the exception occurs because the data exceeds the server's available RAM. Exporting a small DataTable works fine.
What is an alternative way to do this? Can anyone suggest one? Below is my code:
C#:
DataTable dt = BindDatatable();
Response.ClearContent();
Response.Buffer = true;
Response.AddHeader("content-disposition", string.Format("attachment; filename={0}", "MBQ_Auto.xls"));
Response.ContentType = "application/ms-excel";
//Response.ContentType = "application/octet-stream";
//Response.ContentType = "text/tab-separated-values";
string str = string.Empty;
foreach (DataColumn dtcol in dt.Columns)
{
    Response.Write(str + dtcol.ColumnName);
    str = "\t";
}
Response.Write("\n");
int rowCount = 0;
foreach (DataRow dr in dt.Rows)
{
    str = "";
    for (int j = 0; j < dt.Columns.Count; j++)
    {
        Response.Write(str + Convert.ToString(dr[j]));
        str = "\t";
    }
    Response.Write("\n");
    if (unchecked(++rowCount % 1024 == 0))
        Response.Flush();
}
Response.End();
Upvotes: 0
Views: 3514
Reputation: 415
You should not use a DataTable for this kind of problem; a DataTable holds the entire result set in memory.
Use a DataReader instead, so you fetch one row at a time from the database and write it to the response as you read it.
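For example, here is a rough sketch of streaming the rows with a SqlDataReader instead of filling a DataTable. The connection string name ("MyDb"), the stored procedure name ("usp_GetMbqData"), and any parameters are placeholders; substitute your own.
C#:
// Requires: using System; using System.Configuration; using System.Data; using System.Data.SqlClient;
string connectionString = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString; // placeholder name

Response.ClearContent();
Response.Buffer = true;
Response.AddHeader("content-disposition", "attachment; filename=MBQ_Auto.xls");
Response.ContentType = "application/ms-excel";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("usp_GetMbqData", conn)) // placeholder procedure name
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandTimeout = 0; // the procedure can run for several minutes
    conn.Open();

    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // Write the header row from the reader's column metadata.
        string sep = string.Empty;
        for (int i = 0; i < reader.FieldCount; i++)
        {
            Response.Write(sep + reader.GetName(i));
            sep = "\t";
        }
        Response.Write("\n");

        // Write data rows one at a time; only the current row is held in memory.
        int rowCount = 0;
        while (reader.Read())
        {
            sep = string.Empty;
            for (int i = 0; i < reader.FieldCount; i++)
            {
                Response.Write(sep + Convert.ToString(reader[i]));
                sep = "\t";
            }
            Response.Write("\n");

            if (++rowCount % 1024 == 0)
                Response.Flush();
        }
    }
}
Response.End();
This keeps the same tab-separated output your current code produces, but memory usage stays roughly constant regardless of how many rows the procedure returns.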
Upvotes: 1