Reputation: 71
I am taking 100+ tables from a SQL Server database and writing all the data into one file. In SQL, I wrote a complex query that formats the tables into one view. In Visual Studio, I connect to SQL Server, SELECT * from the view, and use a StringBuilder to build the file.
This works if I only grab 5% of the data, but crashes with an out-of-memory exception if I try to grab 100%. The view has 10 million+ records. I am looking for advice. Thanks.
string cs4 = ConfigurationManager.ConnectionStrings["example"].ConnectionString;
StringBuilder sb4 = new StringBuilder();
using (SqlConnection con4 = new SqlConnection(cs4))
{
    string strTQuery = @"SELECT * FROM [dbo].[view]";
    SqlDataAdapter da4 = new SqlDataAdapter(strTQuery, con4);
    DataSet ds4 = new DataSet();
    da4.Fill(ds4);
    ds4.Tables[0].TableName = "Example";
    foreach (DataRow exDR in ds4.Tables["Example"].Rows)
    {
        sb4.Append(exDR["A"].ToString() + strDelimiter);
        sb4.Append(exDR["B"].ToString() + strDelimiter);
        sb4.Append(exDR["C"].ToString() + strDelimiter);
        sb4.Append(exDR["D"].ToString());
        sb4.Append("\r\n");
    }
}
StreamWriter file4 = new StreamWriter(@"\\Desktop" + todaysDate + ".csv");
file4.WriteLine(sb4.ToString());
file4.Close();
Upvotes: 5
Views: 2206
Reputation: 62213
As mentioned by D Stanley, you need to write at the same time you are reading, so the data goes straight to disk instead of being buffered in memory first and then written to disk. Something like this would work:
string cs4 = ConfigurationManager.ConnectionStrings["example"].ConnectionString;
using (SqlConnection con4 = new SqlConnection(cs4))
using (StreamWriter file4 = new StreamWriter(@"\\Desktop" + todaysDate + ".csv"))
{
    string strTQuery = @"SELECT A, B, C, D FROM [dbo].[view]";
    var command = new SqlCommand(strTQuery, con4);
    con4.Open();
    var reader = command.ExecuteReader();
    while (reader.Read())
    {
        file4.Write(reader.GetString(0) + strDelimiter);
        file4.Write(reader.GetString(1) + strDelimiter);
        file4.Write(reader.GetString(2) + strDelimiter);
        file4.Write(reader.GetString(3));
        file4.Write("\r\n");
    }
}
You should also use a SqlDataReader instead of a DataSet, since it is a forward-only cursor that retrieves rows as you iterate instead of loading the whole result set up front.
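If some columns in the view are not strings or can be NULL, GetString will throw; here is a minimal, defensive sketch of the same loop, assuming the reader, file4 and strDelimiter variables from the code above:

while (reader.Read())
{
    for (int i = 0; i < reader.FieldCount; i++)
    {
        // GetValue().ToString() tolerates non-string columns; IsDBNull guards NULL values.
        file4.Write(reader.IsDBNull(i) ? "" : reader.GetValue(i).ToString());
        file4.Write(i < reader.FieldCount - 1 ? strDelimiter : "\r\n");
    }
}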
Upvotes: 7
Reputation: 98
You're trying to store the whole file in memory, which is why you're getting the memory error. Try writing the file in chunks and appending to it instead.
Try:
string cs4 = ConfigurationManager.ConnectionStrings["example"].ConnectionString;
StringBuilder sb4 = new StringBuilder();
using (SqlConnection con4 = new SqlConnection(cs4))
{
    StreamWriter file4 = new StreamWriter(@"\\Desktop" + todaysDate + ".csv", true);
    string strTQuery = @"SELECT * FROM [dbo].[view]";
    SqlDataAdapter da4 = new SqlDataAdapter(strTQuery, con4);
    DataSet ds4 = new DataSet();
    da4.Fill(ds4);
    ds4.Tables[0].TableName = "Example";
    foreach (DataRow exDR in ds4.Tables["Example"].Rows)
    {
        sb4 = new StringBuilder();
        sb4.Append(exDR["A"].ToString() + strDelimiter);
        sb4.Append(exDR["B"].ToString() + strDelimiter);
        sb4.Append(exDR["C"].ToString() + strDelimiter);
        sb4.Append(exDR["D"].ToString());
        sb4.Append("\r\n");
        file4.Write(sb4.ToString());
    }
    file4.Close();
}
Upvotes: 0
Reputation: 7526
The problem is in this line:
file4.WriteLine(sb4.ToString());
The reason is that no single object in the CLR can be larger than 2 GB. The string you build here is exceptionally big, and building it at all (the ToString call) is not even needed. To avoid this problem, use a FileStream to write your data as you receive it.
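A minimal sketch of that idea, assuming the same connection string, view, columns (A-D), delimiter and output path variables as in the question (the StreamWriter here writes through a FileStream, so rows go to disk as they arrive):

string cs = ConfigurationManager.ConnectionStrings["example"].ConnectionString;
using (var con = new SqlConnection(cs))
using (var command = new SqlCommand("SELECT A, B, C, D FROM [dbo].[view]", con))
using (var fs = new FileStream(@"\\Desktop" + todaysDate + ".csv", FileMode.Create, FileAccess.Write))
using (var writer = new StreamWriter(fs))
{
    con.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // Each row is written immediately, so only one row is held in memory at a time.
            writer.Write(reader["A"] + strDelimiter);
            writer.Write(reader["B"] + strDelimiter);
            writer.Write(reader["C"] + strDelimiter);
            writer.WriteLine(reader["D"]);
        }
    }
}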
Upvotes: -1