Reputation: 1364
I've wrapped a SqlDataReader as an IEnumerable using a yield statement. I'd like to use this to dump to a file. I'm seeing some pretty heavy memory utilization. I was wondering if anyone had any ideas on how to do this with minimal or fixed memory utilization. I don't mind specifying a buffer; I'd just like to know what it'll be before I unleash this on an unsuspecting server.
I've been using something like the following:
class Program
{
    static void Main(string[] args)
    {
        using (var fs = File.Create("c:\\somefile.txt"))
        using (var sw = new StreamWriter(fs))
        {
            foreach (var asdf in Enumerable.Range(0, 500000000))
            {
                sw.WriteLine("adsfadsf"); // Data from Reader
            }
        }
    }
}
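Since you mention not minding a fixed buffer: StreamWriter has a constructor overload that takes an explicit buffer size, which caps how much the writer itself holds before flushing to the stream. A sketch (the 64 KB value is just an illustrative choice, not a recommendation):

```csharp
using System.IO;
using System.Text;

class Program
{
    static void Main()
    {
        // StreamWriter(Stream, Encoding, int bufferSize) bounds the writer's
        // internal character buffer; 65536 bytes here is an arbitrary example.
        using (var fs = File.Create("c:\\somefile.txt"))
        using (var sw = new StreamWriter(fs, Encoding.UTF8, 65536))
        {
            for (int i = 0; i < 1000; i++)
            {
                sw.WriteLine("adsfadsf");
            }
        } // Dispose flushes the remaining buffer and closes the file.
    }
}
```

Note this only bounds the writer's buffer; the FileStream underneath has its own (also configurable) buffer.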
string commandText = @"SELECT name FROM {0} WHERE name NOT LIKE '%@%'";
SqlCommand sqlCommand = new SqlCommand(string.Format(commandText, list.TableName.SQLEncapsulate()),
                                       _connection);
using (SqlDataReader sqlDataReader = sqlCommand.ExecuteReader())
{
    while (sqlDataReader.Read())
    {
        yield return sqlDataReader["name"].ToString();
    }
}
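For reference, the whole iterator can own the command and reader so both are disposed when enumeration finishes or is abandoned; each yielded string is then the only live copy of that row. A sketch, with the query shape taken from your snippet and the method signature assumed:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

static IEnumerable<string> ReadNames(SqlConnection connection, string tableName)
{
    // The reader streams rows one at a time; nothing else buffers the result set.
    string commandText = string.Format(
        "SELECT name FROM {0} WHERE name NOT LIKE '%@%'", tableName);
    using (SqlCommand command = new SqlCommand(commandText, connection))
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            yield return reader["name"].ToString();
        }
    }
}
```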
Upvotes: 0
Views: 1586
Reputation: 700222
Some heavy memory throughput is not a problem in itself, and it's unavoidable when you process a lot of data.
The data that you read is allocated as new objects on the heap, but they are short-lived objects: you read the data, write it, then throw it away.
The memory management in .NET doesn't try to keep the memory usage as low as possible, as having a lot of unused memory doesn't make the computer faster. When you create and release objects, they will just be abandoned on the heap for a while, and the garbage collector cleans them up eventually.
It's normal for a .NET application to use a lot of memory when you are doing some heavy data processing, and the memory usage will drop again after a while. If there is some other part of the system that needs the memory, the garbage collector will do a more aggressive collection to free up as much as possible.
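You can see this directly: GC.GetTotalMemory(true) forces a collection before reporting, so comparing it with GC.GetTotalMemory(false) shows how much of the apparent usage was just uncollected garbage. A small illustration:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Allocate a lot of short-lived objects, like rows read and written.
        for (int i = 0; i < 1000000; i++)
        {
            var s = new string('x', 100); // becomes garbage immediately
        }

        long before = GC.GetTotalMemory(false); // may include dead objects
        long after = GC.GetTotalMemory(true);   // forces a full collection first

        Console.WriteLine("Before collection: {0:N0} bytes", before);
        Console.WriteLine("After collection:  {0:N0} bytes", after);
    }
}
```

The "before" figure is what Task Manager-style monitoring roughly reflects; the "after" figure is closer to what the program actually still needs.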
Upvotes: 1