Reputation: 51
foreach (string path in mainFolders)
{
    if (!Directory.Exists(path))
        continue;
    Stack<string> pathsToCheck = new Stack<string>(Directory.GetDirectories(path));
    while (pathsToCheck.Count > 0)
    {
        try
        {
            string cPath = pathsToCheck.Pop();
            string[] dir = Directory.GetDirectories(cPath);
            if (dir.Length > 0)
                foreach (string s in dir)
                    pathsToCheck.Push(s);
            else
                folderPaths.Add(cPath.Replace('\\', '/'));
        }
        catch (Exception e)
        {
            errors.Add(e.Message);
        }
    }
}
So basically what I'm trying to do is take a list of folder paths and collect all of their subdirectories. Sometimes this seems to cause memory usage upwards of 9 GB (as reported by Task Manager), and while it can be checking a huge number of folders (the most I've seen was 45,000), that should still amount to a relatively small amount of memory.
Is there something I'm missing that could be leaking that much memory? I'm doing the traversal manually because a single recursive Directory.GetDirectories() call fails outright as soon as it hits a folder it can't read. I'm using Unity and am stuck with .NET 2.0.
Upvotes: 0
Views: 439
Reputation: 9824
A pet peeve of mine is faulty exception handling. Yours can swallow fatal exceptions, which is a deadly sin of exception handling. Here are two articles I link often:
http://blogs.msdn.com/b/ericlippert/archive/2008/09/10/vexing-exceptions.aspx
http://www.codeproject.com/Articles/9538/Exception-Handling-Best-Practices-in-NET
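Applied to the loop in the question, that advice means catching only the exception types Directory.GetDirectories is documented to throw, so fatal exceptions (OutOfMemoryException, ThreadAbortException, and the like) still propagate. A sketch along those lines, .NET 2.0 compatible; the method name CollectLeafFolders is mine, not from the question:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class TraversalSketch
{
    // Same iterative traversal as in the question, but with narrow catches.
    // Unreadable folders are recorded in 'errors' instead of crashing the scan;
    // anything fatal is NOT swallowed and will surface normally.
    public static List<string> CollectLeafFolders(string root, List<string> errors)
    {
        List<string> folderPaths = new List<string>();
        Stack<string> pathsToCheck = new Stack<string>(Directory.GetDirectories(root));
        while (pathsToCheck.Count > 0)
        {
            string cPath = pathsToCheck.Pop();
            try
            {
                string[] dir = Directory.GetDirectories(cPath);
                if (dir.Length > 0)
                    foreach (string s in dir)
                        pathsToCheck.Push(s);
                else
                    folderPaths.Add(cPath.Replace('\\', '/'));
            }
            catch (UnauthorizedAccessException e) { errors.Add(e.Message); } // no read permission
            catch (IOException e) { errors.Add(e.Message); } // covers PathTooLong/DirectoryNotFound
        }
        return folderPaths;
    }

    static void Main()
    {
        List<string> errors = new List<string>();
        List<string> leaves = CollectLeafFolders(".", errors);
        Console.WriteLine(leaves.Count + " leaf folders, " + errors.Count + " unreadable");
    }
}
```

PathTooLongException and DirectoryNotFoundException both derive from IOException, so the two catch clauses cover everything the directory APIs report for an unreadable or vanished folder.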
First, you cannot do memory measurement with the Task Manager. The values it reports are less than useless: http://www.itwriting.com/dotnetmem.php
Secondly, before you try to debug a presumed memory leak, you need to understand how the GC works. In particular, it will avoid running for as long as it can, sometimes not collecting until application closure: https://social.msdn.microsoft.com/Forums/en-US/286d8c7f-87ca-46b9-9608-2b559d7dc79f/garbage-collection-pros-and-limits?forum=csharpgeneral
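If you want a number that reflects the managed heap rather than the process working set, GC.GetTotalMemory(true) forces a collection first, so uncollected garbage doesn't inflate the reading. A minimal sketch (the class and method names are made up for illustration):

```csharp
using System;

class MemoryCheckSketch
{
    // Returns roughly how many bytes of live managed objects an allocation adds.
    // Passing 'true' forces a full collection before each reading, so dead
    // objects that the GC simply hasn't bothered to collect yet are excluded.
    public static long MeasureGrowth()
    {
        long before = GC.GetTotalMemory(true);

        string[] junk = new string[100000];
        for (int i = 0; i < junk.Length; i++)
            junk[i] = "path/number/" + i;

        long after = GC.GetTotalMemory(true);
        GC.KeepAlive(junk); // keep the array live until after the second reading
        return after - before;
    }

    static void Main()
    {
        Console.WriteLine("Managed heap grew by ~" + MeasureGrowth() + " bytes");
    }
}
```

Comparing two such readings around your traversal tells you whether the managed heap is actually growing, or whether Task Manager is just showing reserved-but-unused pages.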
There are only a few possible Memory Leak scenarios in .NET:
Upvotes: 1