Reputation: 150
I found a strange memory leak in my application, and I can't figure out why it is happening. See the example below:
class Program
{
    static void Main(string[] args)
    {
        var dataBuilder = new Program();
        var dataBuilderWeakReference = new WeakReference(dataBuilder);
        /* tons of data being processed */
        dataBuilder = null;
        GC.Collect();
        GC.WaitForPendingFinalizers();
        if (dataBuilderWeakReference.IsAlive)
        {
            throw new Exception("Why?");
        }
    }
}
This should work, am I right? The real problem is that this code runs inside a loop, and it eventually fails with an OutOfMemoryException:
class Program
{
    static void Main(string[] args)
    {
        var dataBuilder = new Program();
        var dataBuilderWeakReference = new WeakReference(dataBuilder);
        while (true /* until there are some data */)
        {
            /* add record to data builder (tons of data) */
            if (true /* data builder is full? */)
            {
                // create new data builder to save data into it
                dataBuilder = null;
                GC.Collect();
                GC.WaitForPendingFinalizers();
                if (dataBuilderWeakReference.IsAlive)
                {
                    throw new Exception("Why?");
                }
                dataBuilder = new Program();
            }
        }
    }
}
I tried profiling the code, but the profiler only tells me the object is held by the Main function.
In the example, the Program class doesn't have any members (for the sake of simplicity), but in the real application it holds references to tons of small (managed) objects, and that's where I end up with the OOM exception.
Thank you for any kind of help.
Aram Kocharyan suggested running the code without the 'Prefer 32-bit' option. The surprising result is that my exception is not thrown, i.e. memory is released as expected. BTW: my system is Windows 7 Professional, .NET 4.5, everything up to date.
Sadly, my real application uses some mixed assemblies and has to be compiled as x86, so the 'Prefer 32-bit' option is not available to me.
OK, I did some tests, and my application works if I launch the release build outside Visual Studio. I can see in Task Manager that memory is released as expected. However, it's a mystery to me why the GC behaves so differently in Debug builds or when Visual Studio is attached. I would expect problems in Release because of optimizations, but not in Debug. Maybe the debugger keeps references to objects alive longer.
There has to be some logical explanation for this.
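For reference, moving the strong reference into a separate, non-inlined method makes the check pass even under the debugger. This is only a sketch of the workaround (the BuildData method name and the NoInlining attribute are my additions, not part of my real code):

```csharp
using System;
using System.Runtime.CompilerServices;

class Program
{
    static void Main(string[] args)
    {
        WeakReference weakRef = BuildData();
        // The strong reference never existed as a local in this frame,
        // so not even a debug-mode JIT can extend its lifetime here.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        Console.WriteLine(weakRef.IsAlive ? "still alive" : "collected");
    }

    // NoInlining guarantees the local's stack slot is really gone
    // once this method returns, even in a Debug build.
    [MethodImpl(MethodImplOptions.NoInlining)]
    static WeakReference BuildData()
    {
        var dataBuilder = new Program();
        /* tons of data being processed */
        return new WeakReference(dataBuilder);
    }
}
```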
Upvotes: 1
Views: 248
Reputation: 47540
One thing I would suggest is redesigning your application to reuse the same Program instance by resetting/clearing it.
Microsoft's Large Object Heap Improvements in .NET 4.5 article suggests using the object pool pattern for such cases.
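A minimal sketch of what that reuse could look like (the DataBuilder type, its Reset method, and the capacity are all illustrative names standing in for your real API):

```csharp
using System;
using System.Collections.Generic;

// Illustrative builder standing in for the real Program class,
// which holds references to tons of small managed objects.
class DataBuilder
{
    private readonly List<object> records = new List<object>();

    public bool IsFull => records.Count >= 100_000; // illustrative capacity

    public void Add(object record) => records.Add(record);

    public void Save() { /* persist the records somewhere */ }

    // Clear() keeps the List's backing array, so the large internal
    // array (which lands on the LOH once it grows past 85,000 bytes)
    // is reused instead of being reallocated every cycle.
    public void Reset() => records.Clear();
}
```

The loop body then becomes `builder.Save(); builder.Reset();` instead of dropping the instance and allocating a new one, so the LOH never sees a fresh large allocation per cycle.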
Even though WeakReference is a handy option, it's not quite suitable for large objects. .NET breaks the managed heap down into separate regions: the Small Object Heap (generations 0, 1, and 2) and the Large Object Heap. If either the SOH or the LOH runs out of memory, an OutOfMemoryException is thrown. The LOH is particularly tricky because it can become badly fragmented when you allocate and release objects frequently.
Even though the GCLargeObjectHeapCompactionMode.CompactOnce option is available now, it's very bad practice to invoke it repeatedly inside such a loop. It's better suited to something like a server compacting its heap overnight.
For more details on the dangers of LOH fragmentation, read the “The dangers of the Large Object Heap” article.
Upvotes: 0
Reputation: 2604
It is possible that objects of your Program type are allocated on the LOH (Large Object Heap).
From MSDN:
The LOH is used for allocating memory for large objects (such as arrays) that require more than 85,000 bytes.
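You can observe that threshold directly: freshly allocated LOH objects already report generation 2, while small objects start in generation 0. The array sizes below are just examples on either side of the 85,000-byte limit:

```csharp
using System;

class LohDemo
{
    static void Main()
    {
        var small = new byte[80_000]; // under 85,000 bytes: small object heap
        var large = new byte[90_000]; // over 85,000 bytes: large object heap

        // LOH objects are treated as generation 2 from the moment
        // they are allocated.
        Console.WriteLine(GC.GetGeneration(small)); // 0
        Console.WriteLine(GC.GetGeneration(large)); // 2
    }
}
```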
To force the GC to compact the LOH when calling Collect(), you need to set the GCSettings.LargeObjectHeapCompactionMode property to GCLargeObjectHeapCompactionMode.CompactOnce, like this:
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();
Without this flag, the GC collects the object but does not compact the LOH, and as a result the LOH becomes fragmented (some memory regions will not be usable).
You can read more about it here.
Note that this property is available starting from .NET Framework 4.5.1.
Upvotes: 1