Reputation: 4477
We have various tests that monitor our application's (or its sub-components') memory usage under various scenarios, to check for regressions. The problem is that our measurements (using Process.Refresh() followed by Process.PrivateMemorySize64) fluctuate wildly over different runs.
What we are currently doing is polling via a background thread every X
milliseconds. We compare the max value reached to a benchmark, and pass/fail accordingly.
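In essence, the sampler looks something like this (a simplified sketch, not our exact test code; the class and member names are illustrative):

    using System;
    using System.Diagnostics;
    using System.Threading;

    // Minimal sketch of a peak-memory sampler; names are illustrative only.
    class PeakMemorySampler
    {
        private readonly Process _process = Process.GetCurrentProcess();
        private long _peakBytes;
        private volatile bool _running = true;

        public long PeakBytes { get { return _peakBytes; } }

        public void Run(int pollIntervalMs)
        {
            while (_running)
            {
                _process.Refresh();                     // refresh cached counter values
                long current = _process.PrivateMemorySize64;
                if (current > _peakBytes) _peakBytes = current;
                Thread.Sleep(pollIntervalMs);
            }
        }

        public void Stop() { _running = false; }
    }

We start Run on a background thread before the scenario, call Stop afterwards, and compare PeakBytes against the benchmark.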
Interestingly, reducing the time between polls significantly reduces the maximum memory value recorded. In one example, reading the memory every 100ms gives a maximum of 360MB, every 10ms gives 147MB, and every 1ms gives 35MB. Presumably the increased number of observations somehow makes GC behaviour more aggressive.
I suppose that the main problem is that what we are trying to measure, "maximum memory usage", is not particularly well defined. If there's plenty of memory available, the GC might not bother to kick in, so memory usage will appear higher.
Is there a standard way to measure memory usage under dot net to guard against performance degradation?
Upvotes: 1
Views: 342
Reputation: 99859
"I suppose that the main problem is that what we are trying to measure, 'maximum memory usage', is not particularly well defined. If there's plenty of memory available, the GC might not bother to kick in, so memory usage will appear higher."
This is your problem. It's not arbitrary that a garbage-collected runtime uses more memory when more is available. These algorithms perform substantially better when they have a larger memory space to "play around in". For an objective analysis, you must use a tool that analyzes only the reachable set, which is not affected by garbage collection behavior.
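If a full profiler is too heavy for an automated test, one cheap approximation of the reachable set on the managed heap is GC.GetTotalMemory(true), which forces a full collection before reporting. A sketch:

    // Forcing a full collection first means the returned figure approximates
    // the live (reachable) managed heap, rather than live objects plus dead
    // objects that simply haven't been collected yet.
    long liveBytes = GC.GetTotalMemory(true);
    Console.WriteLine("Approx. reachable managed bytes: " + liveBytes);

Note that this covers only the managed heap, whereas PrivateMemorySize64 also includes unmanaged allocations.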
Upvotes: 1
Reputation: 28970
You can use the PerformanceCounter class to measure your memory usage.
Link: http://msdn.microsoft.com/en-us/library/system.diagnostics.performancecounter.aspx
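For example (a sketch; the ".NET CLR Memory" category and its "# Bytes in all Heaps" counter report the managed heap for a given process instance):

    using System;
    using System.Diagnostics;

    class MemoryCounterDemo
    {
        static void Main()
        {
            // The instance name for .NET CLR counters is normally the
            // process name without the .exe extension.
            string instance = Process.GetCurrentProcess().ProcessName;
            using (var counter = new PerformanceCounter(
                ".NET CLR Memory", "# Bytes in all Heaps", instance, true))
            {
                Console.WriteLine("# Bytes in all Heaps: " + counter.NextValue());
            }
        }
    }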
For manual inspection, you can use a memory profiler tool such as .NET Memory Profiler.
Link: http://memprofiler.com/
Upvotes: 0