Santosh

Reputation: 660

Making sense of Performance Profiler output for a leaky application?

I am looking for memory and resource leaks in a .NET project, since the application eventually crashes with generic GDI+ exceptions and sometimes OutOfMemory errors. The library I am trying to analyse uses System.Drawing (GDI+) APIs but doesn't draw directly to the GUI. I am using Visual Studio 2019 Enterprise edition and its built-in Performance Profiler tool to analyze memory usage; the tool is set up to capture both managed and native heap usage. I have taken two snapshots: the first is a baseline taken before the library is invoked, and the second is taken after calling the library method repeatedly in a loop and finally calling GC.Collect() and GC.WaitForPendingFinalizers(). A rough sketch of the measurement loop is shown below, followed by screenshots of the memory usage graphs (the first for the Managed Heap, the second for the Native Heap).
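The library call and its return type in this sketch are only placeholders for the actual code, which I haven't included here; the structure of the loop and the GC calls are what matters.

using System;
using System.Drawing;

class LeakRepro
{
    // Placeholder for the actual library method; the real one returns
    // some IDisposable result that is disposed after each use.
    static IDisposable CallLibrary() => new Bitmap(1024, 1024);

    static void Main()
    {
        // Snapshot #1 (baseline) is taken here, before the library is invoked.

        for (int i = 0; i < 400; i++)
        {
            using (var result = CallLibrary())
            {
                // work with the result; Dispose() runs at the end of the using block
            }
        }

        // Full collection plus finalizers before Snapshot #2, so anything
        // still held afterwards is a genuine leak candidate.
        GC.Collect();
        GC.WaitForPendingFinalizers();

        // Snapshot #2 is taken here.
        Console.ReadLine();
    }
}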

Profiler output showing Managed Heap

Profiler output showing Native Heap

Process Explorer entry after the alleged leak

Analysis of memory dump in WinDbg

--- Usage Summary ---------------- RgnCount ----------- Total Size -------- %ofBusy %ofTotal
Free                                    891     7ffd`bb247000 ( 127.991 TB)           99.99%
<unknown>                               916        2`3b96d000 (   8.931 GB)  98.40%    0.01%
Image                                   471        0`0837e000 ( 131.492 MB)   1.41%    0.00%
Heap                                     35        0`007e4000 (   7.891 MB)   0.08%    0.00%
Stack                                    21        0`00700000 (   7.000 MB)   0.08%    0.00%
Other                                     8        0`001cb000 (   1.793 MB)   0.02%    0.00%
TEB                                       7        0`0000e000 (  56.000 kB)   0.00%    0.00%
PEB                                       1        0`00001000 (   4.000 kB)   0.00%    0.00%

--- Type Summary (for busy) ------ RgnCount ----------- Total Size -------- %ofBusy %ofTotal
MEM_PRIVATE                             944        2`395e9000 (   8.896 GB)  98.02%    0.01%
MEM_IMAGE                               491        0`08a1a000 ( 138.102 MB)   1.49%    0.00%
MEM_MAPPED                               24        0`02da6000 (  45.648 MB)   0.49%    0.00%

--- State Summary ---------------- RgnCount ----------- Total Size -------- %ofBusy %ofTotal
MEM_FREE                                891     7ffd`bb247000 ( 127.991 TB)           99.99%
MEM_COMMIT                             1379        1`287f4000 (   4.633 GB)  51.05%    0.00%
MEM_RESERVE                              80        1`1c5b5000 (   4.443 GB)  48.95%    0.00%

--- Protect Summary (for commit) - RgnCount ----------- Total Size -------- %ofBusy %ofTotal
PAGE_READWRITE                         1010        1`1e71f000 (   4.476 GB)  49.31%    0.00%
PAGE_EXECUTE_READ                        52        0`05ec3000 (  94.762 MB)   1.02%    0.00%
PAGE_READONLY                           180        0`032ca000 (  50.789 MB)   0.55%    0.00%
PAGE_WRITECOPY                          113        0`00ed6000 (  14.836 MB)   0.16%    0.00%
PAGE_EXECUTE_READWRITE                   17        0`00051000 ( 324.000 kB)   0.00%    0.00%
PAGE_READWRITE|PAGE_GUARD                 7        0`00021000 ( 132.000 kB)   0.00%    0.00%

--- Largest Region by Usage ----------- Base Address -------- Region Size ----------
Free                                      1`3dbe9000     7ff2`c85d7000 ( 127.948 TB)
<unknown>                              7ff4`06370000        1`00020000 (   4.000 GB)
Image                                  7ffd`63eba000        0`0103d000 (  16.238 MB)
Heap                                      0`1ded2000        0`0017d000 (   1.488 MB)
Stack                                     0`1bac0000        0`000fa000 (1000.000 kB)
Other                                     0`017d0000        0`00181000 (   1.504 MB)
TEB                                       0`00e0d000        0`00002000 (   8.000 kB)
PEB                                       0`00e0c000        0`00001000 (   4.000 kB)

What is puzzling to me is that the Memory Usage (Private Bytes) graph keeps growing (up to 5 GB), yet Snapshot #2 shows only about 2.55 MB of native heap and about 316 KB of managed heap usage. The library function is being called about 400 times just to exacerbate the problem, and the Private Bytes usage is proportional to the loop count. Although the object returned by the library method is disposed after each iteration, the memory usage never reaches a steady state and keeps increasing as the loop count increases. This, together with the eventual GDI+ exceptions, suggests to me that the library is leaking handles or memory, but the profiler output doesn't seem to indicate this. I'd be glad if someone could throw some light on this and help me understand what I am seeing here.
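One way to tell a handle leak apart from a pure memory leak would be to log the process's GDI handle count alongside Private Bytes on each iteration, e.g. via user32's GetGuiResources. A minimal sketch of such a check (the class and method names are just for illustration):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class HandleMonitor
{
    // user32!GetGuiResources returns the number of GDI (flag 0) or USER
    // (flag 1) handles currently owned by the given process.
    [DllImport("user32.dll")]
    static extern uint GetGuiResources(IntPtr hProcess, uint uiFlags);

    const uint GR_GDIOBJECTS = 0;

    public static void Report(int iteration)
    {
        using (var p = Process.GetCurrentProcess())
        {
            uint gdiHandles = GetGuiResources(p.Handle, GR_GDIOBJECTS);
            long privateMb = p.PrivateMemorySize64 / (1024 * 1024);
            Console.WriteLine($"iter {iteration}: GDI handles = {gdiHandles}, private bytes = {privateMb} MB");
        }
    }
}

If the GDI handle count stays roughly flat while Private Bytes keeps climbing, the problem would be unreleased memory rather than leaked handles (a process is limited to about 10,000 GDI handles by default, and exhausting that limit also produces generic GDI+ errors).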

Upvotes: 3

Views: 177

Answers (1)

Santosh

Reputation: 660

As @ThomasWeller pointed out, it turned out to be a leak of image data. I thought I'd post my findings here as an answer in the hope that they may help someone.

After spending some time with WinDbg I still couldn't find anything that helped me solve the leak, which was perhaps due to my lack of expertise with the tool. I then came across a suggestion to use the VMMap utility from SysInternals. I found it much easier to use, though obviously not as powerful as WinDbg. Shown below is a screenshot of VMMap after the leak has occurred.

VMMap view of the memory usage

As can be seen, the heap usage is only about 8 MB while the private data is about 9 GB. Clicking on the Private Data entry revealed hundreds of same-sized allocations. Given the nature of the library, this pointed to some kind of image data. My first suspicion was Bitmap objects not being disposed, but looking at every usage of Bitmap in the code didn't turn up anything. However, while doing so I found a call to Bitmap.LockBits that had no corresponding UnlockBits call. This turned out to be the cause of the leak.
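For anyone hitting the same thing, the fix is to pair every LockBits call with UnlockBits, ideally in a try/finally so it also runs when an exception is thrown. The method and pixel format below are only an illustration of the pattern, not the actual library code:

using System.Drawing;
using System.Drawing.Imaging;

static class BitmapProcessing
{
    // Illustrative only: lock the pixel buffer, work on it, and always unlock.
    public static void ProcessPixels(Bitmap bmp)
    {
        var rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
        BitmapData data = bmp.LockBits(rect, ImageLockMode.ReadWrite, PixelFormat.Format32bppArgb);
        try
        {
            // read/write pixels via data.Scan0 and data.Stride here
        }
        finally
        {
            // The missing call in my case: without it, the locked pixel data
            // stayed allocated on every iteration and showed up as Private Data.
            bmp.UnlockBits(data);
        }
    }
}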

Upvotes: 3
