Manuel

Reputation: 2203

Why does a .NET program use more VirtualMemorySize64 when the computer has more RAM?

I've created a simple test application which allocates 100 MB using byte arrays.

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Text;

namespace VirtualMemoryUsage
{
class Program
{
    static void Main(string[] args)
    {
        StringBuilder sb = new StringBuilder();
        sb.AppendLine($"IsServerGC = {System.Runtime.GCSettings.IsServerGC}");

        const int tenMegabyte = 1024 * 1024 * 10;
        long allocatedMemory = 0;
        List<byte[]> memory = new List<byte[]>();
        for (int i = 0; i < 10; i++)
        {
            //alloc 10 mb memory
            memory.Add(new byte[tenMegabyte]);
            allocatedMemory += tenMegabyte;
        }

        sb.AppendLine($"Allocated memory:    {PrettifyByte(allocatedMemory)}");
        sb.AppendLine($"VirtualMemorySize64: {PrettifyByte(Process.GetCurrentProcess().VirtualMemorySize64)}");
        sb.AppendLine($"PrivateMemorySize64: {PrettifyByte(Process.GetCurrentProcess().PrivateMemorySize64)}");
        sb.AppendLine();
        Console.WriteLine(sb.ToString());
        Console.ReadLine();
    }

    private static string PrettifyByte(long allocatedMemory)
    {
        string[] sizes = { "B", "KB", "MB", "GB", "TB" };
        double size = allocatedMemory;
        int order = 0;
        while (size >= 1024 && order < sizes.Length - 1)
        {
            order++;
            size /= 1024;
        }
        return $"{size:0.##} {sizes[order]}";
    }
 }
}

Note: For this test it is important to enable the server GC in app.config:

<runtime>
  <gcServer enabled="true"/>  
</runtime>

This then will show the amount of PrivateMemorySize64 and VirtualMemorySize64 allocated by the process.

While PrivateMemorySize64 remains similar on different computers, VirtualMemorySize64 varies quite a bit.

What is the reason for these differences in VirtualMemorySize64 when the same amount of memory is allocated? Is there any documentation about this?

Upvotes: 4

Views: 639

Answers (2)

isp-zax

Reputation: 3873

The metrics you are using do not measure allocated memory, but memory used by the process: one is private, the other is shared with other processes on your machine. The real amount of memory used by the process varies depending on both the amount of available memory and the other processes running.

Edit: the answer by Thomas Weller provides much more detail on this subject than my Microsoft links.

It does not necessarily represent the amount of allocations performed by your application. If you want an estimate of the allocated memory (not including .NET framework libraries, memory pagination overhead, etc.), you can use

long memory = GC.GetTotalMemory(true);

where the true parameter tells the GC to perform a garbage collection first (it doesn't have to otherwise). Unused but not yet collected memory is accounted for in the values you asked about. If the system has enough memory, it might not be collected until it's needed. Here you can find additional information on how the GC works.
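To illustrate, here is a minimal sketch of how the forceFullCollection parameter changes the result. The exact numbers depend on the runtime and heap state, so treat the difference as indicative only:

```csharp
using System;

class GcTotalMemoryDemo
{
    static void Main()
    {
        // Allocate 10 MB, then drop the only reference to it.
        byte[] data = new byte[10 * 1024 * 1024];
        data = null;

        // Without a forced collection, the now-unreachable array
        // may still be counted in the total.
        long withoutCollect = GC.GetTotalMemory(false);

        // forceFullCollection: true runs a blocking collection first,
        // so the dead array is typically excluded from the result.
        long withCollect = GC.GetTotalMemory(true);

        Console.WriteLine($"without collect: {withoutCollect}");
        Console.WriteLine($"with collect:    {withCollect}");
    }
}
```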

Upvotes: 0

Thomas Weller

Reputation: 59279

Wow, you're lucky. On my machine, the last line says 17 GB!

Allocated memory:    100M
VirtualMemorySize64: 17679M
PrivateMemorySize64: 302M

While PrivateMemorySize64 remains similar on different computers [...]

Private bytes are the bytes that belong to your program only; they can hardly be influenced by anything else. They contain what is on your heap and inaccessible to anyone else.

Why is that 302 MB and not just 100 MB? SysInternals VMMap is a good tool to break down that value:

[Image: VMMap breakdown of private bytes]

The colors and sizes of private bytes say:

  • violet (7.5 MB): image files, i.e. DLLs that are not shareable
  • orange (11.2 MB): heap (non-.NET)
  • green (103 MB): managed heap
  • orange (464 kB): stack
  • yellow (161 MB): private data, e.g. TEB and PEB
  • brown (36 MB): page table

As you can see, .NET has just 3 MB overhead in the managed heap. The rest is other stuff that needs to be done for any process.

A debugger or a profiler can help in breaking down the managed heap:

0:013> .loadby sos clr
0:013> !dumpheap -stat
[...]
000007fedac16878      258        11370 System.String
000007fed9bafb38      243        11664 System.Diagnostics.ThreadInfo
000007fedac16ef0       34        38928 System.Object[]
000007fed9bac9c0      510       138720 System.Diagnostics.NtProcessInfoHelper+SystemProcessInformation
000007fedabcfa28        1       191712 System.Int64[]
0000000000a46290      305       736732      Free
000007fedac1bb20       13    104858425 System.Byte[]
Total 1679 objects

So you can see there are some strings and other objects that .NET needs "by default".

What is the reason for this differences in VirtualMemorySize64 when the same amount of memory is allocated?

0:013> !address -summary
[...]
--- State Summary ---------------- RgnCount ----------- Total Size -------- %ofBusy %ofTotal
MEM_FREE                                 58      7fb`adfae000 (   7.983 TB)           99.79%
MEM_RESERVE                              61        4`3d0fc000 (  16.954 GB)  98.11%    0.21%
MEM_COMMIT                              322        0`14f46000 ( 335.273 MB)   1.89%    0.00%

Only 335 MB are committed. That's memory that can actually be used. The 16.954 GB are just reserved. They cannot be used at the moment. They are neither in RAM nor on disk in the page file. Allocating reserved memory is super fast. I've seen that 17 GB value very often, especially in ASP.NET crash dumps.
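The reserve-first, commit-later pattern the CLR uses here can be sketched with the raw Win32 API. This is a Windows-only illustration using the documented kernel32 VirtualAlloc flags, not what the GC literally does internally:

```csharp
using System;
using System.Runtime.InteropServices;

class ReserveVsCommit
{
    const uint MEM_RESERVE = 0x2000;
    const uint MEM_COMMIT = 0x1000;
    const uint PAGE_READWRITE = 0x04;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize,
                                      uint flAllocationType, uint flProtect);

    static void Main()
    {
        // Reserve 1 GB of address space: fast, and it consumes
        // neither RAM nor page file - only address space.
        IntPtr reserved = VirtualAlloc(IntPtr.Zero, (UIntPtr)(1L << 30),
                                       MEM_RESERVE, PAGE_READWRITE);

        // Commit only the first 4 KB page once it is actually needed.
        IntPtr committed = VirtualAlloc(reserved, (UIntPtr)4096,
                                        MEM_COMMIT, PAGE_READWRITE);

        // Touching committed memory is fine; touching merely reserved
        // memory would raise an access violation.
        Marshal.WriteByte(committed, 42);

        Console.WriteLine("reserved 1 GB, first page committed");
    }
}
```

Task Manager and VirtualMemorySize64 count the whole reserved gigabyte, while the commit charge grows only page by page - which is exactly the gap between MEM_RESERVE and MEM_COMMIT in the !address output above.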

Looking at details in VMMap again

[Image: VMMap managed heap details]

we can see that the 17 GB are allocated in one single block. A comment on your question said: "When the system runs out of memory, the garbage collector fires and releases the busy one." However, to release a VirtualAlloc()'d block via VirtualFree(), that block must be logically empty, i.e. there must not be a single .NET object left inside it - and that's unlikely. So it will stay there forever.

What are possible benefits? It's a single contiguous block of address space. If you suddenly needed a 4 GB byte[] now, it would just work.

Finally, the likely reason is: it's done because it doesn't hurt - neither RAM nor disk. And when needed, that memory can be committed at a later point in time.

Is there any documentation about this?

That's unlikely. The GC implementation details could change with the next version of .NET. I think Microsoft does not document it; otherwise people would complain if the behavior changed.

There are people who have written blog posts like this one telling us that some values might depend on the number of processors, for example.

0:013> !eeheap -gc
Number of GC Heaps: 4
[...]

What we see here is that .NET creates as many heaps as there are processors. That's good for garbage collection, since every processor can collect one heap independently.
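Both facts can be checked from code; GCSettings.IsServerGC and Environment.ProcessorCount are part of the framework. Note that IsServerGC only prints True when server GC is actually enabled, e.g. via the gcServer element from the question:

```csharp
using System;
using System.Runtime;

class ServerGcInfo
{
    static void Main()
    {
        // With <gcServer enabled="true"/>, server GC is active and the
        // runtime creates one GC heap per logical processor.
        Console.WriteLine($"IsServerGC:         {GCSettings.IsServerGC}");
        Console.WriteLine($"Logical processors: {Environment.ProcessorCount}");
    }
}
```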

Upvotes: 2
