jheppinstall

Reputation: 2358

Finding how much memory I can allocate for an array in C#

I am doing some calculations that require a large array to be initialized. The maximum size of the array determines the maximum size of the problem I can solve.

Is there a way to programmatically determine how much memory is available for say, the biggest array of bytes possible?

Thanks

Upvotes: 4

Views: 7262

Answers (6)

user113476

Reputation:

The biggest array one can allocate in a .NET program, even a 64-bit one, is 2 GB, because the runtime caps the size of any single object at 2 GB.

You could find out how many available bytes there are easily enough:


using (var pc = new System.Diagnostics.PerformanceCounter("Memory", "Available Bytes"))
{
    float freeBytes = pc.NextValue();
}

Given that information you should be able to make your decision.

Upvotes: 0

Alex

Reputation: 1

I suppose binary search could be a way to go. Start by allocating 1 byte; if that succeeds, release it (set the reference to null so the GC can reclaim it) and double to 2 bytes. Keep doubling until an allocation fails, and you have found a limit you can treat as the lower bound.

The correct number of bytes that can be allocated (call it x) then lies in the interval lower <= x < 2 * lower. Continue narrowing that interval using binary search.
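A sketch of that probe in C# (the class and method names are illustrative; the result is only a snapshot, since GC activity and address-space fragmentation shift the limit from run to run):

```csharp
using System;

static class AllocationProbe
{
    // Returns true if a byte[] of the given size can currently be allocated.
    public static bool TryAllocate(long size)
    {
        try
        {
            byte[] probe = new byte[size];
            GC.KeepAlive(probe);
            return true;               // reference dropped here; GC reclaims it
        }
        catch (OutOfMemoryException) { return false; }
        catch (OverflowException)    { return false; }  // beyond array-length limits
    }

    // Doubling phase finds a successful lower bound; the maximum x then
    // satisfies lower <= x < 2 * lower, which binary search narrows down.
    public static long FindMaxAllocation()
    {
        long lower = 1;
        while (TryAllocate(lower * 2))
            lower *= 2;

        long low = lower, high = lower * 2;   // invariant: low succeeds, high fails
        while (high - low > 1)
        {
            long mid = low + (high - low) / 2;
            if (TryAllocate(mid)) low = mid;
            else high = mid;
        }
        return low;
    }
}
```

Note that each probe briefly allocates the full candidate size, so the search itself perturbs the heap it is measuring.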

Upvotes: 0

David Kirkland

Reputation: 2461

In order to ensure you have enough free memory you could use a MemoryFailPoint. If the memory cannot be allocated, an InsufficientMemoryException will be thrown, which you can catch and handle in an appropriate way.
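A minimal sketch of that pattern (the helper name is illustrative; `MemoryFailPoint` itself lives in `System.Runtime` and takes an estimate in megabytes):

```csharp
using System;
using System.Runtime;

static class GuardedAllocator
{
    // The fail point reserves an estimate of memory and address space up
    // front, so the expensive work only starts when it is likely to finish.
    public static byte[] TryAllocate(int sizeInMegabytes)
    {
        try
        {
            using (new MemoryFailPoint(sizeInMegabytes))
            {
                return new byte[sizeInMegabytes * 1024L * 1024L];
            }
        }
        catch (InsufficientMemoryException)
        {
            return null;   // caller can shrink the problem or report failure
        }
    }
}
```

The gate is a prediction, not a hard reservation: the subsequent allocation can still fail in rare cases, but the fail point turns most failures into a clean, catchable exception before work begins.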

Upvotes: 3

Jonathan Allen

Reputation: 70297

If you need really, really big arrays, don't use the Microsoft CLR. Mono supports 64-bit array indexes, allowing you to take full advantage of your memory resources.

Upvotes: 1

Foredecker

Reputation: 7493

The short answer is "no". There are two top-level resources that would need to be queried:

  1. The largest block of unallocated virtual address space available to the process.
  2. The amount of available page file space.

As Marc Gravell correctly stated, you will have your best success on a 64-bit platform. Here, each process has a huge virtual address space. This will effectively solve your first problem. You should also make sure the page file is large.

But there is a better way, limited only by the free space on your disk: memory-mapped files. You can create a large view (say 512 MB) into an arbitrarily large file and move it as you process your data. Note: be sure to open the file for exclusive access.
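On .NET 4 and later that sliding-window approach can be sketched with the built-in `System.IO.MemoryMappedFiles` API (the method and parameter names here are illustrative, not from the answer):

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

static class SlidingWindow
{
    // Walks a file of fileSize bytes through a mapped window of windowSize
    // bytes, remapping the view as processing advances.
    public static void Process(string path, long fileSize, long windowSize)
    {
        // FileMode.Create opens the backing file exclusively and extends it
        // to the requested capacity.
        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Create,
                                                         null, fileSize))
        {
            for (long offset = 0; offset < fileSize; offset += windowSize)
            {
                long length = Math.Min(windowSize, fileSize - offset);
                using (var view = mmf.CreateViewAccessor(offset, length))
                {
                    view.Write(0, (byte)1);   // process the current window here
                }
            }
        }
    }
}
```

Only one window's worth of address space is committed at a time, so the file can be far larger than what a single array could hold.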

Upvotes: 2

Marc Gravell

Reputation: 1062550

Well, relying on a single huge array has a range of associated issues: memory fragmentation, the need for contiguous blocks, the limit on the maximum object size, etc. If you need a lot of data, I would recommend creating a class that simulates a large array using lots of smaller (but still large) arrays, each of fixed size; the indexer divides the index to find the appropriate array, then uses % to get the offset inside that array.
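A minimal sketch of that divide-and-mod indexer (the class name and page size are illustrative):

```csharp
using System;

class ChunkedByteArray
{
    private const int PageSize = 1 << 20;   // 1 MB pages; any fixed size works
    private readonly byte[][] pages;

    public long Length { get; private set; }

    public ChunkedByteArray(long length)
    {
        Length = length;
        long pageCount = (length + PageSize - 1) / PageSize;
        pages = new byte[pageCount][];
        for (long i = 0; i < pageCount; i++)
        {
            // Last page may be shorter than PageSize.
            long remaining = length - i * PageSize;
            pages[i] = new byte[Math.Min(PageSize, remaining)];
        }
    }

    // Division locates the page; modulo locates the offset within it.
    public byte this[long index]
    {
        get { return pages[index / PageSize][index % PageSize]; }
        set { pages[index / PageSize][index % PageSize] = value; }
    }
}
```

Because the indexer takes a long, the simulated array can exceed the single-object limit: only each individual page, not the whole, has to fit under it. Each page also only needs a small contiguous block, which sidesteps fragmentation.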

You might also want to ensure you are on a 64-bit OS, with lots of memory. This will give you the maximum available head-room.

Depending on the scenario, more sophisticated structures such as sparse arrays, eta-vectors, etc. might be of use to maximise what you can do. You might be amazed what people could do years ago with limited memory, and just a tape spinning backwards and forwards...

Upvotes: 10
