Reputation: 225
I am working on a program that reads a 312 MB encrypted file into a memory stream, decrypts it, and copies it into a destination stream. My program works well with files of around 120 MB. I can't figure out why it is failing for this one.
My system info: 64-bit CPU, RAM: 128 GB. The C# code is built with the Any CPU setting in Configuration Manager.
I wrote a sample program to check where I run out of memory, and I see that it fails at around 512 MB. I know that a MemoryStream requires contiguous blocks of memory, which can fail if the address space is fragmented. But the RAM size is huge here, and I tried on multiple machines as well, with 14 GB, 64 GB, and 8 GB of RAM.
Any help is appreciated.
The sample program I wrote to test the out-of-memory size:
const int bufferSize = 4096;
byte[] buffer = new byte[bufferSize];
int fileSize = 1000 * 1024 * 1024;   // try to write 1000 MB
int total = 0;
try
{
    using (MemoryStream memory = new MemoryStream())
    {
        // Keep appending 4 KB chunks until the target size is reached or memory runs out.
        while (total < fileSize)
        {
            memory.Write(buffer, 0, bufferSize);
            total += bufferSize;
        }
    }
    Console.WriteLine("No errors");
}
catch (OutOfMemoryException)
{
    Console.WriteLine("OutOfMemory around size : " + (total / (1024m * 1024m)) + " MB");
}
Upvotes: 1
Views: 234
Reputation: 510
Try disabling the "Prefer 32-bit" option in the project's properties, on the "Build" tab; that works for me. Good luck!
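If you want to confirm the process bitness before and after changing the setting, a minimal check (assuming a console app) is:

using System;

class BitnessCheck
{
    static void Main()
    {
        // A 32-bit process (what "Prefer 32-bit" gives you on a 64-bit OS) only has
        // roughly 2 GB of virtual address space, which is why large MemoryStreams
        // fail there even on machines with plenty of RAM.
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("IntPtr.Size:    " + IntPtr.Size); // 4 = 32-bit, 8 = 64-bit
    }
}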
Upvotes: 1
Reputation: 6212
You are probably just running out of Large Object Heap space. However, another approach to solving your problem is to not read the stream into memory at all: most decryption APIs just want a System.IO.Stream, so reading it into memory is a relatively pointless step. Just pass the decryption API your incoming file or network stream instead.
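For example, a streaming decrypt could look something like the sketch below (this assumes the file was encrypted with AES and you already have the key and IV; the method name and paths are placeholders):

using System.IO;
using System.Security.Cryptography;

class StreamingDecryptExample
{
    static void DecryptFile(string inputPath, string outputPath, byte[] key, byte[] iv)
    {
        using (Aes aes = Aes.Create())
        using (ICryptoTransform decryptor = aes.CreateDecryptor(key, iv))
        using (FileStream input = File.OpenRead(inputPath))
        using (CryptoStream cryptoStream = new CryptoStream(input, decryptor, CryptoStreamMode.Read))
        using (FileStream output = File.Create(outputPath))
        {
            // The data is decrypted in small chunks as it is copied, so memory
            // use stays flat regardless of the file size.
            cryptoStream.CopyTo(output);
        }
    }
}

No MemoryStream is involved, so there is no single 312 MB (or larger) contiguous allocation to fail.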
Upvotes: 3