Vitaly.S

Reputation: 19

System.OutOfMemoryException while working with large Lists

I have this code:

this.weights_StoA = new List<List<double>>();

if (NETWORK_MODE == 0)
{
    Random rand = new Random();

    int count = enters.Count;

Parallel.For(0, HIDDEN_NEURONS_COUNT, (i, loopState) =>
    {
        List<double> weights = new List<double>();

        for (int j = 0; j < count; j++)
        {
            weights.Add(rand.NextDouble());
        }

        lock (weights_StoA)
        {
            weights_StoA.Add(weights);
        }
    });
}

weights_StoA is a List<List<double>>.

I'm working with large arrays: HIDDEN_NEURONS_COUNT = 63480 and enters.Count = 126960. This code throws System.OutOfMemoryException. I tried changing the target architecture to x64, but it still throws the same exception.

How can I fix this? I would be very grateful for any help solving this problem!

Upvotes: 0

Views: 3123

Answers (2)

Eugene

Reputation: 2918

The .NET garbage collector does not compact large objects, to avoid the performance impact. Thus, you have two options:

  1. Allocate once the array for large data.

  2. Periodically set the property GCSettings.LargeObjectHeapCompactionMode to GCLargeObjectHeapCompactionMode.CompactOnce. The next GC invocation will compact the large object heap, and the setting will then reset to its default value. See https://msdn.microsoft.com/en-us/library/system.runtime.gcsettings.largeobjectheapcompactionmode(v=vs.110).aspx
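A minimal sketch of option 2 (this relies on GCSettings.LargeObjectHeapCompactionMode, available since .NET Framework 4.5.1; the forced GC.Collect() call here is just one way to trigger the compacting collection):

```csharp
using System;
using System.Runtime;

// Request a one-time compaction of the Large Object Heap (LOH).
// The flag applies to the next blocking full garbage collection
// and then resets itself to Default automatically.
GCSettings.LargeObjectHeapCompactionMode =
    GCLargeObjectHeapCompactionMode.CompactOnce;

GC.Collect(); // blocking full GC; performs the LOH compaction
```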

Upvotes: 0

Dai

Reputation: 155523

Disregarding the fact that your program needs on the order of 100 GB of RAM to operate: if you know the size of a list beforehand, then either preallocate it or use a fixed-size array. This avoids dynamic resizing and reallocations:

List<double> weights = new List<double>( count );
for( int j = 0; j < count; j++ )
{
     weights.Add( rand.NextDouble() );
}

or:

double[] weights = new double[count];
for( int j = 0; j < count; j++ )
{
     weights[j] = rand.NextDouble();
}
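For scale, a quick back-of-envelope check of the raw data size implied by the numbers in the question (doubles only, ignoring per-List object overhead and transient copies made while lists grow):

```csharp
using System;

long rows = 63480;   // HIDDEN_NEURONS_COUNT
long cols = 126960;  // enters.Count

// 8 bytes per double; use long arithmetic to avoid int overflow.
long bytes = rows * cols * sizeof(double);

Console.WriteLine(bytes); // 64,475,366,400 bytes, i.e. roughly 60 GiB
```

That is the floor for the payload alone; List&lt;double&gt; headers and the capacity-doubling growth strategy push the peak working set well beyond it, which is why preallocating (or using plain arrays) matters here.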

Upvotes: 2
