James123

Reputation: 11662

System.OutOfMemoryException when filling DataAdapter?

I have to pull 150K records from the DB. I am using da.Fill(ds, "Query") and it's throwing a System.OutOfMemoryException.

Dim daGrid As New SqlDataAdapter(sqlcmd_q)
daGrid.Fill(dsGrid, "Query")
daGrid.Dispose()

I need this DataTable only. I cannot use XML, because I need to assign it to an MSChart control to display a scatter plot.

Any suggestions?

Upvotes: 5

Views: 28676

Answers (3)

Brian Cryer

Reputation: 2224

I know that this answer is far too late to help the original poster, but my hope is that it might help others who hit a similar problem.

Firstly, it is the DataTable that is the problem, not the DataAdapter.

The problem might be that you are genuinely out of memory (in which case my answer won't help). You can do the maths to work out whether that might be the case: number of records × a guesstimate of bytes per record. If that is approaching 2GB on a 32-bit platform, or your available RAM on a 64-bit platform, then your only options are to reduce the number of records, reduce the number of fields, or come up with an approach that uses a DataReader instead of a DataTable.

In your case you have 150K records; let's assume each one requires 1KB of memory, which gives a round figure of 150MB. Even on a 32-bit machine with 2GB of RAM that should be fine (provided there aren't many similar memory allocations going on). In your case you have a 64-bit machine with 128GB of RAM (nice), so logically you shouldn't be getting out of memory errors.

So what is causing the problem? It is the Large Object Heap (LOH). Why? The DataTable creates an array to hold its records. My understanding is that it starts with an array of 50 entries and grows it as records are added. Any memory allocation of 85,000 bytes or more comes from the large object heap. (You were on a 64-bit platform, where each row reference is 8 bytes, so that works out that once you hit 10,625 records the row array starts being allocated on the Large Object Heap.)

The trouble with the Large Object Heap is that it isn't compacted. So there might be lots of free space, but no single contiguous block that is large enough. With .NET 4.5 Microsoft improved it in terms of coalescing adjacent free fragments, but it won't reorganize live objects to create larger blocks of free space. The net effect is that once you stray into the LOH, in my experience it is only a matter of time before you get an OutOfMemoryException.
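To make that threshold concrete, here is a back-of-envelope check (assuming 8-byte object references on a 64-bit platform and the 85,000-byte LOH threshold described above):

// Rough calculation only: an array of row references crosses the LOH
// threshold once the references themselves need 85,000 bytes.
const int lohThresholdBytes = 85000;  // allocations this size or larger go on the LOH
const int bytesPerReference = 8;      // size of an object reference on 64-bit
Console.WriteLine(lohThresholdBytes / bytesPerReference);  // prints 10625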

The solution?

What worked for me was to set the initial capacity of the DataTable (its MinimumCapacity property). When pulling records from the database this means doing a count first, so it comes at the expense of an extra database query, then:

.
.
dsGrid.Tables.Add(new DataTable("Query") { MinimumCapacity = count });
daGrid.Fill(dsGrid, "Query");
.
.

Whilst that doesn't avoid straying into the LOH, it should mean that it only does one allocation instead of multiple. So as well as avoiding the out of memory exception you should also get a performance gain (offset by the need for the extra database query).
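For completeness, a minimal sketch of that extra count query (the connection, table name and filter here are assumptions; it should mirror whatever the main query selects):

// Hypothetical count query; the table and WHERE clause must match the main query.
int count;
using (var countCmd = new SqlCommand("SELECT COUNT(*) FROM SomeTable WHERE SomeFilter = @p", connection))
{
    countCmd.Parameters.AddWithValue("@p", filterValue);
    count = (int)countCmd.ExecuteScalar();
}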

You can make the .NET garbage collector compact the large object heap, but you can only tell it to do so the next time it runs a full collection. I tend to use this if I know that I'm straying into the large object heap. It might be overkill, but consider amending my suggestion above to:

.
.
dsGrid.Tables.Add(new DataTable("Query") { MinimumCapacity = count });
if (count > 10625)
{
    System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
        System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
}
daGrid.Fill(dsGrid, "Query");
.
.

Upvotes: 4

TToni

Reputation: 9391

You didn't specify the query. Make sure it contains only the columns you need.

If you still have problems you can try to switch to 64 bit (if your hardware supports it and you have more than 2 GB of free memory).

If that doesn't help, you have to reduce the memory footprint. A possible option would be to render the plot without storing all the base data in memory: load the records one by one, calculate the coordinates and store those without keeping the underlying record. Maybe you can even let the query do that.
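As a rough sketch of that approach (not from the original answer; the connection, SQL, column indexes and series name are all assumptions), the chart can be fed directly from a SqlDataReader so the 150K rows never sit in a DataTable:

// Stream rows straight into the MSChart series; only the plotted points stay in memory.
using (var cmd = new SqlCommand("SELECT X, Y FROM SomeTable", connection))
using (var reader = cmd.ExecuteReader())
{
    var points = chart.Series["Scatter"].Points;  // assumed pre-configured scatter series
    while (reader.Read())
    {
        points.AddXY(reader.GetDouble(0), reader.GetDouble(1));
    }
}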

Upvotes: 4

Chris Shain

Reputation: 51369

The first thing that I'd check is how many columns you are returning, and what their data types are. Although 150K records is a lot, it shouldn't give you an OOM exception unless each record is about 13K in length (on a 32-bit machine). This suggests to me that you are either returning way more fields than you need, or perhaps that some of the fields are very large strings or binary data. Try cutting down the select statement to only return the fields that are absolutely needed for the display.

If that doesn't work, you may need to move from a DataTable to a list of a custom data type (a class with the appropriate fields).
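A hedged sketch of what that could look like (the class, column indexes and series name here are illustrative, not part of the original answer):

// Lightweight type holding only the fields the scatter plot needs.
public class PlotPoint
{
    public double X { get; set; }
    public double Y { get; set; }
}

// Fill a List<PlotPoint> from a DataReader instead of a DataTable.
var points = new List<PlotPoint>();
using (var reader = sqlcmd_q.ExecuteReader())
{
    while (reader.Read())
    {
        points.Add(new PlotPoint { X = reader.GetDouble(0), Y = reader.GetDouble(1) });
    }
}

// Bind the list to the MSChart series by property name.
chart.Series["Scatter"].Points.DataBind(points, "X", "Y", "");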

Upvotes: 7
