Jeeva

Reputation: 4663

Out of memory exception when pulling huge data from DB

We are pulling a huge amount of data from a SQL Server DB: around 25,000 rows with 2,500 columns. The requirement is to read the data and export it to a spreadsheet, so pagination is not an option. With smaller amounts the code pulls the data fine, but when it grows to the size mentioned above it throws an out of memory exception.

public DataSet Exportexcel(string Username)
{
    Database db = DatabaseFactory.CreateDatabase(Config);
    DbCommand dbCommand = db.GetStoredProcCommand("Sp_ExportADExcel");
    db.AddInParameter(dbCommand, "@Username", DbType.String, Username);
    return db.ExecuteDataSet(dbCommand);
}

Please help me in resolving this issue.

Upvotes: 3

Views: 7208

Answers (4)

ChrisBD

Reputation: 9209

Your problem is purely down to the fact that you are trying to extract so much data in one go. You may get around the problem by installing more memory in the machine doing the query, but this is just a bodge.

Your best bet is to retrieve such amounts of data in steps.

You could quite easily read the data back row by row and export/append it to a file in CSV format, and this could all be done in a stored procedure.
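
For illustration, here is a minimal sketch of the row-by-row approach on the client side, using a plain SqlDataReader against the stored procedure from the question. The connection string, file path, and the naive comma-joining are assumptions, not production-ready CSV escaping:

using System.Data;
using System.Data.SqlClient;
using System.IO;

class CsvExport
{
    static void ExportToCsv(string connectionString, string username, string path)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("Sp_ExportADExcel", connection))
        using (var writer = new StreamWriter(path))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Username", username);
            connection.Open();

            // SequentialAccess streams rows instead of buffering the whole result set
            using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                // Header row from the result set metadata
                var names = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    names[i] = reader.GetName(i);
                writer.WriteLine(string.Join(",", names));

                // Only one row is held in memory at a time
                var values = new object[reader.FieldCount];
                while (reader.Read())
                {
                    reader.GetValues(values);
                    writer.WriteLine(string.Join(",", values)); // naive CSV: no quoting/escaping
                }
            }
        }
    }
}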

You are using SQL Server, and handling such large amounts of data is exactly what database engines are designed to cope with.

Other than that, when handling large quantities of data objects in C# code, it's best to look into using generic collections: with value types they avoid the boxing that the non-generic collections impose, and so reduce the memory footprint.
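
A small sketch of that difference (the values are hypothetical, just to show the boxing):

using System.Collections;
using System.Collections.Generic;

// Non-generic collection: each int is boxed to an object, an extra heap allocation per element
ArrayList boxed = new ArrayList();
boxed.Add(42);

// Generic collection: the ints are stored inline in the list's backing array, no boxing
List<int> unboxed = new List<int>();
unboxed.Add(42);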

Upvotes: 1

Mathias F

Reputation: 15891

If it is enough to have the data available in Excel as CSV, you can use the bcp bulk copy utility:

bcp "select col1, col2, col3 from database.schema.SomeTable" queryout  "c:\MyData.txt"  -c -t"," -r"\n" -S ServerName -T

This is orders of magnitude faster and has a very small memory footprint.

Upvotes: 0

RSP

Reputation: 231

You can use batch-processing logic to fetch records in batches, say 5,000 records per execution, and store the results in a temporary dataset. Once all batches have been processed, dump the data from the temporary dataset to Excel.
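
A rough sketch of that batching loop, assuming SQL Server 2012 or later (for OFFSET/FETCH) and an illustrative table dbo.SomeTable with a stable key column Id (both names are assumptions):

using System.Data;
using System.Data.SqlClient;

class BatchFetch
{
    // Fetches all records in fixed-size batches and accumulates them in one DataTable
    static DataTable FetchAllInBatches(string connectionString, int batchSize)
    {
        var result = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            int offset = 0;
            while (true)
            {
                const string sql =
                    @"SELECT * FROM dbo.SomeTable
                      ORDER BY Id
                      OFFSET @Offset ROWS FETCH NEXT @BatchSize ROWS ONLY";
                using (var command = new SqlCommand(sql, connection))
                using (var adapter = new SqlDataAdapter(command))
                {
                    command.Parameters.AddWithValue("@Offset", offset);
                    command.Parameters.AddWithValue("@BatchSize", batchSize);
                    var batch = new DataTable();
                    adapter.Fill(batch);
                    if (batch.Rows.Count == 0)
                        break;                // no more records
                    result.Merge(batch);      // accumulate into the temp dataset
                    offset += batch.Rows.Count;
                }
            }
        }
        return result;
    }
}

Note that the merged DataTable still ends up fully in memory; the batching only keeps each individual query small, so if memory remains tight it is better to write each batch straight to the output file instead of merging.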

You can use the C# SqlBulkCopy class for this purpose.

Upvotes: 0

Haris Hasan

Reputation: 30097

The requirement is to read the data and export it to a spreadsheet, so pagination is not an option.

Why not read the data in steps? Instead of getting all the records at once, fetch a limited number of records each time and write them to Excel. Continue until you have processed all of the records.
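
A sketch of one way to do that, assuming an indexed, monotonically increasing key column Id on an illustrative table dbo.SomeTable (both assumptions), writing each chunk out as it is read:

using System;
using System.Data.SqlClient;
using System.IO;

class SteppedExport
{
    // Keyset paging: WHERE Id > @LastId stays fast even for late chunks,
    // unlike OFFSET-based paging, and works on any SQL Server version.
    static void ExportInSteps(string connectionString, string path, int chunkSize)
    {
        long lastId = 0;
        using (var connection = new SqlConnection(connectionString))
        using (var writer = new StreamWriter(path))
        {
            connection.Open();
            while (true)
            {
                const string sql =
                    @"SELECT TOP (@ChunkSize) *
                      FROM dbo.SomeTable
                      WHERE Id > @LastId
                      ORDER BY Id";
                int rows = 0;
                using (var command = new SqlCommand(sql, connection))
                {
                    command.Parameters.AddWithValue("@ChunkSize", chunkSize);
                    command.Parameters.AddWithValue("@LastId", lastId);
                    using (var reader = command.ExecuteReader())
                    {
                        var values = new object[reader.FieldCount];
                        while (reader.Read())
                        {
                            reader.GetValues(values);
                            writer.WriteLine(string.Join(",", values)); // naive CSV
                            lastId = Convert.ToInt64(reader["Id"]);     // remember last key
                            rows++;
                        }
                    }
                }
                if (rows < chunkSize)
                    break; // last chunk reached
            }
        }
    }
}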

Upvotes: 4
