Reputation: 137
I am getting an out of memory exception while executing mydbcontext.SaveChanges().
I need to insert or update 10000+ records into several tables, i.e.
//insert into table 1 // 10000 records
//insert or update records in table 2 // 25000 records
//insert into table 3 // 1000 records
....
.....
.....
mydbcontext.SaveChanges(); // the exception is thrown on this line
I got "Exception of type 'System.OutOfMemoryException' was thrown" error.
I tried AutoDetectChangesEnabled but its did not help
mydbcontext.Configuration.AutoDetectChangesEnabled = false;
mydbcontext.SaveChanges();
Upvotes: 0
Views: 2120
Reputation: 362
context.BulkInsert(entitiesList);
For BulkInsert, see https://github.com/borisdj/EFCore.BulkExtensions
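For example, here is a minimal sketch applied to the scenario in the question (MyDbContext and the list variable names are assumptions; BulkInsertOrUpdate is the library's insert-or-update counterpart to BulkInsert):

using EFCore.BulkExtensions;

using (var context = new MyDbContext())           // assumed context type
{
    context.BulkInsert(table1Records);             // ~10000 rows to insert
    context.BulkInsertOrUpdate(table2Records);     // ~25000 rows to insert or update
    context.BulkInsert(table3Records);             // ~1000 rows to insert
}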
Another option is SqlBulkCopy; see "Fastest Way of Inserting in Entity Framework".
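A rough SqlBulkCopy sketch is below; the table and column names are only placeholders. It writes a DataTable directly to SQL Server, so the EF change tracker never sees the rows:

using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
table.Columns.Add("Title", typeof(string));        // placeholder columns
table.Columns.Add("Author", typeof(string));
foreach (var book in bookList)
    table.Rows.Add(book.Title, book.Author);

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.Books";    // placeholder destination table
    bulkCopy.BatchSize = 5000;                      // stream rows to the server in batches
    bulkCopy.WriteToServer(table);
}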
If you want to call SaveChanges outside the loop, see the code below:
try
{
context.Configuration.AutoDetectChangesEnabled = false; // skip automatic change detection while adding many entities
foreach (var book in bookList)
{
context.Books.Add(book);
}
context.SaveChanges();
}
finally
{
context.Configuration.AutoDetectChangesEnabled = true; // re-enable change detection
}
Upvotes: 0
Reputation: 46
"Exception of type 'System.OutOfMemoryException' was thrown" - This normally happens to me when I don't dispose my database. The issue probably isn't the amount of records you are trying to update, its probably because those records are kept in the System Memory.
Solution 1 (Recommended): Wrap your database operations in 'using' statements, which will automatically dispose your database context and free up memory.
public void CreateRange(List<MyModel> modelList)
{
    using (DbContext db = new DbContext())
    {
        db.MyTable.AddRange(modelList);
        db.SaveChanges();
    }
}
Solution 2: Call 'db.Dispose();' before the end of each method.
public void CreateRange(List<MyModel> modelList)
{
    db.MyTable.AddRange(modelList);
    db.SaveChanges();
    db.Dispose();
}
Upvotes: 2