Reputation: 71
I have a rather simple method in a data model class that does nothing more than insert a single row and save the changes. It correctly processes about 227 records [out of 408] and then throws an OutOfMemoryException. This happens even though we're only inserting single rows and saving the changes one at a time. Any ideas how to resolve this?
protected ADMIN_DB_Entities _AdminEntities = new ADMIN_DB_Entities();
public void InsertBase64Payload(string sBase64PayloadValue, string sSourceValue, Guid gSourcePrimaryKey, DateTime dDateProcessed)
{
    Base64Payload newBase64PayLoadEntry = new Base64Payload();
    newBase64PayLoadEntry.BASE64_VALUE = sBase64PayloadValue;
    newBase64PayLoadEntry.SOURCE_TABLE = sSourceValue;
    newBase64PayLoadEntry.SOURCE_PRIMARY_KEY = gSourcePrimaryKey;
    newBase64PayLoadEntry.DATE_PROCESSED = dDateProcessed;

    try
    {
        _AdminEntities.Base64Payload.Add(newBase64PayLoadEntry);
        _AdminEntities.SaveChanges();
    }
    catch (Exception ex)
    {
        ConsoleAppLog.WriteLog(<Error Message here.>);
    }
}
Upvotes: 2
Views: 126
Reputation: 3910
I presume you are working with very large base64 "payloads".
Entity Framework's DbContext keeps the state of every tracked entity in memory. So even after you save your changes to the database, those values remain in process memory. DbContext implements the IDisposable interface, so in scenarios like this it's better to dispose of the context once the data has been saved:
using (var entities = new ADMIN_DB_Entities())
{
    try
    {
        entities.Base64Payload.Add(newBase64PayLoadEntry);
        entities.SaveChanges();
    }
    catch (Exception ex)
    {
        ConsoleAppLog.WriteLog(ex.Message);
    }
}
Note: Keep in mind that there is also a mechanism for attaching/detaching a particular entity from the context's internal change tracker, so if needed you can keep working with a single DbContext instance.
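For example, a minimal sketch of the detach approach (assuming EF 6's DbContext.Entry API and the same generated ADMIN_DB_Entities / Base64Payload types from your question):

```csharp
using System.Data.Entity; // EF 6: provides EntityState

// Build and save the entity as before, using the long-lived context.
var newBase64PayLoadEntry = new Base64Payload
{
    BASE64_VALUE = sBase64PayloadValue,
    SOURCE_TABLE = sSourceValue,
    SOURCE_PRIMARY_KEY = gSourcePrimaryKey,
    DATE_PROCESSED = dDateProcessed
};

_AdminEntities.Base64Payload.Add(newBase64PayLoadEntry);
_AdminEntities.SaveChanges();

// Setting the state to Detached removes the entity (and its large
// BASE64_VALUE string) from the change tracker, so the garbage
// collector can reclaim it instead of it accumulating across 400+ rows.
_AdminEntities.Entry(newBase64PayLoadEntry).State = EntityState.Detached;
```

With this pattern the context's tracked-entity set stays small, at the cost of losing change tracking for the detached rows, which is fine for insert-only workloads like yours.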
Upvotes: 2