Reputation: 4155
I have an ADO.Net Data Access layer in my application that uses basic ADO.Net coupled with CRUD stored procedures (one per operation, e.g. Select_myTable, Insert_myTable). As you can imagine, in a large system like ours, the number of DB objects required by the DA layer is pretty large.
I've been looking at the possibility of refactoring the layer classes into EF POCO classes. I've managed to do this, but when I performance test it, the results are pretty horrific. Using the class below (create the object, set the key to the desired value, call DataSelect; a sketch of the test loop follows the code), 100,000 runs of data loading take about 47 seconds, even though there are only a handful of records in the DB. The stored procedure method takes about 7 seconds.
I'm looking for advice on how to optimise this. As a point of note, I cannot change the exposed functionality of the layer, only how it implements the methods (i.e. I can't pass responsibility for context ownership up to the BO layer).
Thanks
public class DAContext : DbContext
{
    public DAContext(DbConnection connection, DbTransaction trans)
        : base(connection, false)
    {
        this.Database.UseTransaction(trans);
    }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        base.OnModelCreating(modelBuilder);

        // Stop pluralising the object names for table names.
        modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();

        // Set any property ending in "Key" as a key type.
        modelBuilder.Properties()
                    .Where(prop => prop.Name.ToLower().EndsWith("key"))
                    .Configure(config => config.IsKey());
    }

    public DbSet<MyTable> MyTable { get; set; }
}
public class MyTable : DataAccessBase
{
    #region Properties

    public int MyTableKey { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public bool Active { get; set; }
    public int CreatedBy { get; set; }
    public DateTime CreatedDate { get; set; }
    public int ModifiedBy { get; set; }
    public DateTime ModifiedDate { get; set; }

    #endregion

    #region Constructors

    public MyTable()
    {
        // Set default values.
        Active = true;
        Name = string.Empty;
        CreatedDate = DateTime.MinValue;
        ModifiedDate = DateTime.MinValue;
    }

    #endregion

    #region Methods

    public override void DataSelect(System.Data.SqlClient.SqlConnection connection, System.Data.SqlClient.SqlTransaction transaction)
    {
        using (DAContext ctxt = new DAContext(connection, transaction))
        {
            var limitquery = from C in ctxt.MyTable
                             select C;

            // TODO: Sort the query.
            limitquery = FilterQuery(limitquery);

            var limit = limitquery.FirstOrDefault();
            if (limit != null)
            {
                this.Name = limit.Name;
                this.Description = limit.Description;
                this.Active = limit.Active;
                this.CreatedBy = limit.CreatedBy;
                this.CreatedDate = limit.CreatedDate;
                this.ModifiedBy = limit.ModifiedBy;
                this.ModifiedDate = limit.ModifiedDate;
            }
            else
            {
                throw new ObjectNotFoundException(string.Format("No MyTable with the specified Key ({0}) exists", this.MyTableKey));
            }
        }
    }

    private IQueryable<MyTable> FilterQuery(IQueryable<MyTable> limitQuery)
    {
        if (MyTableKey > 0) limitQuery = limitQuery.Where(C => C.MyTableKey == MyTableKey);
        if (!string.IsNullOrEmpty(Name)) limitQuery = limitQuery.Where(C => C.Name == Name);
        if (!string.IsNullOrEmpty(Description)) limitQuery = limitQuery.Where(C => C.Description == Description);
        if (Active) limitQuery = limitQuery.Where(C => C.Active);
        if (CreatedBy > 0) limitQuery = limitQuery.Where(C => C.CreatedBy == CreatedBy);
        if (ModifiedBy > 0) limitQuery = limitQuery.Where(C => C.ModifiedBy == ModifiedBy);
        if (CreatedDate > DateTime.MinValue) limitQuery = limitQuery.Where(C => C.CreatedDate == CreatedDate);
        if (ModifiedDate > DateTime.MinValue) limitQuery = limitQuery.Where(C => C.ModifiedDate == ModifiedDate);
        return limitQuery;
    }

    #endregion
}
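For completeness, a sketch of the test loop described above; the key value and the ambient connection/transaction are placeholders, not my actual harness:

// 100,000 iterations: create the object, set the key, call DataSelect.
for (int i = 0; i < 100000; i++)
{
    var row = new MyTable { MyTableKey = 1 };   // hypothetical key value
    row.DataSelect(connection, transaction);    // assumed ambient objects
}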
Upvotes: 0
Views: 6262
Reputation: 4797
This might be just a hunch, but... in your stored procedure the filters are well defined and the SP is in a compiled state with a decent execution plan, whereas your EF query gets constructed from scratch and recompiled on every use. So the task becomes devising a way to compile and preserve your EF queries between uses.
One way would be to rewrite your FilterQuery so it does not rely on a fluent conditional method chain. Instead of appending (or not appending) a new condition every time your parameter set changes, convert it into a single query in which each filter is either applied when its condition is met, or overridden by something like 1.Equals(1) when it is not. That way the query can be compiled and made available for re-use. The backing SQL will look funky, but execution times should improve (see the sketch at the end of this answer).
Alternatively, you could devise an Aspect Oriented Programming approach, where compiled queries would be re-used based on parameter values. If I have the time, I will post a sample on Code Project.
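A minimal sketch of that fixed-shape FilterQuery, assuming EF6 and the question's property names (the boolean locals are illustrative, not from the original post):

private IQueryable<MyTable> FilterQuery(IQueryable<MyTable> limitQuery)
{
    // Capture values in locals so EF parameterises them and the
    // expression tree keeps one cacheable shape from run to run.
    int key = MyTableKey;
    string name = Name;
    string description = Description;
    bool filterByKey = key > 0;
    bool filterByName = !string.IsNullOrEmpty(name);
    bool filterByDescription = !string.IsNullOrEmpty(description);
    bool filterByActive = Active;

    // Each predicate either applies or collapses to TRUE in the
    // generated SQL: funky-looking, but the translation can be re-used.
    return limitQuery.Where(C =>
        (!filterByKey || C.MyTableKey == key) &&
        (!filterByName || C.Name == name) &&
        (!filterByDescription || C.Description == description) &&
        (!filterByActive || C.Active));
}

The CreatedBy/ModifiedBy and date filters would follow the same pattern.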
Upvotes: 0
Reputation: 48314
Selects are slow with tracking on. You should definitely turn off tracking and measure again.
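In the question's DataSelect, that is a small change, assuming EF6's AsNoTracking extension (a sketch, not the poster's exact code):

// Query without change tracking: read-only loads skip the
// change-tracker bookkeeping that dominates repeated selects.
IQueryable<MyTable> limitquery = ctxt.MyTable.AsNoTracking();
limitquery = FilterQuery(limitquery);
var limit = limitquery.FirstOrDefault();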
Take a look at my benchmarks:
http://netpl.blogspot.com/2013/05/yet-another-orm-micro-benchmark-part-23_15.html
Upvotes: 1