Chris Rice

Reputation: 819

Best practices for implementing Entity Framework in an existing application with a medium-sized database (~100 tables, ~300 stored procedures)

I'm working on extending/converting an existing ASP.NET application to EF 4.0/5.0, and I can't find a consensus online on how best to lay out the design. My current solution has over a million lines of code and uses a custom data framework that has become unwieldy. The database has around 100 tables and around 300 stored procedures; some tables hold up to 10 million rows, though only a fraction of that data is used through entities. A typical page load retrieves an entity from the database and presents it in a grid; the user edits the grid, and a save then retrieves the entity again and writes the changes back. Most of the data does not change often, but it may be in use by multiple people at the same time. Stored procedures handle paging operations on the tables. As things stand, my DB server is almost overwhelmed while my web servers have almost no load on them.

So my question is how best to set up the new EF model to match the needs of my application going forward. So far I've seen the following methodologies:

  1. Create one EDMX that contains every entity / stored procedure you will need.
  2. Split up the EDMX so that each one has a specific subset of tables / stored procedures.

The benefit I see in #1 is that linking data between entities is easy and entities can be cached effectively database-wide. The con is the cost of initially loading the model. Is that cost going to be high? If I instantiate the entire model only to use one entity for 3-5 queries, will that be a big issue?

#2 seems to solve some of the problems of #1, but it has its own issue: I may have to instantiate 2-3 models to get a job done, since I don't want to repeat entities across models. Is this worth it? A rough sketch of what I mean by #2 is below.
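To make #2 concrete, here is roughly what I have in mind, written in EF5 DbContext code-first style for brevity (all names here are made up; with a database-first EDMX the designer would generate the contexts and entities instead):

using System.Data.Entity;

// Hypothetical entities; with an EDMX the designer generates these.
public class Invoice { public int Id { get; set; } public decimal Total { get; set; } }
public class Payment { public int Id { get; set; } public int InvoiceId { get; set; } }

// Two smaller contexts sharing one connection string, each mapping
// only the tables its component actually needs.
public class BillingContext : DbContext
{
    public BillingContext() : base("name=AppConnection") { }
    public DbSet<Invoice> Invoices { get; set; }
    public DbSet<Payment> Payments { get; set; }
}

public class SearchContext : DbContext
{
    public SearchContext() : base("name=AppConnection") { }
    public DbSet<Invoice> Invoices { get; set; } // entity repeated across models
}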

As for how to use these models, I've seen the following approaches:

  1. Every time you need an entity, instantiate the context in a using block and save or discard the changes at the end of the block.
  2. Keep the context in session, instantiate it only once, and reuse it whenever you need it.

I can see both of these having benefits and issues. Mainly, #1 means the data will always be in sync with the DB, but caching will be almost nonexistent. #2 lets the application keep almost an in-memory copy of the DB, but it has the potential to go out of sync. A sketch of what I mean by #1 follows.
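For reference, this is the unit-of-work shape I mean by #1 (the context, entity set, and property names are placeholders for whatever the model generates):

using System.Linq;

// Option #1: a fresh context per unit of work. "MyModelContext",
// "Orders", and the properties are hypothetical placeholders.
public void MarkShipped(int orderId)
{
    using (var db = new MyModelContext())
    {
        var order = db.Orders.Single(o => o.Id == orderId); // re-query the row
        order.Shipped = true;                               // edit
        db.SaveChanges();                                   // persist
    } // context disposed here; nothing is cached across requests
}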

The way I've been proceeding is to create separate models for each component and use and dispose them immediately with using blocks. I also have all the models share the same namespace. A typical block of code goes like this:

using (CommonType common = new CommonType())
{
    using (mytype type = new mytype())
    {
        // execute a query on the mytype model
        // manipulate the results using the CommonType model
    }
}

So what should I do? Am I completely off base here with my assumptions?

A couple of resources I've been using to come up with this information:

http://msdn.microsoft.com/en-us/data/hh949853
https://blogs.msdn.com/b/adonet/archive/2008/11/24/working-with-large-models-in-entity-framework-part-1.aspx?Redirected=true

Upvotes: 0

Views: 742

Answers (1)

Rob G

Reputation: 3526

With lazy loading enabled, it shouldn't take too many resources to build an entity from a unified model, use the few bits you need, and discard the rest when you're done using the entity.
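For example, something like this only materializes what you actually touch (the context and entity names are hypothetical, and lazy loading requires the navigation properties to be declared virtual):

using System.Linq;

// With lazy loading enabled (the default for an EF5 DbContext), only
// what you touch is queried; the rest of the unified model is just
// metadata. All names here are hypothetical.
int customerId = 42; // example key
using (var db = new UnifiedModelContext())
{
    var customer = db.Customers.Find(customerId);       // loads one row
    int open = customer.Orders.Count(o => !o.Shipped);  // Orders loads on first access
    // none of the other entity sets in the model hit the database
}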

If your entities are particularly simple compared to the backing schema, you could use some SQL views to simplify the entities created by EF, or you could prune them down in the designer yourself; either way should give you the performance you need. A sketch of the view approach follows.
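If you go the view route, the mapping can be as small as pointing a slim entity at the view. Code-first attribute syntax is shown here for brevity (with an EDMX you would import the view in the designer instead), and the view and column names are made up:

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

// Hypothetical slim entity mapped to a SQL view instead of the wide
// backing tables; EF treats it like a read-mostly table.
[Table("vw_OrderSummary")]
public class OrderSummary
{
    [Key]
    public int OrderId { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}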

Upvotes: 1
