Reputation: 587
I have a web application running on Windows Azure. It is built with ASP.Net 4.0, MVC 3, Entity Framework 4.1, SQL Server 2008 using the Repository pattern.
The app was performing very well until recently. Most of our customers have a few hundred rows of data but one of them is starting to reach 2000+. This has dramatically slowed their page load times (15 - 20 secs on some pages).
We just started using MiniProfiler, which indicates we have a very chatty app, with duplicate SQL calls.
In an effort to give as much detail as possible and to figure out how we can do things better, I'll explain some of the things we are doing.
We have a base controller that has two protected objects (CurrentUser and CurrentCompany). We use these quite a bit in our Actions, but they hit the DB each time. So, I am assuming we need to store these objects in session the first time. Is there a big overhead in carrying these objects around? What about accessing their relationships later (CurrentCompany.Offices.First(), etc.)?
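One way to sketch the session-caching idea is a lazy getter in the base controller. This is a minimal sketch, assuming a hypothetical `ICompanyRepository` and `Company` type; the names are illustrative, not from the actual app:

```csharp
public class BaseController : Controller
{
    // Hypothetical repository abstraction; substitute whatever the app's
    // Repository pattern actually exposes.
    private readonly ICompanyRepository _companyRepository;

    protected Company CurrentCompany
    {
        get
        {
            // Check session first; hit the database only once per session.
            var company = Session["CurrentCompany"] as Company;
            if (company == null)
            {
                company = _companyRepository.GetByUserName(User.Identity.Name);
                Session["CurrentCompany"] = company;
            }
            return company;
        }
    }
}
```

One caveat: an entity cached in session is detached from its original ObjectContext, so lazy-loading navigation properties like `CurrentCompany.Offices` later will not work unless you re-attach it or cache only the scalar data you actually need.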
We get the data for each page based on these objects: selecting, ordering, filtering their relationships, for instance:
CurrentCompany.Employees.Where(r => r.StatusId == (int)Enums.Statuses.Active);
Here 'CurrentCompany.Employees' returns an EntityCollection, but the Where turns it into an IEnumerable. I have heard that IQueryable is the way to go?
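The difference matters here. Filtering an EntityCollection runs LINQ to Objects: EF loads every employee for the company and then filters in memory. Querying through the context keeps the expression an IQueryable, so the filter is translated into the SQL WHERE clause. A hedged sketch, assuming a DbContext/ObjectContext named `context` and an `Employees` set with a `CompanyId` column (names are assumptions):

```csharp
// 1) EntityCollection: materializes ALL of the company's employees,
//    then filters the list in memory.
var activeInMemory = CurrentCompany.Employees
    .Where(e => e.StatusId == (int)Enums.Statuses.Active)
    .ToList();

// 2) IQueryable against the context: the predicate is translated to SQL,
//    so only the active rows ever cross the wire.
var activeInSql = context.Employees
    .Where(e => e.CompanyId == CurrentCompany.Id
             && e.StatusId == (int)Enums.Statuses.Active)
    .ToList();
```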
I have also read that EF is great to get things up and running quickly but you must do some tweaking to make sure it performs well with lots of data. From what I have read, it will bring back the entire row even if you only asked for 2 columns?
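That is the default behavior: selecting an entity brings back every mapped column. You can avoid it by projecting into an anonymous type (or a DTO), in which case EF generates a SELECT containing only the listed columns. A small sketch, again assuming a `context.Employees` set (an assumption, not the app's real model):

```csharp
// Projection: the generated SQL selects only Id and Name,
// not the full employee row.
var names = context.Employees
    .Where(e => e.CompanyId == companyId)
    .Select(e => new { e.Id, e.Name })
    .ToList();
```

Note that projected results are not change-tracked entities, so this is best for read-only display pages, which are usually the slow ones anyway.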
So, with all that in mind, can someone point out what I should be doing to make this scale for bigger clients? 2000+ rows is not that much, after all. Should we be using views/stored procs?
There are plenty of resources out there explaining how to get EF set up with basic selects, etc., but nothing really on scaling it once the data set gets bigger.
Any help would be greatly appreciated. Brian
Upvotes: 4
Views: 2574
Reputation: 1
Based on my own experience, EF was very slow because of the LINQ layer. I had a personal web site (a home page that displays several kinds of data, so I had to issue multiple requests to the database server through LINQ), and it was very slow. I tried tweaking the EF .edmx with lazy-loading options, etc., but the only solution I found was to get rid of LINQ and rewrite the data access in ADO.NET. It takes more time to code, but the speed problems went away. LINQ queries must be translated to SQL, there are a lot of steps before they reach the database, and when the data comes back (SELECT *), EF translates it into a list of entity objects. Thank you.
Upvotes: 0
Reputation: 1724
Strangely enough, the clue might be in the 2000 rows. SQL Server changes the way it accesses data when a query touches more than about 0.1% of the dataset. You don't say whether you have appropriate indexes on the tables; http://www.sqlteam.com/article/sql-server-indexes-the-basics may help. If you run SQL Server Management Studio, there is also a missing-index feature: http://msdn.microsoft.com/en-us/library/ms345524.aspx
HTH
Also see http://blogs.msdn.com/b/sqlazure/archive/2010/08/19/10051969.aspx
Upvotes: 3
Reputation: 20792
How much eager loading are you doing? That was one thing we found to be a major performance hit. To solve it, we started caching the properties using the Entlib Caching Application Block and then merging them into the object from the cache instead of getting them from the DB.
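For the simpler cases, EF's own eager loading can also cut down the chattiness: `Include` pulls a navigation property in the same round trip instead of triggering a lazy-load query per access. A minimal sketch, assuming a context with a `Companies` set and an `Offices` navigation property (both names are assumptions):

```csharp
// One query joining in the related Offices, instead of a second
// lazy-load query when Offices is first touched.
var company = context.Companies
    .Include("Offices")
    .Single(c => c.Id == companyId);
```

In EF 4.1 you can also use the lambda overload (`.Include(c => c.Offices)`) by importing the System.Data.Entity namespace, which keeps the property name compile-checked.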
Upvotes: 1