Reputation: 788
I became aware of a performance difference with parameterized queries generated by Entity Framework against an Azure SQL database.
I have a table with a varchar(30) primary key, and when I try to get a row from that table by its primary key, Entity Framework creates a parameterized query with an NVARCHAR(4000) parameter:
DECLARE @p__linq__0 nvarchar(4000)
SET @p__linq__0 = 'ehenurqp0kpql76kjsw3'
SELECT * FROM mediacontentreferences WHERE mediacontentreferenceid = @p__linq__0
This generates a very strange execution plan on the Azure SQL database.
It is quite inefficient, since the plan performs an Index Scan that reads through the whole table instead of a seek.
If I use a parameter with the correct datatype, the execution plan uses the primary key index.
DECLARE @p__linq__1 varchar(30)
SET @p__linq__1 = 'ehenurqp0kpql76kjsw3'
SELECT * FROM mediacontentreferences WHERE mediacontentreferenceid = @p__linq__1
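For comparison, the same correctly typed parameter can be produced from client code with plain ADO.NET by setting the SqlDbType and size explicitly. This is a minimal sketch: the connection string is a placeholder, and the table and column names are taken from the question.

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: send a varchar(30) parameter that matches the column type,
// so the optimizer can seek on the primary key index.
using (var conn = new SqlConnection("<your-connection-string>"))
using (var cmd = new SqlCommand(
    "SELECT * FROM mediacontentreferences WHERE mediacontentreferenceid = @id",
    conn))
{
    // Explicit SqlDbType.VarChar with Size = 30 avoids the implicit
    // nvarchar-to-varchar conversion on the column side.
    cmd.Parameters.Add("@id", SqlDbType.VarChar, 30).Value = "ehenurqp0kpql76kjsw3";

    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Process the row...
        }
    }
}
```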
If I use the first query on an on-premises SQL server, the execution plan converts the NVARCHAR(4000) to a VARCHAR(30) and uses the primary key index.
This looks like a flaw in the execution plan calculation on the Azure SQL server.
Is there a way to change how Entity Framework creates these queries?
I have read a couple of articles on this, but found no solution there:
Why does code first/EF use 'nvarchar(4000)' for strings in the raw SQL command?
Why does Entity Framework generate large parameters? How can they be reduced?
Upvotes: 3
Views: 768
Reputation: 4408
You must specify the right data type for your column; then EF will declare parameters with the right data type when generating queries. Just add this to your DbContext:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<mediacontentreferences>()
        .Property(e => e.mediacontentreferenceid)
        .HasColumnType("varchar")
        .HasMaxLength(30); // matches the varchar(30) column
}
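If you prefer data annotations over the fluent API, the same mapping can be expressed on the entity class itself. A sketch, with the class and property names taken from the question:

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class mediacontentreferences
{
    // Mapping the key as varchar(30) makes EF emit varchar parameters
    // instead of nvarchar(4000).
    [Key]
    [Column(TypeName = "varchar")]
    [MaxLength(30)]
    public string mediacontentreferenceid { get; set; }
}
```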
Upvotes: 2