Reputation: 1847
I am using LINQ to EF and trying to get the first matching entity from an ObjectSet with FirstOrDefault. The query looks like this:
Notification not = new Notification();
........
//not.SubTypeID = null;
var elem = ent.Notifications.FirstOrDefault(p =>
    p.ID == not.ID &&
    p.SubTypeID == not.SubTypeID &&
    p.Location == not.Location &&
    p.TypeID == ns.TypeID
);
Sometimes SubTypeID can be null, and in that case nothing is returned in elem, although p.SubTypeID and not.SubTypeID are both null. But the strange thing is that when I run this query:
var elem = ent.Notifications.FirstOrDefault(p =>
    p.ID == not.ID &&
    p.SubTypeID == null &&
    p.Location == not.Location &&
    p.TypeID == ns.TypeID
);
everything works as expected and I see data in elem. So what am I doing wrong, and what is the difference between these two queries when not.SubTypeID is null?
Upvotes: 0
Views: 755
Reputation: 169488
In C#, the result of null == null is true. But in SQL, which this expression tree is getting compiled into, the result of NULL = NULL is NULL. Example on PostgreSQL:
SELECT 1 WHERE NULL = NULL;
?column?
----------
(0 rows)
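For contrast, the same comparison evaluated in C# itself (a minimal sketch; it assumes SubTypeID maps to a nullable int, which is an assumption about your model):

int? a = null;
int? b = null;
Console.WriteLine(a == b);   // prints True: C#'s lifted == operator treats null == null as true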
You might consider using an expression like:
p.SubTypeID == not.SubTypeID ||
(p.SubTypeID == null && not.SubTypeID == null)
This is kind of a hack, and there may be a better solution than this, but the NULL = NULL test is likely the root of your problem.
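Applied to your original query, the workaround would look roughly like the sketch below (not tested against your model; it keeps your ns.TypeID reference as-is and assumes SubTypeID is the only nullable member you compare):

var elem = ent.Notifications.FirstOrDefault(p =>
    p.ID == not.ID &&
    // match when both sides are equal, or when both sides are null
    (p.SubTypeID == not.SubTypeID ||
     (p.SubTypeID == null && not.SubTypeID == null)) &&
    p.Location == not.Location &&
    p.TypeID == ns.TypeID
);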
Upvotes: 1