I have a LINQ to SQL query that behaves (in my opinion) very strangely when I check for null values.
There is a matching record in the DB, as shown by the last query, so why do the first two queries not return it?
// Check: the value really is null
Console.WriteLine(record.SomeID == null ? "Yes" : "No"); // This prints Yes
// LINQ
var q = (from t in mySQLTable
         where t.PKID == record.PKID &&
               t.SomeID == record.SomeID
         select t).FirstOrDefault();
// This prints nothing, i.e. no record found
var q2 = (from t in mySQLTable
          where t.PKID == record.PKID &&
                t.SomeID == (record.SomeID == null ? null : record.SomeID)
          select t).FirstOrDefault();
// This also prints nothing, i.e. no record found
var q3 = (from t in mySQLTable
          where t.PKID == record.PKID &&
                t.SomeID == null
          select t).FirstOrDefault();
// This prints 1 record
You can work around this with the following query:
bool isNull = record.SomeID == null;

var q = (from t in mySQLTable
         where t.PKID == record.PKID &&
               ((isNull && t.SomeID == null) ||
                (!isNull && t.SomeID == record.SomeID))
         select t).FirstOrDefault();
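This works because LINQ to SQL translates a literal null comparison (t.SomeID == null) into SQL's IS NULL, while a comparison against a variable becomes a parameterized SomeID = @p. Under SQL's three-valued logic, SomeID = NULL is never true, which is why q and q2 in the question match nothing: the ternary in q2 still evaluates to null, so the generated SQL is the same. Only q3, with its literal null, produces IS NULL and finds the record. Hoisting the null check into a bool lets each branch be translated the right way.

For reference, here is the same workaround in method syntax; a minimal sketch that assumes the same mySQLTable and record as above:

bool isNull = record.SomeID == null;

var q = mySQLTable
    .Where(t => t.PKID == record.PKID &&
                ((isNull && t.SomeID == null) ||            // translated using IS NULL
                 (!isNull && t.SomeID == record.SomeID)))   // translated as SomeID = @p
    .FirstOrDefault();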