Reputation: 71
Let's assume we have a table as below. AlbumID and CoverType are integers with 0 as the default value. In addition to AlbumID and CoverType there are 100+ other columns.
AlbumID  CoverType
1        0
2        0
3        1
4        8
6        0
In the example below, no row with AlbumId equal to 400 exists.
var coverType =
(
    from a in Albums
    where a.AlbumId == 400
    select a.CoverType
).FirstOrDefault();
As you can see, I am selecting only one column, and therefore var coverType is equal to zero. How can I check whether this is because CoverType is really zero or because the row was not found? I need to distinguish between those two situations.
I know I can do the following:
var coverType =
(
    from a in Albums
    where a.AlbumId == 400
    select a
).FirstOrDefault();
if (coverType == null)
{
}
but I would like to select only one column (for performance) and not the whole row (100+ columns). The only solution I can think of at the moment is to select at least two columns and check whether the result is null or not (see the sketch below). However, I would like to know whether this is also possible with only one column selected.
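For clarity, this is roughly what I mean by the two-column workaround (the anonymous type and the choice of columns are just a sketch based on the table above):

var result =
(
    from a in Albums
    where a.AlbumId == 400
    select new { a.AlbumId, a.CoverType }
).FirstOrDefault();
if (result != null)
{
    // result is null only when the row was not found
    var coverType = result.CoverType;
}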
Upvotes: 1
Views: 22
Reputation: 156624
FirstOrDefault() will give you the default value for the type you're selecting when no value is present. Since your CoverType column isn't nullable, that default value is 0. You could cast it to a nullable value instead:
var coverType =
(
    from a in Albums
    where a.AlbumId == 400
    select (int?)a.CoverType
).FirstOrDefault();
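Then the two cases can be told apart with a simple null check (a quick sketch):

if (coverType == null)
{
    // no row with AlbumId == 400 was found
}
else
{
    // the row exists; coverType.Value is the real CoverType, even if it is 0
}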
Or you could select an object that has only this one field as a property.
var result =
(
    from a in Albums
    where a.AlbumId == 400
    select new { a.CoverType }
).FirstOrDefault();
if (result != null)
{
    var coverType = result.CoverType;
    ...
}
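The same query in method syntax, if you prefer it (just a sketch; Albums and the column names are as in the question):

var result = Albums
    .Where(a => a.AlbumId == 400)
    .Select(a => new { a.CoverType })   // only this column is fetched
    .FirstOrDefault();                  // null when no matching row exists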
Upvotes: 1