Reputation: 7173
I am new to LINQ and am trying to create some data points from a table to graph. The three fields of importance in this table are the id, the time and the value. I am writing a query to get the average value over a set time for a chosen id. The LINQ I have written follows:
var value = (from t in _table
             where t.Id == id
                && t.Time >= intervalStartTime
                && t.Time <= intervalEndTime
             select t.Value).Average();
However this crashes at runtime with:
"The null value cannot be assigned to a member with type System.Decimal which is a non-nullable value type.."
At certain intervals there is no data, so the SQL that LINQ generates returns null, which I would like to be coalesced to 0; instead it crashes the application. Is there a way to write this LINQ query so that it handles this properly?
The table definition to make things clearer:
[Serializable]
[Table(Name = "ExampleTable")]
public class ExampleTable
{
    [Column(Name = "Id")]
    public int Id { get; set; }

    [Column(Name = "Time")]
    public DateTime Time { get; set; }

    [Column(Name = "Value")]
    public int Value { get; set; }
}
Upvotes: 17
Views: 17476
Reputation: 1500065
Okay, how about this:
var value = (from t in _table
             where t.Id == id
                && t.Time >= intervalStartTime
                && t.Time <= intervalEndTime
             select t.Value).DefaultIfEmpty().Average();
I believe that's logically what you want: it changes {} to {0}, so the average is always computable. I don't know whether it will produce the SQL you want, though.
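For what it's worth, here is a minimal LINQ to Objects sketch (an empty sequence standing in for an interval with no matching rows; names are hypothetical) showing what DefaultIfEmpty() does to the average:

using System;
using System.Linq;

class DefaultIfEmptyDemo
{
    static void Main()
    {
        // Hypothetical: no rows matched the id/interval filter.
        var noMatches = Enumerable.Empty<int>();

        // DefaultIfEmpty() turns the empty sequence {} into {0},
        // so Average() always has at least one element to work on.
        double average = noMatches.DefaultIfEmpty().Average();

        Console.WriteLine(average); // 0
    }
}

Whether LINQ to SQL translates DefaultIfEmpty() into the SQL you expect is a separate question, as noted above.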
Upvotes: 18
Reputation: 15515
I think you want
var value = (from t in _table
             where t.Id == id
                && t.Time >= intervalStartTime
                && t.Time <= intervalEndTime
             select (int?)t.Value).Average();
This way, you get a double? back, whereas without the (int?) cast you get a double back, which cannot be null.
This is because of the signatures:

double Enumerable.Average(IEnumerable<int> source)
double? Enumerable.Average(IEnumerable<int?> source)
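To illustrate the behavioral difference, a small LINQ to Objects sketch (the empty array is a hypothetical stand-in for an interval with no data):

using System;
using System.Linq;

class AverageOverloadsDemo
{
    static void Main()
    {
        // Hypothetical: no rows matched the id/interval filter.
        var noMatches = new int[0];

        // Average(IEnumerable<int>) returns double and throws on an empty source;
        // Average(IEnumerable<int?>) returns double? and yields null instead.
        double? average = noMatches.Select(v => (int?)v).Average();

        Console.WriteLine(average == null ? "null" : average.ToString()); // null
    }
}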
Now, to get an average of 0 instead of null, you need to place the coalescing operator at the end:

var value = (from t in _table
             where t.Id == id
                && t.Time >= intervalStartTime
                && t.Time <= intervalEndTime
             select (int?)t.Value).Average() ?? 0.0;
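As a quick sanity check of the whole pattern, the same thing in LINQ to Objects (again with a hypothetical empty data set):

using System;
using System.Linq;

class CoalescedAverageDemo
{
    static void Main()
    {
        // Hypothetical: no rows matched the id/interval filter.
        var noMatches = new int[0];

        // Cast to int? so Average returns double?, then coalesce null to 0.0.
        double value = noMatches.Select(v => (int?)v).Average() ?? 0.0;

        Console.WriteLine(value); // 0
    }
}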
IMHO this is a pretty awful design of the Enumerable/Queryable classes; why can't Average(IEnumerable<int>) return double?, and why only for Average(IEnumerable<int?>)?
Upvotes: 22
Reputation: 2862
Could you use a temp for the initial query?
E.g.:

var temp = (from t in _table
            where t.Id == id
               && t.Time >= intervalStartTime
               && t.Time <= intervalEndTime
            select t.Value) ?? new List<int>() { 0 };
var value = temp.Average();
Not sure if this helps.
Upvotes: 0
Reputation: 147260
Try the following. It will simply skip all the null items returned by the query.
var value = (from t in _table
             where t != null
             where t.Id == id
                && t.Time >= intervalStartTime
                && t.Time <= intervalEndTime
             select t.Value).Average();
If you want to explicitly treat the null items as zero, then a simple use of the conditional operator should do the job:
var value = (from t in _table
             where t == null ||
                   (t.Id == id
                    && t.Time >= intervalStartTime
                    && t.Time <= intervalEndTime)
             select t == null ? 0 : t.Value).Average();
Upvotes: 0
Reputation: 10561
EDIT: Total Rework
Try casting the value to nullable first:

var value = (from t in _table
             where t.Id == id
                && t.Time >= intervalStartTime
                && t.Time <= intervalEndTime
             select ((int?)t.Value) ?? 0).Average();
Upvotes: 1