Reputation: 13
I am wondering about the behavioral difference that can be seen in the output for Test 2 in the example below. I'm on .NET Core 3.1 for this scenario.
using System;

public class Program
{
    public static void Main()
    {
        var a = new Customer();

        if (a.BirthDate == default)
            Console.WriteLine("Test 1 Result: default");
        else
            Console.WriteLine("Test 1 Result: NOT default");

        if (a?.BirthDate == default)
            Console.WriteLine("Test 2 Result: default");
        else
            Console.WriteLine("Test 2 Result: NOT default");

        if (a.BirthDate == default(DateTime))
            Console.WriteLine("Test 3 Result: default");
        else
            Console.WriteLine("Test 3 Result: NOT default");

        if (a?.BirthDate == default(DateTime))
            Console.WriteLine("Test 4 Result: default");
        else
            Console.WriteLine("Test 4 Result: NOT default");
    }
}

public class Customer
{
    public DateTime BirthDate { get; set; }
}
Output:
Test 1 Result: default
Test 2 Result: NOT default
Test 3 Result: default
Test 4 Result: default
I was expecting all the outputs would be "default".
Upvotes: 1
Views: 120
Reputation: 27011
The type of the default literal is inferred by the compiler. In the second test, the inferred type is Nullable<DateTime>, so the comparison is equivalent to
    a?.BirthDate == default(Nullable<DateTime>)
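As a quick check of that equivalence (a sketch reusing the question's Customer class), all three of these comparisons print False:

    var a = new Customer();
    Console.WriteLine(a?.BirthDate == default);                     // False
    Console.WriteLine(a?.BirthDate == default(Nullable<DateTime>)); // False (the same comparison spelled out)
    Console.WriteLine(a?.BirthDate == null);                        // False (what it ultimately boils down to)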
The default value of a Nullable<T> type is null, which you can check with the following code:
    Console.WriteLine(default(Nullable<DateTime>) == null); // true
Because the left operand holds an actual DateTime value (DateTime.MinValue, not null), comparing it to null yields false, which is why Test 2 prints "NOT default".
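A minimal sketch of the lifted comparison (assuming the same Customer class) shows both the null comparison and the value comparison that Tests 3 and 4 rely on:

    DateTime? left = new Customer().BirthDate;    // holds DateTime.MinValue, so not null
    Console.WriteLine(left == null);              // False: a value never equals null
    Console.WriteLine(left == default(DateTime)); // True: both sides are 01/01/0001 00:00:00

Spelling out default(DateTime), as Tests 3 and 4 do, keeps the literal from being inferred as a null Nullable<DateTime>.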
Upvotes: 3