Reputation: 13366
I know it's probably not enough to be worried about, but how performant is the DBNull.Value.Equals() check?
public IEnumerable<dynamic> Query(string sql, params object[] args)
{
    using (var conn = OpenConnection())
    {
        var rdr = CreateCommand(sql, conn, args).ExecuteReader(CommandBehavior.CloseConnection);
        while (rdr.Read())
        {
            var e = new ExpandoObject();
            var d = e as IDictionary<string, object>;
            for (var i = 0; i < rdr.FieldCount; i++)
                d.Add(rdr.GetName(i), DBNull.Value.Equals(rdr[i]) ? null : rdr[i]);
            yield return e;
        }
    }
}
in particular, this line:
d.Add(rdr.GetName(i), DBNull.Value.Equals(rdr[i]) ? null : rdr[i]);
versus the original code (from Rob Conery's Massive class):
d.Add(rdr.GetName(i), rdr[i]);
There's bound to be at least a small impact, and again it's probably not truly noticeable, but I'm curious. The reason for the conversion is that it's much easier to test for null in ASP.NET MVC views.
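For what it's worth, here is a rough micro-benchmark sketch of my own (the values array just stands in for rdr[i]; no real IDataReader is involved, and the class and constant names are made up for illustration). It isolates the extra check so you can see how small it is next to the real reader work:

using System;
using System.Diagnostics;

class DbNullCheckBenchmark
{
    static void Main()
    {
        // Fake column values standing in for rdr[i].
        object[] values = { 1, "text", DBNull.Value, 3.14, DBNull.Value };
        const int iterations = 10000000;

        object sink = null;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            object v = values[i % values.Length];
            // The check under discussion: one virtual Equals call plus a conditional.
            sink = DBNull.Value.Equals(v) ? null : v;
        }
        sw.Stop();

        Console.WriteLine("{0:N0} checks took {1} ms (last value null: {2})",
            iterations, sw.ElapsedMilliseconds, sink == null);
    }
}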
Upvotes: 2
Views: 2498
Reputation: 27105
If you look in .NET Reflector you can see that DBNull does not have any instance fields, and there is only ever one instance of it (exposed through the static Value field). Furthermore, the Equals method is not overridden in the DBNull class, so Object.Equals is called, which makes an external method call that checks for reference equality.
Conclusion: this call just compares two references, so the performance impact is not going to be an issue in any situation; it's roughly like comparing two integer values.
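A quick sketch of my own (not from the question's code) showing why: because DBNull is a singleton and Object.Equals falls back to reference equality, all of the following checks behave the same for a value coming out of a data reader:

using System;

class DbNullEquality
{
    static void Main()
    {
        object fromReader = DBNull.Value;   // what rdr[i] returns for a NULL column
        object regular = 42;                // any non-null value is just boxed as-is

        // Equivalent checks: Equals here is a reference comparison in disguise.
        Console.WriteLine(DBNull.Value.Equals(fromReader));           // True
        Console.WriteLine(ReferenceEquals(DBNull.Value, fromReader)); // True
        Console.WriteLine(fromReader is DBNull);                      // True

        Console.WriteLine(DBNull.Value.Equals(regular));              // False
    }
}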
Upvotes: 4