Reputation: 86729
I have high precision dates stored in an SQL server, e.g.
2009-09-15 19:43:43.910
However, when I convert that value into a DateTime the milliseconds value of the resulting DateTime value is 0:
reader["Timestamp"] = 15/09/2009 19:43:43.000
Having these DateTime values in precision down to milliseconds is very important to me - what is the best way of doing this?
UPDATE: This is the code that performs the conversion:
DateTime myDate = (DateTime)reader["Timestamp"];
There is nothing special about the SELECT statement; in fact, it is a SELECT * with no fancy casts or anything.
It appears that the DateTime object returned by the SqlDataReader simply is not populated with the millisecond value.
Upvotes: 17
Views: 33727
Reputation: 1581
I had the same problem, and I solved it by persisting the C# DateTime as a SQL bigint populated with DateTime.Ticks. This preserves the full DateTime precision, and the value can, of course, be deserialized with the DateTime(long ticks) constructor.
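A minimal sketch of that round trip (no database involved, just the conversion itself; the sample value is taken from the question):

```csharp
using System;

class TicksRoundTrip
{
    static void Main()
    {
        // The high-precision value from the question.
        DateTime original = new DateTime(2009, 9, 15, 19, 43, 43, 910);

        // Persist to a SQL bigint column: store DateTime.Ticks.
        long persisted = original.Ticks;

        // De-serialize with the DateTime(long ticks) constructor.
        DateTime restored = new DateTime(persisted);

        Console.WriteLine(restored.ToString("yyyy-MM-dd HH:mm:ss.fff"));
        // Prints: 2009-09-15 19:43:43.910
    }
}
```

Since ticks are 100-nanosecond units, this keeps even sub-millisecond precision, at the cost of the column no longer being human-readable or queryable as a date on the SQL side.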
Upvotes: 0
Reputation: 281
I had this same problem, and after some reading it turns out that when you retrieve the date the way you were:
DateTime myDate = (DateTime)reader["Timestamp"];
the SqlDataReader drops the milliseconds. However, if you use the GetDateTime method of the SqlDataReader, it returns a DateTime object that preserves the milliseconds:
reader.GetDateTime(reader.GetOrdinal("Timestamp"));
Upvotes: 20
Reputation: 46683
Here's how I'd try to troubleshoot this:
1. Step through in the debugger and look at the type and value of reader["Timestamp"]. If the type is not SqlDateTime, that would make me suspicious-- I'd then look at the query to figure out why that column was returning another type instead of DATETIME (or DATETIME2, etc.)
2. If that value is SqlDateTime and it does contain milliseconds, then I'd look at the cast as the source of the problem. To validate this, I'd try (in the debugger or code) SqlDataReader.GetDateTime() and SqlDataReader.GetSqlDateTime() to see if either returns the correct result. This admittedly seems really unlikely as the source of the problem-- casting should work fine.
3. If that value from #1 is SqlDateTime but contains no milliseconds, then I'd look for an upstream problem in the database-- in other words, your query is returning something without milliseconds. When you execute the exact same query in Management Studio, do you see milliseconds?
My guess is this is a query-related issue. But I am intrigued to find out more.
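One side note on #2: SqlDateTime (and SQL Server's datetime type behind it) is only accurate to 1/300 of a second, so millisecond values land on .000/.003/.007 boundaries. A DB-free sketch of checking that (the sample value is assumed, not from your data):

```csharp
using System;
using System.Data.SqlTypes;

class SqlDateTimePrecisionCheck
{
    static void Main()
    {
        // SqlDateTime models SQL Server datetime precision:
        // 1/300 of a second, i.e. .000/.003/.007 millisecond steps.
        DateTime dt = new DateTime(2009, 9, 15, 19, 43, 43, 910);
        SqlDateTime sql = new SqlDateTime(dt);

        // .910 happens to be exactly representable (273/300 of a second),
        // so it survives; a value like .911 would get nudged by rounding.
        Console.WriteLine(sql.Value.ToString("yyyy-MM-dd HH:mm:ss.fff"));
    }
}
```

So even when everything in the query is right, a few milliseconds of drift can be the datetime type itself rather than the reader.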
Upvotes: -1
Reputation: 28325
That is because the default format string for DateTime doesn't include milliseconds.
If you use a custom format, you will see the milliseconds value.
An example:
using System;
using System.Data.SqlClient;

public class Program
{
    private static string connString = @"Data Source=(local);Initial Catalog=DBTest;Integrated Security=SSPI;";

    public static void Main(string[] args)
    {
        using (SqlConnection conn = new SqlConnection(connString))
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand("SELECT * FROM MilliSeconds", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    DateTime dt = (DateTime)reader["TimeCollected"];
                    // dt.Millisecond shows the value is present; it just
                    // isn't printed by DateTime's default format string.
                    Console.WriteLine(dt.ToString("yyyy-MM-dd HH:mm:ss.fff"));
                }
            }
        }
        Console.ReadLine();
    }
}
From a database with these values:
1 2009-09-22 18:11:12.057
2 2009-09-22 18:11:28.587
3 2009-09-22 18:11:29.820
The resulting output from the code above is:
2009-09-22 18:11:12.057
2009-09-22 18:11:28.587
2009-09-22 18:11:29.820
Upvotes: 6
Reputation: 10620
Maybe this (Difference between DateTime in C# and DateTime in SQL Server) will help a little.
Upvotes: 6