Reputation: 8919
I want to write a T-SQL function that converts a SQL Server datetime value into the number of milliseconds between the input date and 01 JANUARY 1970 00:00:00, the epoch used by the JavaScript Date type.
create function jDate
(@indate datetime)
returns numeric
as
begin
    declare @datediff int;
    declare @javascriptDay1 DateTime;
    set @javascriptDay1 = cast('19700101 00:00:00.000' as DateTime);
    set @datediff = DateDiff(ms, @javascriptDay1, @indate);
    return (@datediff);
end
Thus, if I feed the function 01 JAN 1970, it should return 0, which it does:
declare @indate datetime
set @indate = cast('19700101 00:00:00.000' as datetime)
select dbo.jDate(@indate)
0
If I feed it 02 JAN 1970 00:00:00.000, it should return the number of milliseconds in one day, 86400000, which it does:
declare @indate datetime
set @indate = cast('19700102 00:00:00.000' as datetime)
select dbo.jDate(@indate)
If I feed it 31 DEC 1969 23:59:59, it should return -1000 (one second's worth of milliseconds, negative because that date precedes the epoch), which it does.
declare @indate datetime
set @indate = cast('19691231 23:59:59.000' as datetime)
select dbo.jDate(@indate)
But if I feed it 01 JAN 1970 00:00:00.088 (i.e. only a few milliseconds after the epoch), it should return the number of milliseconds, 88. But it returns 86.
declare @indate datetime
set @indate = cast('19700101 00:00:00.088' as datetime)
select dbo.jDate(@indate)
What causes the two-millisecond error?
Upvotes: 0
Views: 382
Reputation: 38130
Per the documentation, values of the DATETIME
data type are rounded to increments of .000, .003, or .007 seconds, so its accuracy is only about 3.33 milliseconds. Your input of .088 cannot be stored exactly, which is where the two-millisecond discrepancy comes from.
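You can verify the rounding yourself with a quick check (separate from your function): casting the .088 literal to datetime should come back displayed as .087.
declare @d datetime
set @d = cast('19700101 00:00:00.088' as datetime)
select @d
1970-01-01 00:00:00.087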
For higher precision, consider the DATETIME2
data type.
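As a minimal sketch of how that could look (assuming SQL Server 2016 or later, since it uses datediff_big; the name jDate2 is just illustrative), a datetime2-based version of your function preserves the .088 and also avoids the int overflow that datediff(ms, ...) hits for dates more than roughly 24 days from the epoch:
create function jDate2
(@indate datetime2)
returns bigint
as
begin
    -- datetime2 keeps millisecond (and finer) precision, so .088 is stored as .088
    declare @javascriptDay1 datetime2;
    set @javascriptDay1 = cast('19700101 00:00:00.000' as datetime2);
    -- datediff_big returns bigint, so large date ranges do not overflow
    return datediff_big(ms, @javascriptDay1, @indate);
end
Feeding it the problem value from the question should then return 88:
declare @indate datetime2
set @indate = cast('19700101 00:00:00.088' as datetime2)
select dbo.jDate2(@indate)
88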
Upvotes: 1