Reputation: 3611
I have a table in SQL Server which contains 3 columns: the 1st is a DATETIME, and the 2nd and 3rd are both FLOAT. I'm trying to calculate the mean of the difference between columns 2 and 3 over a specific time interval, e.g. every 15 minutes.
I managed to find the following example
SELECT * FROM dbo.data
WHERE CAST(DateTime AS TIME) BETWEEN '14:00' and '14:30'
but I'm unsure how to calculate the difference between the 2nd and 3rd columns from the results of the above query. I'd prefer to do this in T-SQL, unless someone believes performing it in C# would be more appropriate.
Thanks
EDIT
2010-01-04 14:30:00.0000000 1.44377 1.44386
2010-01-04 14:30:00.0000000 1.4438 1.44389
2010-01-04 14:30:00.0000000 1.44377 1.44386
2010-01-05 14:00:01.0000000 1.44258 1.44267
The results I'd like would be as follows:
0.00009 i.e. 1.44386 - 1.44377
0.00009 1.44389 - 1.4438
0.00009 1.44386 - 1.44377
0.00009 1.44267 - 1.44258
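In other words, for each row I want the 3rd column minus the 2nd, and then the mean of those differences per 15 minute interval. I imagine the per-row part looks something like the following, but I'm not sure how to get from there to the interval mean (the column names here are just guesses, mine differ):
SELECT [Col3] - [Col2] AS Diff
FROM dbo.data
WHERE CAST([DateTime] AS TIME) BETWEEN '14:00' AND '14:30'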
Upvotes: 0
Views: 1061
Reputation: 1832
If you just cast the date column as TIME, the date part is discarded, so rows from multiple dates will match the same time filter, which I don't think is your intent (the two casts below illustrate what I mean: both return the same TIME value even though the dates differ):
DECLARE @Date DATETIME

SELECT @Date = '05/15/2012 14:15'
SELECT CAST(@Date AS TIME)  -- returns 14:15:00

SELECT @Date = '04/15/2012 14:15'
SELECT CAST(@Date AS TIME)  -- also returns 14:15:00; the date is gone
I think the query you want is the following:
SELECT
    DATEADD(MINUTE, 15 * (DATEDIFF(MINUTE, 0, [Date]) / 15), 0) AS Interval, -- Assuming [Date] is the DATETIME column, this rounds it down to the start of its 15 minute interval
    AVG([Col3] - [Col2]) AS MeanDiff -- 3rd column minus 2nd, to match your example output
FROM
    dbo.data
GROUP BY
    DATEADD(MINUTE, 15 * (DATEDIFF(MINUTE, 0, [Date]) / 15), 0)
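To see what the DATEADD/DATEDIFF expression is doing, you can run it on a single value: DATEDIFF counts whole minutes since the zero date (1900-01-01), the integer division by 15 rounds that count down to a multiple of 15, and DATEADD turns it back into a DATETIME. A quick check with a made-up timestamp:
DECLARE @d DATETIME
SELECT @d = '2010-01-04 14:37:00'
SELECT DATEADD(MINUTE, 15 * (DATEDIFF(MINUTE, 0, @d) / 15), 0)  -- returns 2010-01-04 14:30:00.000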
If you want to search a specific time frame, just force it in your WHERE clause:
SELECT
    AVG([Col3] - [Col2]) AS MeanDiff
FROM
    dbo.data
WHERE
    [Date] BETWEEN '05/15/2012 14:00' AND '05/15/2012 14:30'
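If you need both at once, i.e. the mean per 15 minute interval restricted to a window, the two queries above can be combined; roughly (again assuming the column names [Date], [Col2] and [Col3]):
SELECT
    DATEADD(MINUTE, 15 * (DATEDIFF(MINUTE, 0, [Date]) / 15), 0) AS Interval,
    AVG([Col3] - [Col2]) AS MeanDiff
FROM
    dbo.data
WHERE
    [Date] BETWEEN '05/15/2012 14:00' AND '05/15/2012 14:30'
GROUP BY
    DATEADD(MINUTE, 15 * (DATEDIFF(MINUTE, 0, [Date]) / 15), 0)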
Upvotes: 1