Reputation: 61636
I have the following query:
-- CTE to remove outliers, e.g. remove the fastest and slowest results
;WITH MinMaxCTE AS
(
SELECT ServerName, CONVERT(VARCHAR(10), UpdatedOn, 101) AS [Date], Version,
MIN(JaguarStartupTime) AS MinStartTime, MAX(JaguarStartupTime) AS MaxStartTime
FROM dbo.MachineConfiguration (NOLOCK)
WHERE DomainLogin NOT IN (SELECT DomainLogin FROM dbo.SupportGroup)
GROUP BY ServerName, CONVERT(VARCHAR(10), UpdatedOn, 101), Version
)
SELECT AVG(mc.JaguarStartupTime) AS AverageTime
, COUNT(*) AS NumEntries
, mc.Version
FROM #Eligible mc (NOLOCK)
JOIN MinMaxCTE cte ON mc.ServerName = cte.ServerName
AND CONVERT(VARCHAR(10), mc.UpdatedOn, 101) = cte.[Date]
AND mc.Version = cte.Version
AND mc.JaguarStartupTime <> cte.MinStartTime
AND mc.JaguarStartupTime <> cte.MaxStartTime
GROUP BY mc.Version
ORDER BY Version DESC, AVG(mc.JaguarStartupTime) ASC
The definition of the #Eligible temp table is
create table #Eligible (
Version nvarchar(50), JaguarStartupTime int,
ServerName nvarchar(50), UpdatedOn datetime )
No matter what condition or aggregation I comment out, I always get the following error: Arithmetic overflow error converting expression to data type int.
Where can I go from here? How do I debug this further?
EDIT: Sample data
Version JaguarStartupTime ServerName UpdatedOn
6.4.6.082 16040 NewOrleansLA 2012-08-08 12:34:12.330
6.5.1.012 40390 BatonRougeLA 2012-08-08 18:33:17.440
6.5.1.012 48379 HonoluluHI 2012-08-09 04:42:50.453
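For anyone who wants to reproduce this, the sample rows can be loaded into the temp table roughly like this (just the three rows above typed into an INSERT; the multi-row VALUES form needs SQL Server 2008 or later):
INSERT INTO #Eligible (Version, JaguarStartupTime, ServerName, UpdatedOn)
VALUES (N'6.4.6.082', 16040, N'NewOrleansLA', '2012-08-08 12:34:12.330'),
       (N'6.5.1.012', 40390, N'BatonRougeLA', '2012-08-08 18:33:17.440'),
       (N'6.5.1.012', 48379, N'HonoluluHI', '2012-08-09 04:42:50.453')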
Upvotes: 5
Views: 6027
Reputation: 4188
Have you tried casting your JaguarStartupTime values to BIGINT inside your AVG aggregate, like so:
AVG(CAST(mc.JaguarStartupTime AS BIGINT))
This should sort out the arithmetic overflow.
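Applied to your query it would look roughly like this (a sketch; the only change is the cast inside the two AVG expressions):
SELECT AVG(CAST(mc.JaguarStartupTime AS BIGINT)) AS AverageTime
, COUNT(*) AS NumEntries
, mc.Version
FROM #Eligible mc (NOLOCK)
JOIN MinMaxCTE cte ON mc.ServerName = cte.ServerName
AND CONVERT(VARCHAR(10), mc.UpdatedOn, 101) = cte.[Date]
AND mc.Version = cte.Version
AND mc.JaguarStartupTime <> cte.MinStartTime
AND mc.JaguarStartupTime <> cte.MaxStartTime
GROUP BY mc.Version
ORDER BY Version DESC, AVG(CAST(mc.JaguarStartupTime AS BIGINT)) ASC
I cast in the ORDER BY as well, since that second AVG is evaluated over the plain INT column and could hit the same overflow on its own.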
To calculate the mean, the server needs to be able to sum all of the INTs first, so the data type you are averaging over needs to be able to store the sum of those values, even if the returned answer is within the range of an INT.
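You can see the same behaviour with a throwaway query (made-up values, each within INT range but summing past the INT maximum of 2,147,483,647):
-- Fails with the same overflow error: the intermediate SUM is done as INT
SELECT AVG(n) AS AvgInt
FROM (SELECT 2000000000 AS n UNION ALL SELECT 2000000000) AS t
-- Works: widening to BIGINT first lets the SUM fit, even though the
-- average itself (2000000000) would fit back into an INT
SELECT AVG(CAST(n AS BIGINT)) AS AvgBigInt
FROM (SELECT 2000000000 AS n UNION ALL SELECT 2000000000) AS t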
Upvotes: 5