Reputation: 3539
I'm running a query which returns decimals, then inserting those decimals into a table variable. When I query the table variable I get integers. Can anyone see why this is happening and how to correct it? I'm using SQL Server 2000.
DECLARE @Nov Table(custCode varchar(10), PromiseAvg decimal, ShipAvg decimal )
INSERT INTO @Nov
SELECT JM.CustomerCode
, isnull(AVG(Cast(DATEDIFF(dd, ISNULL(startDate, 0), ISNULL(PromiseDate, 0)) As Decimal)),0) As PromiseAvg
, isnull(AVG(Cast(DATEDIFF(dd, ISNULL(startDate, 0), ISNULL(BOL.ShipDate, 0)) As Decimal)),0) As ShipAvg
from jobitems JI
LEFT JOIN jobmaster JM ON JI.Job_Number = JM.JobNumber
LEFT JOIN dbo.NCL_BOLDetails BD ON JM.JobNumber = BD.JobNumber AND JI.Item_Code = BD.ItemNumber
INNER JOIN dbo.NCL_BOLs BOL ON BD.BOLID = BOL.BOLID
WHERE StartDate BETWEEN '20091101' AND '20091130'
Group By JM.CustomerCode, JM.CustLongName
Order By JM.CustomerCode
Select * from @Nov
Upvotes: 4
Views: 1177
Reputation: 453348
You need to define a precision and scale for your decimals in the table variable.
e.g.
DECLARE @Nov TABLE(
custCode VARCHAR(10),
PromiseAvg decimal(18,8),
ShipAvg decimal(18,8) )
or whatever precision and scale are appropriate for you. I just tested on SQL Server 2008 and it looks as though decimal(18,0) is the default when none is specified in a CREATE TABLE statement.
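A minimal repro of that default (the table variable and values here are illustrative, not from the question): converting to a bare DECIMAL behaves like DECIMAL(18,0), so fractional values are rounded to whole numbers.

```sql
-- Bare DECIMAL defaults to DECIMAL(18,0): scale 0, so fractions are rounded away.
DECLARE @t TABLE (d decimal, d2 decimal(18,8));
INSERT INTO @t VALUES (1.5, 1.5);
SELECT d, d2 FROM @t;
-- d comes back as 2 (rounded to scale 0); d2 keeps 1.50000000
```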
Also do the same with your CASTs:
...
isnull(AVG(Cast(DATEDIFF(dd, ISNULL(startDate, 0), ISNULL(PromiseDate, 0))
As decimal(18,8))),0) As PromiseAvg,
...
Upvotes: 6