lnjblue

Reputation: 150

SQL loses decimals after multiplication

I have a table which includes:

COUNT    RISK
35       0.6456000000
11       0.5234000000
4        0.8431000000

I need a column that multiplies the two columns together. However, I'm getting this result:

TOTAL
35
11
4  

COUNT - INT

RISK - VARCHAR

SQL is clearly rounding the decimals up to 1. I've tried casting as DECIMAL and NUMERIC, and multiplying by 1.0. I need to retain the decimals for an actual calculation. Any help would be great.
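For reference, this is roughly the query I'm running (the table name dbo.Risks is a placeholder):

SELECT [COUNT] * [RISK] AS TOTAL
FROM dbo.Risks;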

Upvotes: 3

Views: 2860

Answers (2)

jyao

Reputation: 1630

This question is really suspicious. On the surface, it seems the two columns [Count] and [Risk] have different data types, with [Count] as integer and [Risk] as decimal or float. According to BOL, the decimal/float data types have higher precedence; I will quote the BOL here:

When an operator combines two expressions of different data types, the rules for data type precedence specify that the data type with the lower precedence is converted to the data type with the higher precedence. If the conversion is not a supported implicit conversion, an error is returned. When both operand expressions have the same data type, the result of the operation has that data type.

So to me, in SQL Server, when you do

SELECT [Total] = [Count] * [Risk] FROM [your_table];

you cannot get the result shown in the original question.
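For example (a quick sketch with made-up literals), an int multiplied by a decimal keeps its decimal places, so the truncated output in the question should not happen with those types:

-- int * decimal: the int operand is implicitly converted to decimal,
-- so the fractional part is preserved
SELECT 35 * CAST(0.6456 AS DECIMAL(16,10)) AS [Total]; -- 22.5960000000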

Upvotes: 2

Kemal Güler

Reputation: 614

Convert the result to DECIMAL like this:

SELECT
   CONVERT(DECIMAL(16,10), [COUNT] * [RISK]) AS DecimalResult
FROM dbo.whatever;

Or convert [COUNT] to DECIMAL first:

SELECT CAST([COUNT] AS DECIMAL(16,10)) * [RISK] AS DecimalResult
FROM dbo.whatever;
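Note that the second form is the safer one: in the first query the CONVERT runs only after the multiplication, so if [COUNT] * [RISK] has already been evaluated with integer precedence, the decimals are gone before the conversion happens. Casting an operand up front forces the whole expression to DECIMAL. Since [RISK] is stored as VARCHAR here, you can also cast it explicitly (a sketch, reusing the placeholder table name):

SELECT [COUNT] * CAST([RISK] AS DECIMAL(16,10)) AS DecimalResult
FROM dbo.whatever;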

Upvotes: 2
