Reputation: 2101
I'm working in SQL Server 2008. I have an alias column that I'm creating via the following formula:
col1 / col2 * some_number
col1 and col2 are nvarchars in my table, but they're really integers. some_number is an integer. If I blindly do col1 / col2, I will get 0 for most rows, because integer division truncates. So I need to cast them as decimals, and I want a maximum of 2 digits after the decimal point. Here is what I have currently:
CAST(col1 AS DECIMAL(10,2)) / CAST(col2 AS DECIMAL(10,2)) * 100
However, this returns far more decimal places than just 2. How do I fix my code to return just 2 decimal places?
Upvotes: 0
Views: 4848
Reputation: 69769
According to the documentation, the scale of the result of a division is given by:
result scale = max(6, s1 + p2 + 1)
where s1 is the scale of the numerator and p2 is the precision of the denominator.
So when dividing one DECIMAL(10, 2) by another you can substitute in the values:
result scale = max(6, 2 + 10 + 1) = 13
Which is corroborated with a simple example:
SELECT CAST(1 AS DECIMAL(10,2)) / CAST(1 AS DECIMAL(10,2))
= 1.0000000000000 -- 13 decimal places
You need to use another cast on your result to reduce the scale:
SELECT CAST(CAST(col1 AS DECIMAL(10,2)) / CAST(col2 AS DECIMAL(10,2)) * 100 AS DECIMAL(10, 2))
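For example, with illustrative values standing in for col1 and col2 (the 1 and 3 below are assumptions, not from your data), the outer cast trims the result back to two decimal places:
SELECT CAST(CAST('1' AS DECIMAL(10,2)) / CAST('3' AS DECIMAL(10,2)) * 100 AS DECIMAL(10,2))
-- = 33.33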
I'd also suggest that if the data type of your columns is nvarchar but, as you say, "they're really integers", you just bite the bullet and alter the column data type.
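A minimal sketch of that change, assuming a hypothetical table name dbo.MyTable (the real table name isn't given) and that every stored value converts cleanly to an integer:
ALTER TABLE dbo.MyTable ALTER COLUMN col1 INT;
ALTER TABLE dbo.MyTable ALTER COLUMN col2 INT;
With integer columns you'd no longer need the inner casts, though you'd still cast to DECIMAL before dividing to avoid integer division.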
Upvotes: 2