T-SQL decimal assignment changes the value

Why does the select statement below return two different values?

declare @tempDec decimal
set @tempDec = 1.0 / (1.0 + 1.0)
select @tempDec, 1.0 / (1.0 + 1.0)
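For reference, the two columns come back differently; on a default SQL Server setup the output looks something like this:

@tempDec    (no column name)
1           0.500000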



2 answers


This works fine for literals like 1.0, but if you are pulling data from table columns, you need to convert the first number evaluated in your expression:

convert(decimal, [col1]) / ([col2] + [col3])




- or -

convert(decimal(15, 2), [col1]) / ([col2] + [col3])
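For example, a self-contained sketch with a table variable (the table and column names here are made up for illustration):

declare @t table (col1 int, col2 int, col3 int)
insert into @t values (1, 1, 1)

-- integer division truncates: returns 0
select col1 / (col2 + col3) from @t

-- converting the first operand promotes the whole expression: returns 0.5 (padded to the result scale)
select convert(decimal(15, 2), col1) / (col2 + col3) from @t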


I found this out from a coworker just as I posted this question.

You need to specify the precision and scale instead of relying on the defaults.

This works in this scenario: declare @tempDec decimal(3, 2)
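Applied to the original script, a corrected sketch (the (3, 2) sizing is one workable choice, not the only one):

declare @tempDec decimal(3, 2)
set @tempDec = 1.0 / (1.0 + 1.0)
select @tempDec, 1.0 / (1.0 + 1.0)
-- both columns now agree: 0.50 and 0.500000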

From MSDN:



decimal[(p[, s])] and numeric[(p[, s])] - Fixed precision and scale numbers. When maximum precision is used, valid values are from -10^38 + 1 through 10^38 - 1. The SQL-92 synonyms for decimal are dec and dec(p, s). numeric is functionally equivalent to decimal.

p (precision) - The maximum total number of decimal digits that can be stored, both to the left and to the right of the decimal point. The precision must be a value from 1 through the maximum precision of 38. The default precision is 18.

s (scale) - The maximum number of decimal digits that can be stored to the right of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.
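That explains the behavior in the question: a bare decimal is decimal(18, 0), and at scale 0 the assignment rounds 0.5 to a whole number. A minimal sketch (variable names are illustrative):

-- decimal with no arguments defaults to decimal(18, 0)
declare @implicitDefault decimal
declare @explicitDefault decimal(18, 0)
set @implicitDefault = 1.0 / (1.0 + 1.0)
set @explicitDefault = 1.0 / (1.0 + 1.0)
select @implicitDefault, @explicitDefault   -- both return 1, matching the question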

