Why is 13596 * 0.1 different from 13596/10?

I just stumbled over something really weird in JavaScript.

My script multiplied 13596 by 0.1 and the result was: 1359.6000000000001

We can agree that 0.1 = 1/10, so I tried:

13596/10 = 1359.6


I tested it with Firefox and Chrome, same results.

I wondered if this was floating point related, so I tried the following:

13596 * parseFloat(0.1) = 1359.6000000000001


Nope.

By the way, the two results aren't equal:

(13596*0.1) === (13596/10) => false


Does anyone have any idea about this result?

(Here's a JSFiddle.)

1 answer


Remember that the floating-point number you see on the screen does not necessarily match the number the computer is actually working with.

For example, using node:

> 1359.6
1359.6
> (1359.6).toFixed(20)
'1359.59999999999990905053'


Node shows you 1359.6, but when you ask for more precision you can see that the actual number is not exactly what you saw: it has been rounded. The same is true for 0.1:

> (0.1).toFixed(20)
'0.10000000000000000555'


Usually when you work with floats, you accept this imprecision and round numbers before displaying them.
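For example, rounding to one decimal place before display (a quick sketch using the standard toFixed method, which returns a string):

> (13596 * 0.1).toFixed(1)
'1359.6'
> Number((13596 * 0.1).toFixed(1))
1359.6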

Some numbers can be represented exactly, but it is safer to assume that floating-point arithmetic is imprecise. For example, 0.5 is a power of two (2^-1), so binary floating point stores it exactly:



> (0.5).toFixed(20)
'0.50000000000000000000'


However, dividing by 10 does not give you an exact 1359.6 either. The difference is that 13596 / 10 rounds only once (when the quotient is stored), whereas 13596 * 0.1 rounds twice (once when 0.1 is stored, and again when the product is stored), so the two expressions can land on adjacent doubles, which is why your === comparison is false:

> 13596/10
1359.6
> (13596/10).toFixed(20)
'1359.59999999999990905053'

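Putting the two expressions side by side in the node REPL makes the mismatch visible:

> 13596 * 0.1
1359.6000000000001
> 13596 / 10
1359.6
> 13596 * 0.1 === 13596 / 10
false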

In some other languages, dividing one integer by another produces an integer, but in JavaScript every number is modeled as a double-precision float, so division gives you a fractional result whenever there is one.
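For instance (a small sketch; Math.trunc is standard JavaScript and simply drops the fractional part):

> 7 / 2
3.5
> Math.trunc(7 / 2)
3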

Usually, when you need to represent decimal numbers exactly, you would use a decimal type, although one is not natively available in JavaScript.
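A common workaround when comparing floats is to test against a small tolerance instead of using === (a sketch; the 1e-9 threshold here is an arbitrary choice for this example):

> Math.abs(13596 * 0.1 - 13596 / 10) < 1e-9
true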

Also, it makes no sense to write parseFloat(0.1), since 0.1 is already a float; parseFloat is meant for parsing strings.
