Math.Round produces different results depending on where it runs

I have the following line of code that produces two different results depending on where it runs:

var rounded = Math.Round(415 * 0.01f, 1);


I would expect this to round to 4.2 every time. When I run it from a console application, I always get the expected output of 4.2. If I put the same line in a unit test and run the test, I get 4.1. I also get 4.1 when the same code runs in a Windows service.

Any thoughts on why this produces different values, or how to get it to behave consistently?

I am running this from Visual Studio 2013 on a 64-bit machine with .NET 4.5.
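One way to see what is going on is to print the round-trip ("R") representation of the value that Math.Round actually receives. Below is a minimal diagnostic sketch; the printed values depend on the compiler and JIT, which is exactly the problem:

using System;

class RoundingDiagnostic
{
    static void Main()
    {
        // The float closest to 0.01 is 0.00999999977648258..., so the
        // exact product 415 * 0.01f is about 4.14999990724.
        float stored = 415 * 0.01f;

        // Truncated to single precision, the nearest float is about
        // 4.150000095, which Math.Round takes to 4.2.
        Console.WriteLine(((double)stored).ToString("R")); // typically 4.150000095367432
        Console.WriteLine(Math.Round(stored, 1));          // typically 4.2

        // Used directly as an expression, the intermediate may be kept
        // at higher precision (about 4.149999907), which rounds to 4.1;
        // whether that happens depends on the compiler and the JIT.
        Console.WriteLine(Math.Round(415 * 0.01f, 1));     // 4.1 or 4.2
    }
}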



2 answers


I had a similar problem many years ago. Some time after launch, my application would calculate double values with one precision, and a while later with a different precision. It was very strange.

After a while, I realized that at some point the application was initializing DirectX with the default flags, and DirectX in turn changes the FPU precision used for double calculations. (Direct3D 9 switches the x87 FPU into single-precision mode unless the device is created with the D3DCREATE_FPU_PRESERVE flag.)



So, as a tip, check the related code for side effects: check for DirectX usage and for calls into external unmanaged DLLs.
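If you suspect something like this, you can read the x87 FPU precision-control bits through the C runtime's _controlfp. This is only a diagnostic sketch: it is meaningful only in a 32-bit (x86) process (64-bit code uses SSE, where precision control does not apply), and the constants are the _MCW_PC / _PC_24 / _PC_53 values from the CRT's float.h:

using System;
using System.Runtime.InteropServices;

static class FpuCheck
{
    [DllImport("msvcrt.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern uint _controlfp(uint newControl, uint mask);

    // Masks and values as defined in the CRT's float.h.
    private const uint MCW_PC = 0x00030000; // precision-control mask
    private const uint PC_24  = 0x00020000; // single precision
    private const uint PC_53  = 0x00010000; // double precision (Windows default)

    public static void Report()
    {
        // A zero mask reads the control word without changing it.
        uint pc = _controlfp(0, 0) & MCW_PC;
        Console.WriteLine(
            pc == PC_24 ? "FPU precision: single (something changed it!)" :
            pc == PC_53 ? "FPU precision: double (the Windows default)" :
                          "FPU precision: extended");
    }
}

Call FpuCheck.Report() before and after the suspect initialization code to see whether the precision mode changed underneath you.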

PS: There is no magic here, so don't be confused. Also, try to limit your use of decimal; it is one of the slowest numeric types to compute with.



var rounded = Math.Round(415 * 0.01d, 1);


You can use double instead of float.

The root cause is float's lack of precision: 0.01f is not exactly 0.01, so 415 * 0.01f is not exactly 4.15.
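If you want the arithmetic itself to be exact in decimal terms, decimal is another option (a minimal sketch; note the caveat in the other answer that decimal is slow). 0.01m is stored exactly, so the product is exactly the midpoint 4.15, and the default banker's rounding (MidpointRounding.ToEven) takes it to 4.2:

using System;

class DecimalRounding
{
    static void Main()
    {
        // decimal represents 0.01 exactly, so the product is exactly 4.15.
        decimal product = 415 * 0.01m;
        Console.WriteLine(product);                // 4.15

        // The default MidpointRounding.ToEven rounds the midpoint 4.15
        // to the even final digit, giving 4.2.
        Console.WriteLine(Math.Round(product, 1)); // 4.2
    }
}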


