Why does dividing two integers return 0.00?
In your code, `a` and `b` are of type `int`, so the division is integer division and the result is an `int`.
You cannot use a mismatched format specifier. `%f` requires the corresponding argument to be of type `double`; for an `int` you need `%d`.
FWIW, using a mismatched format specifier causes undefined behavior.
Quoting the C11 standard, chapter §7.21.6.1, `fprintf()`:

> If any argument is not the correct type for the corresponding conversion specification, the behavior is undefined.
If you want floating-point division, you have to ask for it explicitly, either by

- casting one of the variables before the division, so the division itself is performed in floating point and the result is floating point: `printf("%.2f", (float)a/b);`
- using `float` for `a` and `b`.
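A short sketch putting both options together (the values are assumed for illustration):

```c
#include <stdio.h>

int main(void)
{
    int a = 7, b = 2;                      /* assumed example values */

    /* Option 1: cast one operand, so the division happens in floating point */
    printf("%.2f\n", (float)a / b);        /* prints 3.50 */

    /* Option 2: declare the variables as float to begin with */
    float x = 7, y = 2;
    printf("%.2f\n", x / y);               /* prints 3.50 */

    return 0;
}
```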