Why does dividing two integers return 0.00?

Every time I run this program, I get different and strange results. Why is this?

#include <stdio.h>

int main(void) {
    int a = 5, b = 2;
    printf("%.2f", a/b);
    return 0;
}

+3




4 answers


printf("%.2f", a/b);

      

The division output is again of type int

, not float

.



  • You are using the wrong format specifier, which results in undefined behavior.
  • You need floating-point operands to get a floating-point result.

The correct format specifier for printing an int is %d.
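
For illustration, a minimal sketch of that fix, reusing a and b from the question (the printed 2 is just the truncated integer quotient):

#include <stdio.h>

int main(void) {
    int a = 5, b = 2;
    /* a / b is integer division, so the result is the int 2; print it with %d. */
    printf("%d\n", a / b);
    return 0;
}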

+12




In your code, a and b are of type int, so the division is an integer division, and its result is also an int.

You must not use a mismatched format specifier. %f requires the corresponding argument to be of type double. For an int you need to use %d.

FWIW, using a mismatched format specifier causes undefined behavior.

From the C11 standard, chapter §7.21.6.1, fprintf():

If any argument is not the correct type for the corresponding conversion specification, the behavior is undefined.

If you want floating point division, you need to ask for it explicitly, either by

  • casting one of the variables before the division, so the division is performed in floating point and produces a floating point result:

    printf("%.2f", (float)a/b);

  • or declaring a and b as float (see the sketch after this list).
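
A minimal sketch combining both approaches; fa and fb are hypothetical names used here for the float-typed variant:

#include <stdio.h>

int main(void) {
    int a = 5, b = 2;

    /* Cast one operand: the division is then done in floating point. */
    printf("%.2f\n", (float)a / b);   /* prints 2.50 */

    /* Or declare the operands as float in the first place. */
    float fa = 5.0f, fb = 2.0f;
    printf("%.2f\n", fa / fb);        /* prints 2.50 */

    return 0;
}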
+7




You need to convert the value to float or double.

Something like this:

printf("%.2f", (float)a/b);

      


%f is the format specifier for float. Using the wrong format specifier will lead you to undefined behavior. Dividing an int by an int will give you an int.
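
To make the difference concrete, a small sketch (assuming the same values 5 and 2) that prints the integer quotient and the floating point quotient side by side, each with its matching specifier:

#include <stdio.h>

int main(void) {
    int a = 5, b = 2;
    int q = a / b;              /* integer division: q is 2 */
    float f = (float)a / b;     /* floating point division: f is 2.5 */
    printf("%d %.2f\n", q, f);  /* prints "2 2.50" */
    return 0;
}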

+3




Use this instead of your printf():

printf("%.2lf",(double)a/b);

      

0

