# How does the expression "(a / b) * b + a % b - a" always equal zero in C for non-zero b?

Below is an excerpt from *The C Book* by Mike Banahan, Declan Brady and Mark Doran (link: section 2.8.2.1). A pleb like me has no reason to believe the author is wrong, unless you suggest otherwise.

Please tell me how on earth **`(a / b) * b + a % b - a`** is always zero for integers, where b is not zero.

The extracted text follows:

If either operand is negative, the result of / may be the nearest integer to the true quotient on either side of it, and the sign of the result of % may be positive or negative. Both of these behaviors are implementation-defined.

It is always true that the following expression is zero:

`(a/b)*b + a%b - a`

if b is not zero.

Normal arithmetic conversions apply to both operands.


This is true by the definition of the `%` operator in C.

The definition of the remainder operator in the C standard says:

(C11, 6.5.5p6) "If the quotient a/b is representable, the expression (a/b)*b + a%b shall equal a;"

Also note that for both `/` and `%`, if the second operand is `0`, the standard leaves the behavior undefined.


Mathematically...

On paper, (a / b) * b == a (the b cancels), so the expression looks trivially zero.

However, the computer evaluates (a / b) first and then multiplies by b. In integer arithmetic, a / b is rounded (truncated) before the multiplication.

If a < b (both non-negative), then a / b is 0 and a % b is a, giving 0 + a - a == 0.

If a > b, then (a / b) * b is a rounded down to the nearest multiple of b, and a % b restores the discarded remainder, so (a / b) * b + a % b == a, again giving 0.

Essentially, the expression is a check that `/` and `%` are defined consistently with each other.
