Ambiguity in long integer arithmetic?
Take a look at the following piece of code:
#include <stdio.h>

int main(void)
{
    int a;

    a = 2147483647;
    printf("a + 1 = %d \t sizeof (a + 1) = %zu\n", a + 1, sizeof (a + 1));
    printf("a + 1L = %ld \t sizeof (a + 1L) = %zu\n", a + 1L, sizeof (a + 1L));

    a = -1;
    printf("a + 1 = %d \t sizeof (a + 1) = %zu\n", a + 1, sizeof (a + 1));
    printf("a + 1L = %ld \t sizeof (a + 1L) = %zu\n", a + 1L, sizeof (a + 1L)); // why does a + 1L not yield a long integer?

    return 0;
}
This produces the following result:
a + 1 = -2147483648 sizeof (a + 1) = 4
a + 1L = 2147483648 sizeof (a + 1L) = 8
a + 1 = 0 sizeof (a + 1) = 4
a + 1L = 0 sizeof (a + 1L) = 8
Why does the last line, a + 1L, return 0 instead of a long integer like 4294967296?
"Why doesn't a + 1L on the last line give a long integer like 4294967296?"
Because converting the int value -1 to long int results in a long int with the value -1, and -1 + 1 = 0.
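A minimal sketch of that conversion (assuming a 64-bit long, as in the output above):

#include <stdio.h>

int main(void)
{
    int a = -1;
    long la = a;                       /* int -1 converted to long is still -1, just as in a + 1L */
    printf("la = %ld\n", la);          /* prints -1 */
    printf("la + 1 = %ld\n", la + 1);  /* prints 0 */
    return 0;
}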
Converting -1 to another type yields 4294967295 only if the target type is an unsigned 32-bit type (usually unsigned int, or uint32_t where it is provided). But even then, adding 1 to that value wraps back around to 0.
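For illustration, a rough sketch of that wrap-around (assuming a typical platform where uint32_t is a 32-bit unsigned int):

#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint32_t u = (uint32_t)-1;                /* -1 converted to uint32_t is 4294967295 */
    printf("u     = %" PRIu32 "\n", u);
    printf("u + 1 = %" PRIu32 "\n", u + 1U);  /* 32-bit unsigned arithmetic wraps back to 0 */
    return 0;
}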
Thus, to get 4294967296, you need an intermediate cast, (uint32_t)a + 1L, so that -1 is first converted to uint32_t with the value 4294967295 and then converted to long before the addition.
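A sketch of that fix (again assuming a 64-bit long, as the sizeof output above suggests):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int a = -1;
    /* (uint32_t)a is 4294967295; it is then converted to long
       before the addition, so nothing wraps around. */
    printf("(uint32_t)a + 1L = %ld\n", (uint32_t)a + 1L);  /* prints 4294967296 */
    return 0;
}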
In the first case: 2147483647 is a 32-bit signed value with hexadecimal representation 0x7FFFFFFF. Adding 1 to it gives the 32-bit pattern 0x80000000, which is -2147483648 when interpreted as a signed 32-bit int (because of the overflow), and 2147483648 when the addition is carried out as a 64-bit signed integer.
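Those bit patterns can be printed directly; note that the signed overflow in a + 1 is strictly undefined behaviour in C, and the wrap-around shown is simply what typical two's-complement machines do:

#include <stdio.h>

int main(void)
{
    int a = 2147483647;                        /* 0x7FFFFFFF */
    /* Signed overflow is undefined behaviour; on common two's-complement
       hardware a + 1 wraps to the bit pattern 0x80000000. */
    printf("a + 1  = %d  (0x%08X)\n", a + 1, (unsigned int)(a + 1));
    printf("a + 1L = %ld (0x%016lX)\n", a + 1L, (unsigned long)(a + 1L));
    return 0;
}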
In the second case: -1 is a 32-bit signed value with hexadecimal representation 0xFFFFFFFF. Adding 1 to it gives the 32-bit value 0x00000000, which is 0 as a signed int.
When you add 1 to it in 64 bits, sign extension happens first, so you are actually adding 0xFFFFFFFFFFFFFFFF and 0x0000000000000001; the sum is 0, as expected.
There is no ambiguity once you take sign extension into account.
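A small sketch of that sign extension (assuming a 64-bit long):

#include <stdio.h>

int main(void)
{
    int a = -1;                                    /* bit pattern 0xFFFFFFFF */
    long la = a;                                   /* sign-extended to 0xFFFFFFFFFFFFFFFF */
    printf("a      = 0x%08X\n", (unsigned int)a);
    printf("la     = 0x%016lX\n", (unsigned long)la);
    printf("la + 1 = %ld\n", la + 1);              /* 0xFFFF...FFFF plus 1 gives 0 */
    return 0;
}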