Multiplication of two long numbers

I tried to multiply two numbers, i.e. 100000 and 100000 + 1, in a C program, but I am not getting the correct output:

printf("%lld",(100000)*(100001));

      

I've tried the above code with different compilers, but I always get the same result, 1410165408, instead of 10000100000.

+3




4 answers


Well, let's multiply them:

  int64_t a = 100000;
  int64_t b = 100001;
  int64_t c = a * b;

      

And we get (binary)

     1001010100000011010110101010100000 /* 10000100000 decimal */

      

but if you convert it to int32_t



  int32_t d = (int32_t) c;

      

you will only get the last 32 bits (the top bits 10 are thrown away):

       01010100000011010110101010100000 /* 1410165408 decimal */

      

The simplest way out is probably to declare both constants as 64-bit values (the LL suffix means long long):

  printf("%lld",(100000LL)*(100001LL));  

      

+6




In C, the type that is used for computation is determined by the type of the operands, not by the type in which you store the result.

Plain integer constants such as 100000 are of type int, because they fit into one. However, the result of 100000 * 100001 does not fit, so you end up with integer overflow and undefined behavior. Switching to long doesn't necessarily solve anything, because long could be 32 bits too.

Also, printing an int with the format specifier %lld is undefined behavior on most systems as well.



The root of all evil here is the crappy default types in C (called "primitive data types"). Just get rid of them and their ambiguity and all your mistakes will disappear with them:

#include <stdio.h>
#include <inttypes.h>

int main(void) 
{
  printf("%"PRIu64, (uint64_t)100000 * (uint64_t)100001);
  return 0;
}

      

Or equivalently: UINT64_C(100000) * UINT64_C(100001).

+5




Your two operands are of type int, so the result of the multiplication is int as well. The fact that the printf() format specifier %lld asks for a long long int is irrelevant.

You can cast, or use suffixes:

printf("%lld", 100000LL * 100001LL);

      

This will print 10000100000. Of course, there is still a limit, since the number of bits in a long long int is also finite.

+4




You can do it like this:

long long int a = 100000;
long long int b = 100001;
printf("%lld",(a)*(b));

      

This will give the correct answer.

What you are doing is (100000)*(100001), i.e. by default the compiler treats 100000 as an int, multiplies it by 100001, and stores the result in an int, which overflows. During printf, that int is then printed as if it were a long long int.

+1

