Allocating memory for a char array to concatenate a known piece of text and an integer

I want to concatenate a piece of text like "The answer is " with a signed integer, to give the output "The answer is 42".

I know how long the piece of text is (14 characters), but I don't know how many characters the string representation of the number will take.

I assume the worst case: the largest signed 16-bit integer has 5 digits, plus one extra character in case it is negative. Is the following code the correct way to do it?

#include <stdio.h>
#include <stdlib.h>

int main()
{
    char *message;

    message = malloc(14*sizeof(char)+(sizeof(int)*5)+1);

    sprintf(message, "The answer is %d", 42);

    puts(message);

    free(message);
}


+2




6 answers


Using:

malloc(14*sizeof(char)   /* for the 14-char text */
       +(sizeof(char)*5) /* for the digits of the largest number */
       +1                /* for the sign of the number */
       +1                /* for the terminating null character */
      );




Since the number will be represented as chars, you should use sizeof(char) instead of sizeof(int).

+7




Not really; you only need a few characters for the number, so sizeof(int) is not required.

However, for easily maintainable and portable code, you should have something like:

#include <stdio.h>
#include <stdlib.h>
#include <limits.h>

#define TEXT "The answer is "
#undef CHARS_PER_INT
#if INT_MAX == 32767
    #define CHARS_PER_INT 6
#endif
#if INT_MAX == 2147483647
    #define CHARS_PER_INT 11
#endif
#ifndef CHARS_PER_INT
    #error Suspect system, I have no idea how many chars to allocate for an int.
#endif

int main (void) {
    char *message;

    message = malloc(sizeof(TEXT)+CHARS_PER_INT+1);
    sprintf(message, TEXT "%d", 42);
    puts(message);
    free(message);
    return 0;
}




This has several advantages:

  • If you change the text, you only have to change one thing and one thing only; the argument to malloc adjusts automatically.
  • The expression sizeof(TEXT)+CHARS_PER_INT+1 is evaluated at compile time. A solution based on strlen would have a runtime cost.
  • If you try to compile the code on a system whose int size is not covered, you will be told about it (and can fix the code).
  • You do have to allow an extra character for the number, since the longest 16-bit value (in terms of character count) is -32768 (six characters). You will notice there is still a +1 at the end: that is the space for the string's null terminator.
+3




One way to do this (not necessarily recommended), which gives you the exact size of the number in characters, is to use the stdio functions themselves.

For example, if you print the number somewhere before allocating your memory anyway, you can use the %n format specifier with printf. %n prints nothing; rather, you give it a pointer to an int, and printf fills it in with the number of characters written so far.
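For illustration, a minimal sketch of the %n trick (my own example, assuming the number really is printed once before the allocation; note that some hardened runtimes restrict %n):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int value = 42;
    int ndigits = 0;

    /* %n stores into ndigits how many characters were written up to
       that point in the format, here exactly the width of %d. */
    printf("%d%n\n", value, &ndigits);

    /* sizeof of the string literal already includes its terminator. */
    char *message = malloc(sizeof("The answer is ") + (size_t)ndigits);
    if (message == NULL)
        return 1;

    sprintf(message, "The answer is %d", value);
    puts(message);
    free(message);
    return 0;
}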

Another example is snprintf, if it is available. You pass it the maximum number of characters you want written to your string, and it returns the number of characters it would have written, not counting the final null (or -1 on error). So, using a 1-byte dummy string, snprintf can tell you exactly how many characters your number needs.
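A minimal sketch of this approach (my example, using C99's NULL/0 form rather than a 1-byte dummy buffer):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int value = 42;

    /* C99 allows a NULL buffer with size 0: snprintf then just reports
       how many characters the output would need, excluding the terminator. */
    int needed = snprintf(NULL, 0, "The answer is %d", value);
    if (needed < 0)
        return 1; /* encoding error */

    char *message = malloc((size_t)needed + 1); /* +1 for the terminator */
    if (message == NULL)
        return 1;

    snprintf(message, (size_t)needed + 1, "The answer is %d", value);
    puts(message);
    free(message);
    return 0;
}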

The big advantage of using these functions is that if you later decide to change the format of the number (leading zeros, padding spaces, octal output, long longs, etc.), you will not overflow your memory.

If you have the GNU extensions to stdio, you may want to consider using asprintf. It is exactly like sprintf, except that it allocates the memory for you! No assembly required. (Though you do need to free it yourself.) But you cannot rely on it being portable.
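For example, a short sketch on glibc (my example; asprintf is non-standard, as noted):

#define _GNU_SOURCE /* asprintf is a GNU extension */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *message = NULL;

    /* asprintf allocates a buffer exactly large enough for the
       formatted result and stores it in message. */
    if (asprintf(&message, "The answer is %d", 42) < 0)
        return 1;

    puts(message);
    free(message); /* the caller still owns the allocation */
    return 0;
}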

+1




malloc((14 + 6 + 1) * sizeof(char));

  • 14 chars for the text
  • 6 for the digits plus sign
  • 1 for the '\0' terminator

Note: sizeof(int) gives you the size of the type in bytes, so sizeof(int) == 4 if int is 32 bits and 8 if it is 64 bits.

0




I think the correct formula to get the maximum length of the decimal representation of an integer would be floor(log10(INT_MAX)) + 1; you can also abuse the preprocessor like this:

#include <stdio.h>
#include <limits.h>
#define TOSTRING_(x) #x
#define TOSTRING(x) TOSTRING_(x)
/* ... */
#define YOUR_MESSAGE "The answer is "
/* sized for the worst case: the text, a sign and the digits of INT_MAX */
char message[]=YOUR_MESSAGE "+" TOSTRING(INT_MAX);
sprintf(message+sizeof(YOUR_MESSAGE)-1,"%d", 42); /* -1: sizeof counts the terminator */

which also avoids heap allocation. You can use snprintf for better security, although this is not necessary with this method.

Another trick would be to create a function like this:

size_t GetIntMaxLength(void)
{
    /* sizeof(dummy) counts the digits of INT_MAX plus the terminator;
       the +1 reserves room for a '-' sign */
    const char dummy[]=TOSTRING(INT_MAX);
    return sizeof(dummy)+1;
}

If the compiler is smart enough, it can strip the dummy variable from the compiled code entirely; otherwise it might be wise to declare the variable as static to avoid re-initializing it every time the function is called.
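A sketch of how the function might be used (the main body is my illustration, not part of the answer):

#include <limits.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define TOSTRING_(x) #x
#define TOSTRING(x) TOSTRING_(x)

size_t GetIntMaxLength(void)
{
    /* static, as suggested above, so the array is initialized once */
    static const char dummy[] = TOSTRING(INT_MAX);
    return sizeof(dummy) + 1; /* digits + terminator, +1 for a '-' sign */
}

int main(void)
{
    const char text[] = "The answer is ";
    char *message = malloc(strlen(text) + GetIntMaxLength());
    if (message == NULL)
        return 1;

    sprintf(message, "%s%d", text, 42);
    puts(message);
    free(message);
    return 0;
}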

0




A safe approximation for a signed int (number of digits, including the - sign):

(CHAR_BIT * sizeof(int) + 1) / 3 + 1

The equivalent for unsigned:

(CHAR_BIT * sizeof(unsigned) + 2) / 3

This calculates the number of digits only; add one more to both of these to account for the terminator, if allocating space for a null-terminated string.

This slightly overestimates the space required for very long types (and would also overestimate in the unusual case of int having padding bits), but it is a good approximation and has the advantage of being a compile-time constant. CHAR_BIT is provided by <limits.h>.
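To illustrate, a minimal sketch using the approximation (the INT_DEC_CHARS name is my own, not from the answer):

#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

/* digits plus sign for a signed int, as a compile-time constant */
#define INT_DEC_CHARS ((CHAR_BIT * sizeof(int) + 1) / 3 + 1)

int main(void)
{
    const char text[] = "The answer is ";

    /* sizeof(text) already counts the terminator, so no extra +1 */
    char *message = malloc(sizeof(text) + INT_DEC_CHARS);
    if (message == NULL)
        return 1;

    sprintf(message, "%s%d", text, INT_MIN); /* the worst case still fits */
    puts(message);
    free(message);
    return 0;
}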

0

