Passing argument 1 of 'strlen' differs in signedness
I use strlen() calls throughout my project. So far I have compiled the project without the -Wall compiler option, but when I started using -Wall I got a flood of compiler messages, and about 80% of them are this strlen warning about unsigned char * vs const char *. I know I could add a cast to every strlen() call. Is there any other way to suppress the following warning?
./Proj.c:3126: warning: pointer targets in passing argument 1 of
'strlen' differ in signedness
C:/staging/usr/include/string.h:397: note: expected 'const char *' but
argument is of type 'unsigned char *'
strlen takes a const char* as input.
Unfortunately, the C standard leaves the signedness of plain char up to the compiler and platform. Therefore, many programmers prefer to make the signedness explicit by using signed char or unsigned char.
But this raises a warning whenever a char* has a different signedness convention than the one expected.
Fortunately, in the context of strlen, a C-style cast is safe: use strlen((const char*)...);
There is always a possibility:
#include <string.h>  /* strlen */

static inline size_t u_strlen(const unsigned char *array)
{
    return strlen((const char *)array);
}
This way you don't need to add a cast at every call site.
The question remains, though: why are you using unsigned char at all? I suppose it is a byte array for data packets on the network, in which case you should carry the length explicitly in the protocol rather than rely on string functions.
It's not about char* vs const char* - that is not the issue being reported (and it is not a problem). The issue is that you are using unsigned char*. Whether plain char is signed or unsigned is implementation-defined, so unsigned char* matches the signedness of char* on some platforms and not on others.
The best solution is to enforce type consistency by not defining your strings and string pointers as unsigned; it almost certainly serves no useful purpose. For strings and characters, the distinction between signed and unsigned is irrelevant - it only matters when you do arithmetic and use char as a "small integer".
Most compilers support a command-line switch to set the default signedness of char; however, I would not recommend that as a solution, and I would not recommend casting either: consistent use of the correct types should always be your first choice.