#3794: Compiler warnings on OS X 10.11.1 with Xcode 7.1.1
---------------------+---------------------
  Reporter:  chdiza  |      Owner:  brendan
      Type:  defect  |     Status:  new
  Priority:  minor   |  Milestone:
 Component:  SMTP    |    Version:
Resolution:          |   Keywords:
---------------------+---------------------

Comment (by code@…):

 {{{
 Regarding Comment #3, which pertains to the original code snippet:

     while(a && *a > 0 && *a < 128)

 "Some quick Googling seems to say that whether "char *" is signed or
 unsigned is implementation dependent."

 The problem has nothing to do with the signedness of the pointer
 "char *" (i.e. the variable a).  The problem is the type of *a,
 i.e. the type of data that a points to, which is plain char.  The
 standard leaves the signedness of plain char implementation-defined,
 but it is signed on the platform this ticket is about (x86 OS X), as
 it is on most common desktop targets.  There, this is not a harmless
 warning, IT IS A BUG.  This is illustrated by the following tiny
 program:

 #include <stdio.h>
 int main(int argc, char **argv)
 {
     char x;
     x = 128;                 /* 128 does not fit in a signed char */
     printf("%d\n", (int)x);  /* prints -128 where char is signed */
     return 0;
 }

 On a platform where char is signed, this will print -128, not 128.
 That's because the value of x is, in fact, -128; the compiler has
 performed an implicit conversion (I personally would like to see a
 warning here, since this is a hard-to-spot bug, unless you really
 intended for x to be -128, in which case you should have written it
 that way).

 The general gist of the problem is that char is technically a numeric
 data type (and a SIGNED one here), but is most commonly used to hold
 character data.  Character data is inherently unsigned: a character
 code such as 128 is not a negative value; however, the value of a
 signed char with the equivalent bits set is -128.  So if you're going
 to compare character data to its unsigned character code, your data
 type needs to either be unsigned inherently (unsigned char) or you
 need to cast it to unsigned when you do the comparison.
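
 To make that concrete, here is a minimal, self-contained sketch; the
 byte 0xC3 is just an arbitrary non-ASCII example (the first byte of
 many UTF-8 sequences), not something taken from the mutt code:

     #include <stdio.h>

     int main(void)
     {
         char c = (char) 0xC3;   /* some non-ASCII byte */

         /* Where char is signed, c is -61 here, so this test is true
            for every possible value of c (which is why the compiler
            calls the comparison "always true"). */
         if (c < 128)
             printf("signed compare:   looks like ASCII\n");

         /* Read the same byte as unsigned: 0xC3 is 195, so the test
            correctly reports a non-ASCII byte. */
         if ((unsigned char) c < 128)
             printf("unsigned compare: looks like ASCII\n");
         else
             printf("unsigned compare: not ASCII\n");

         return 0;
     }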

 Vincent's proposed fix is technically correct, but is pretty
 unreadable, possibly obscuring the intent to the reader.  A better
 fix that retains the intent would have been (almost) what Petr
 suggested, except he had the sign wrong.  Note that a test against 0
 still has to stay: it is not there to reject negative values but to
 stop the loop at the terminating NUL, and for that test the
 signedness does not matter:

     while (a && *a != 0 && *((unsigned char *)a) < 128)

 However, if you're going to be comparing chars to numeric literal
 character codes, you should really consider whether the data should
 actually be unsigned char instead of just char.  The main reason NOT
 to do that is if you have to use it with the old C and POSIX string
 APIs that take a plain char * rather than an unsigned char *, of
 which there are a number (strcmp() et al.).  In that case, you'll end
 up doing far more casting with the "right" signedness than without,
 so just cast the uncommon case.
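
 As a sketch of that last point (the buffer contents and names here
 are made up for illustration, not taken from the mutt sources): keep
 the data unsigned, compare against codes directly, and cast only at
 the calls into the plain-char APIs:

     #include <stdio.h>
     #include <string.h>

     int main(void)
     {
         /* Hypothetical buffer of raw bytes, declared unsigned char
            so that comparisons against codes >= 128 need no casts. */
         unsigned char addr[] = "user@example.com";
         size_t i;
         int uses_8bit = 0;

         for (i = 0; addr[i] != 0; i++)
             if (addr[i] >= 128)      /* no cast needed here */
                 uses_8bit = 1;

         /* The cast moves to the (less frequent) calls into APIs
            that take a plain char *, such as strcmp(). */
         if (!uses_8bit
             && strcmp((const char *) addr, "user@example.com") == 0)
             printf("plain ASCII address\n");

         return 0;
     }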

 }}}

-- 
Ticket URL: <http://dev.mutt.org/trac/ticket/3794#comment:11>
Mutt <http://www.mutt.org/>
The Mutt mail user agent
