Fabien COELHO <coe...@cri.ensmp.fr> writes:
> I understood the point and I do not see real disadvantages.  The C
> standard really says that an enum is an int, and compilers just do that.
No, it doesn't say that, and compilers don't just do that.  A compiler
is specifically allowed to store an enum in char or short if the enum's
declared values would all fit into that width.  (Admittedly, if you're
choosing the values as powers of 2, an OR of them would still fit; but
if you think "oh, an enum is just an int", you will get burned.)

More to the point, once you allow OR'd values then none of the practical
benefits of an enum still hold good.  The typical things that I rely on
in an enum that you don't get from a collection of #define's are:

* compiler warnings if you forget some members of the enum in a switch

* debugger ability to print variables symbolically

Those benefits both go up in smoke as soon as you allow OR'd values.
At that point you might as well use the #defines rather than playing
language lawyer about whether what you're doing meets the letter of
the spec.

I note that C99 specifically mentions this as something a compiler
might warn about:

-- A value is given to an object of an enumeration type other than by
   assignment of an enumeration constant that is a member of that type,
   or an enumeration variable that has the same type, or the value of a
   function that returns the same enumeration type (6.7.2.2).

which certainly looks like they don't consider "enumvar = ENUMVAL1 |
ENUMVAL2" to be strictly kosher.

			regards, tom lane
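[Editorial note: a minimal sketch of the failure mode described above;
the Options type, OPT_* names, and describe() function are invented for
illustration.]

#include <stdio.h>

typedef enum
{
	OPT_FOO = 1 << 0,
	OPT_BAR = 1 << 1,
	OPT_BAZ = 1 << 2
} Options;

static void
describe(Options opt)
{
	/*
	 * gcc -Wswitch warns if a declared enumerator is missing here, but
	 * it cannot warn about OR'd combinations: they are not members of
	 * the enum, so they simply match no case at all.
	 */
	switch (opt)
	{
		case OPT_FOO:
			printf("foo\n");
			break;
		case OPT_BAR:
			printf("bar\n");
			break;
		case OPT_BAZ:
			printf("baz\n");
			break;
	}
}

int
main(void)
{
	Options		o = OPT_FOO | OPT_BAR;	/* value 3, not any enumerator */

	describe(o);				/* falls through every case, prints nothing */
	/* a debugger will show o as 3, not as a symbolic name */
	return 0;
}

With all three enumerators covered, -Wswitch has nothing to complain
about, yet the OR'd value 3 silently matches no case, and the debugger
prints a bare integer: both of the benefits listed above are gone.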