On 12/12/2014 03:18 AM, Ian Romanick wrote:
> On 12/11/2014 02:34 PM, Eduardo Lima Mitev wrote:
>> Stencil value masks (ctx->Stencil.ValueMask[]) store GLuint values
>> which are initialized to the maximum unsigned integer (~0u). When these
>> values are queried via glGet* (GL_STENCIL_VALUE_MASK or
>> GL_STENCIL_BACK_VALUE_MASK), they are converted to a signed integer.
>> Currently, these values overflow and return an incorrect result (-1).
>>
>> This patch clamps these values to max int (0x7FFFFFFF) before storing.
>
> This feels wrong. Is there some justification in the spec for this
> behavior?
>
Hi Ian,

Section 4.1.4 "Stencil Test" of the GLES 3 spec says this:

"In the initial state, stenciling is disabled, the front and back stencil
reference value are both zero, the front and back stencil comparison
functions are both ALWAYS, and *the front and back stencil mask are both
set to the value 2^s - 1, where s is greater than or equal to the number
of bits in the deepest stencil buffer* supported by the GL implementation."

Looking at the formats supported by Mesa, it seems that the maximum number
of bits ever used for stencil values is 8. If that is correct, then the
mask should be initialized to 0xFF instead of 0xFFFFFFFF (~0u), which
would fix the problem because 0xFF does not overflow when converted to a
32-bit signed int.

cheers,
Eduardo

_______________________________________________
mesa-dev mailing list
mesa-dev@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/mesa-dev