On 12/15/2014 08:30 PM, Ian Romanick wrote:
> On 12/15/2014 08:04 AM, Eduardo Lima Mitev wrote:
>>
>> Since the maximum supported precision for stencil buffers is 8 bits, mask
>> values should be initialized to 2^8 - 1 = 0xFF.
>>
>> Currently, these masks are initialized to max unsigned integer (~0u), which
>> causes their values to overflow to -1 when converted to signed int by glGet*
>> APIs.
>
> I did some research on this... before desktop OpenGL 3.1, the spec said
> something quite different. Please add the following to the commit message:
>
> "In OpenGL 3.0 and before, an initial value of ~0u was specified:
>
>     In the initial state, stenciling is disabled, the front and back
>     stencil reference value are both zero, the front and back stencil
>     comparison functions are both ALWAYS, and the front and back
>     stencil mask are both all ones."
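
For anyone following along, here is a minimal standalone C sketch of the
unsigned-to-signed conversion described above. It is illustrative only and not
part of the Mesa patch; the variable names and printed values are assumptions,
and two's-complement behavior is assumed for the out-of-range cast.

    #include <stdio.h>

    int main(void)
    {
        unsigned int old_mask = ~0u;   /* previous initial mask value     */
        unsigned int new_mask = 0xFF;  /* 2^8 - 1, for 8-bit stencil      */

        /* glGetIntegerv hands the value back as a signed GLint; the casts
         * below mimic that conversion. */
        printf("~0u  as signed int: %d\n", (int)old_mask);  /* prints -1  */
        printf("0xFF as signed int: %d\n", (int)new_mask);  /* prints 255 */

        return 0;
    }

With the mask clamped to the 8 bits of stencil precision, a signed glGet*
query reads back 255 instead of -1.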
Oh, interesting. I should have looked back into older specs to understand
where the ~0u was coming from. Note taken.

> With that, this patch is
>
> Reviewed-by: Ian Romanick <ian.d.roman...@intel.com>

Great. If you feel like nitpicking, you can check the final commit log here:
https://github.com/Igalia/mesa/commit/3784f7b2d5aa739c4abf9aa28874b85bbd1550e5

Thanks a lot!

Eduardo

_______________________________________________
mesa-dev mailing list
mesa-dev@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/mesa-dev