On 04/06/2018 06:41 PM, Michel Dänzer wrote:
On 2018-04-06 06:18 PM, Mario Kleiner wrote:
On Fri, Apr 6, 2018 at 12:01 PM, Michel Dänzer <mic...@daenzer.net> wrote:
On 2018-03-27 07:53 PM, Daniel Stone wrote:
On 12 March 2018 at 20:45, Mario Kleiner <mario.kleiner...@gmail.com> wrote:
We need to distinguish whether the backing pixmap of a window is
XRGB2101010 or XBGR2101010, as different GPU hardware supports
different formats. NVIDIA hardware prefers XBGR, whereas AMD and
Intel are happy with XRGB.

We use the red channel mask of the visual to distinguish the two
at depth 30, but because we can't easily get the associated
visual of a Pixmap, we use the visual of the X screen's root
window instead as a proxy.

This fixes desktop composition of color depth 30 windows
when the X11 compositor uses EGL.
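
For illustration, the heuristic boils down to something like the
following sketch. The Xlib calls are standard, but the enum is a
hypothetical stand-in for Mesa's __DRI_IMAGE_FORMAT_XRGB2101010 /
__DRI_IMAGE_FORMAT_XBGR2101010 tokens, and this is a sketch of the
idea, not the actual patch:

#include <X11/Xlib.h>
#include <X11/Xutil.h>

/* Hypothetical stand-ins for Mesa's __DRI_IMAGE_FORMAT_* tokens,
 * just to keep the sketch self-contained. */
enum depth30_format {
    DEPTH30_UNKNOWN,
    DEPTH30_XRGB2101010,  /* red in bits 29..20 */
    DEPTH30_XBGR2101010,  /* red in bits 9..0   */
};

static enum depth30_format
guess_depth30_format(Display *dpy, int screen)
{
    XVisualInfo tmpl, *vi;
    int n;
    enum depth30_format fmt = DEPTH30_UNKNOWN;

    /* The pixmap itself carries no format, only a depth, so look at
     * the red channel mask of the root window's default visual and
     * assume depth 30 pixmaps on this screen share its ordering. */
    tmpl.visualid = XVisualIDFromVisual(DefaultVisual(dpy, screen));
    vi = XGetVisualInfo(dpy, VisualIDMask, &tmpl, &n);
    if (!vi)
        return DEPTH30_UNKNOWN;

    if (vi->depth == 30 && vi->red_mask == 0x3ff00000)
        fmt = DEPTH30_XRGB2101010;
    else if (vi->depth == 30 && vi->red_mask == 0x000003ff)
        fmt = DEPTH30_XBGR2101010;

    XFree(vi);
    return fmt;
}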

I have no reason to doubt your testing, so this patch is:
Acked-by: Daniel Stone <dani...@collabora.com>

But it does rather fill me with trepidation, given that X11 Pixmaps
are supposed to be a dumb 'bag of bits', doing nothing more than
providing the same number and size of channels as the actual client
data for the Visual associated with the Window.

As far as X11 is concerned, the number of channels and their sizes don't
even matter; a pixmap is simply a container for an unsigned integer of n
bits (where n is the pixmap depth) per pixel, with no inherent meaning
attached to those values.

That said, I'm not sure this is true for EGL as well. But even if it
isn't, there would have to be another mechanism to determine the format,
e.g. a config associated with the EGL pixmap. The pixmap doesn't even
necessarily have the same depth as the root window, so using the
latter's visual doesn't make much sense.
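
For illustration only, if a config associated with the pixmap were
available, the channel layout could in principle be read back from it
with standard EGL queries; a minimal sketch (the helper name is made
up, and whether such a config is at hand at import time is exactly
the open question further down this thread):

#include <EGL/egl.h>
#include <stdio.h>

/* Sketch: if a config associated with the pixmap were available, the
 * channel sizes could be read from it, and EGL_NATIVE_VISUAL_ID would
 * point back at an X visual whose channel masks give the ordering. */
static void
describe_config(EGLDisplay dpy, EGLConfig cfg)
{
    EGLint r, g, b, visual_id;

    eglGetConfigAttrib(dpy, cfg, EGL_RED_SIZE, &r);
    eglGetConfigAttrib(dpy, cfg, EGL_GREEN_SIZE, &g);
    eglGetConfigAttrib(dpy, cfg, EGL_BLUE_SIZE, &b);
    eglGetConfigAttrib(dpy, cfg, EGL_NATIVE_VISUAL_ID, &visual_id);

    /* r == g == b == 10 identifies a depth 30 config; the X visual
     * behind visual_id then distinguishes XRGB from XBGR ordering. */
    printf("R%d G%d B%d, native visual 0x%x\n",
           r, g, b, (unsigned)visual_id);
}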

Hi Michel. I thought that with this patch I was implementing what you
proposed earlier as a heuristic for getting around the "pixmaps
don't have an inherent format, only a depth" problem?

Do you have a pointer to that discussion?


Ok, apologies, I think I was just taking your comment too far as inspiration. The best I can find in my inbox at the moment is this message of yours from 24 November 2017, 10:44 AM, in the mesa-dev thread "Re: [Mesa-dev] 10-bit Mesa/Gallium support":

"Apologies for the badly formatted followup before, let's try that again:

On 2017-11-23 07:31 PM, Mario Kleiner wrote:
>
> 3. In principle the clean solution for nouveau would be to upgrade the
> ddx to drmAddFB2 ioctl, and use xbgr2101010 scanout to support
> everything back to nv50+, but everything we have in X or Wayland is
> meant for xrgb2101010 not xbgr2101010. And we run into ambiguities of
> what, e.g., a depth 30 pixmap means in some extensions like
> glx_texture_from_pixmap.

A pixmap itself never has a format per se, it's just a container for an
n-bit integer value per pixel (where n is the pixmap depth). A
compositor using GLX_EXT_texture_from_pixmap has to determine the format
from the corresponding window's visual.


--
Earthling Michel Dänzer               |               http://www.amd.com
Libre software enthusiast             |             Mesa and X developer
"

There's nothing in there that suggests my root window solution.
I guess my thinking was: given that we cannot get the visual of the window corresponding to the pixmap, let's find some window that is a good enough proxy for onscreen windows with associated depth 30 pixmaps on the same X screen.


My (possibly inaccurate) understanding is that one can only create a
depth 30 pixmap if the X screen runs at depth >= 30. The server only
exposes depth 30 as a supported pixmap format (per xdpyinfo) if
DefaultDepth 30 is selected in xorg.conf, whereas other depths like
1, 4, 8, 15, 16, 24 and 32 are always supported at default depth 24.
A quick way to test this is sketched below.
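
For illustration, an untested sketch of such a check: ask the server
for a depth 30 pixmap while running at DefaultDepth 24 and see whether
it answers with BadValue (plain Xlib; the helper names are made up):

#include <X11/Xlib.h>
#include <stdio.h>

static int got_error;

/* Swallow the protocol error; a BadValue here means the server
 * rejects depth 30 pixmaps at the current default depth. */
static int
error_handler(Display *dpy, XErrorEvent *ev)
{
    (void)dpy;
    got_error = (ev->error_code == BadValue);
    return 0;
}

int
main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    XSetErrorHandler(error_handler);
    Pixmap p = XCreatePixmap(dpy, DefaultRootWindow(dpy), 64, 64, 30);
    XSync(dpy, False); /* force the error, if any, to arrive now */

    printf("depth 30 pixmap %s\n", got_error ? "rejected" : "created");
    if (!got_error)
        XFreePixmap(dpy, p);
    XCloseDisplay(dpy);
    return 0;
}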

That sounds like an X server issue. Just like depth 32, there's no
fundamental reason a pixmap couldn't have depth 30 despite the screen
depth being lower.

Out of curiosity, can you share the output of xdpyinfo with nouveau at
depth 30?


I'll have to do that later at the machine. But unless I misremember that as well, xdpyinfo always gives me this if I run at DefaultDepth 24:

"number of supported pixmap formats:    7
supported pixmap formats:
    depth 1, bits_per_pixel 1, scanline_pad 32
    depth 4, bits_per_pixel 8, scanline_pad 32
    depth 8, bits_per_pixel 8, scanline_pad 32
    depth 15, bits_per_pixel 16, scanline_pad 32
    depth 16, bits_per_pixel 16, scanline_pad 32
    depth 24, bits_per_pixel 32, scanline_pad 32
    depth 32, bits_per_pixel 32, scanline_pad 32
keycode range:    minimum 8, maximum 255
"

At least I don't remember ever seeing a "depth 30, ..." line on any driver+GPU combo when running X at default depth 24.


Iff depth 30 is selected, then the root window has depth 30 and a
depth 30 visual. If each driver only exports one channel ordering for
depth 30, then the channel ordering of any pixmap's associated
drawable should be the same as that of the root window.

Repeat after me: "X11 pixmaps don't have a format." :) They're just bags
of bits.


I understand that :). Unfortunately it doesn't solve our problem: the pixmap's associated window does have some format, and we need to know it in order to sample correctly from the buffer.


Does __DRI_IMAGE_FORMAT_ARGB8888 work for depth 30 as well, by any chance?



I'd have to try later. But at least using __DRI_IMAGE_FORMAT_XRGB2101010 on nouveau, i.e. the wrong channel ordering, gives swapped red and blue color channels, so I wouldn't expect non-funky results for ARGB8888. The swap follows directly from the two bit layouts, as the sketch below illustrates.
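
For illustration, a tiny self-contained sketch (hypothetical helper,
not driver code) of why sampling an XBGR2101010 buffer with
XRGB2101010 ordering trades the red and blue channels:

#include <stdint.h>
#include <stdio.h>

/* The 10-bit red and blue fields trade places between the two
 * layouts; green and the unused top 2 bits stay put. */
static uint32_t
xbgr_to_xrgb_2101010(uint32_t px)
{
    uint32_t r = (px >>  0) & 0x3ff;  /* XBGR: red in bits 0..9    */
    uint32_t g = (px >> 10) & 0x3ff;  /* green in bits 10..19      */
    uint32_t b = (px >> 20) & 0x3ff;  /* XBGR: blue in bits 20..29 */

    return (r << 20) | (g << 10) | (b << 0);  /* XRGB ordering */
}

int
main(void)
{
    uint32_t pure_red_xbgr = 0x000003ff;
    printf("0x%08x -> 0x%08x\n", (unsigned)pure_red_xbgr,
           (unsigned)xbgr_to_xrgb_2101010(pure_red_xbgr));
    /* Reading the XBGR value through XRGB glasses without this
     * conversion would show pure red as pure blue: exactly the
     * swapped-channel artifact described above. */
    return 0;
}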

The basic problem with EGL based compositing is that for eglCreateImageKHR() all we have is the EGLDisplay and EGLContext used for importing an image resource, plus the handle of the pixmap to import. I couldn't find an xcb protocol request that would allow finding the associated window of a pixmap, or anything else associated with it that does have a visual (see the sketch below), so I don't know how we could find out the proper format without some trickery or heuristics in the driver-independent code?
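
For illustration, a minimal xcb sketch of that limitation: GetGeometry
is about all the core protocol offers for a pixmap, and its reply
carries a depth and the root window, but no visual (the helper name is
made up):

#include <xcb/xcb.h>
#include <stdio.h>
#include <stdlib.h>

/* Sketch: everything the core protocol tells us about a pixmap.
 * The reply has depth, root window and dimensions, but no visual,
 * hence no channel ordering. */
static void
query_pixmap(xcb_connection_t *c, xcb_pixmap_t pixmap)
{
    xcb_get_geometry_reply_t *geo =
        xcb_get_geometry_reply(c, xcb_get_geometry(c, pixmap), NULL);
    if (!geo)
        return;

    printf("depth %u, root 0x%x, %ux%u\n",
           (unsigned)geo->depth, (unsigned)geo->root,
           (unsigned)geo->width, (unsigned)geo->height);
    free(geo);
}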

-mario