On Friday, 20 March 2026 15:32:37 Central European Standard Time Michel Dänzer 
wrote:
> On 3/19/26 13:28, Nicolas Frattaroli wrote:
> > This series adds a new "link bpc" DRM property. It reflects the display
> > link's actual achieved output bits per component, considering any
> > degradation of the bit depth done by drivers for bandwidth or other
> > reasons. The property's value is updated during an atomic commit, which
> > is also when it fires an uevent if it changed to let userspace know.
> > 
> > There's a weston implementation at [1] which makes use of this new
> > property to warn when a user's requested bpc could not be reached.
> > 
> > [1]: https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1850
> 
> I see no description of a real-world use case, either in this series
> or in the weston MR, beyond logging a message when the "link bpc" &
> "max bpc" property values don't match. They are not expected to match
> in general, so I have a hard time seeing the usefulness of that.

Hello,

These are valid concerns. The problem being addressed is letting
userspace detect whether the link has degraded due to, say, a
sketchy cable.

This patch started out as a method of forcing the output link's BPC
value to a certain value, but this is not desirable. The max bpc
property is already used to restrict the link's bpc due to sketchy
hardware that advertises a higher max bpc than it can actually
achieve.

This adds the other side of the equation, where userspace isn't
necessarily keen on blindly accepting the combination of output
link parameters the kernel degraded to. This allows userspace to
detect that an explicitly chosen value it tried did not work, and
try again with a different color format/VRR/bpc/etc.
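To sketch what that retry logic could look like on the client side
(the enum, struct and function names here are made up for
illustration; the fallback policy is just one example of a choice
only userspace can make, not anything mandated by the series):

```c
#include <assert.h>
#include <stdbool.h>

/* Illustrative output formats a client might choose between. */
enum out_format { FMT_RGB, FMT_YUV420 };

struct out_config {
	enum out_format format;
	unsigned int bpc;	/* bpc the client asked for */
};

/*
 * After an atomic commit, the client reads back the achieved
 * "link bpc" and decides whether a different configuration is
 * worth another attempt. Returns true if *next should be tried.
 */
static bool pick_next_config(struct out_config cur,
			     unsigned int link_bpc,
			     bool content_is_yuv420,
			     struct out_config *next)
{
	if (link_bpc >= cur.bpc)
		return false;	/* the link achieved what we asked for */

	if (cur.format == FMT_RGB && content_is_yuv420) {
		/*
		 * Keep the full bit depth by matching the content's
		 * subsampling instead of accepting a silent drop to
		 * 8-bit RGB.
		 */
		next->format = FMT_YUV420;
		next->bpc = cur.bpc;
		return true;
	}

	if (cur.bpc > 8) {
		/* Last resort: explicitly accept a lower depth. */
		next->format = cur.format;
		next->bpc = 8;
		return true;
	}

	return false;
}
```

The point being that the kernel can't make this call: only the
client knows the framebuffer contents, so only the client can rank
"YUV 4:2:0 at 10 bpc" above "RGB at 8 bpc" for this workload.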

A particular real-world use case is video playback. When playing
back YUV 4:2:0 10-bit video content full-screen, having the RGB
10-bit output degrade to YUV 4:2:0 10-bit rather than to RGB 8-bit
is preferable. However, this is a tradeoff only userspace knows to
make; the kernel doesn't necessarily know that the RGB 10-bit
framebuffer it has been handed is really just a video player's
rendition of YUV 4:2:0 10-bit content. As for the property that
lets userspace actually set the output color format, that's a
separate series of mine.

I agree that the weston implementation isn't a great showcase,
but it's actually supposed to compare link bpc with an explicitly
set max bpc config value, not the property value. The config value
exists to request a certain bpc.

> Moreover, there's no description of what exactly the "link bpc" property
> value means, e.g. vs things like DSC or dithering, or how a compositor / 
> user would determine which value they need / want under given circumstances.

I agree that I should've expanded on this after splitting it out of the
HDMI patch. It's the output BPC as HDMI understands it, which means DSC
is not a factor. I don't know of any display protocols that dither at
the protocol level; I only know that some monitors dither internally,
which isn't something that can be detected.

> In summary, I'm skeptical that this will be useful in practice in the
> current form. I do see potential for spurious bug reports based on the
> "link bpc" property having the "wrong" value though.

Kind regards,
Nicolas Frattaroli

