https://bugs.kde.org/show_bug.cgi?id=499934

--- Comment #18 from Zamundaaa <xaver.h...@gmail.com> ---
(In reply to TheFeelTrain from comment #16)
> I'll admit I have no experience with HDR on mobile devices or macOS, but
> most HDR TVs worth buying will try to follow the EOTF if you simply set the
> brightness to max.
No, they don't. The vast majority of TVs can't even go above 200 nits; they
can't "follow the EOTF" without making the image terrible.
TVs do a ton of processing, including tone and gamut mapping, dynamically
changing brightness and so on, to make the image look good within the limited
capabilities of the display.
And as you noticed yourself, they have *one* brightness setting, just like
every other sane system, not multiple settings for different kinds of content.

> The implication here is that you want to willingly break backwards 
> compatibility, which isn't exactly ideal
There is not, and never has been, "backwards" compatibility with Windows
applications.
We make Windows games work as well as possible without making Linux
applications suffer for it. That line will not be crossed, and that's not up
for debate.

> I'm not going to sit here and say you need to copy what Windows does. But it 
> should at least be intuitive
Having users configure many confusing, differently named numbers in each game
just to make it work with one of their displays is never going to be intuitive.
The only way to make it intuitive is for applications to be Wayland native and
to support the APIs we provide, or at least to use the APIs that Windows
provides for this purpose, which are actually not too different, so that Wine
can map them.

> At the very, *very* least there needs to be some explanation in the settings 
> that anything below 203 will result in the full range of your monitor not 
> being utilized.
> I paid extra for a full HDR1000 monitor with local dimming. I don't want my 
> desktop environment to cut that down to an HDR500 monitor just because I 
> don't want to be blinded when I browse the web or look at a spreadsheet. The 
> problem is *when* and *where* the brightness is happening, not the brightness 
> curve as a whole. I don't mind having the full 1000 nit highlights for things 
> like fire, explosions, lightning, etc.
That's not how HDR works. Setting the reference luminance to 100 nits does not
mean that the maximum brightness gets limited to 500 nits. Setting it to 203
nits does not mean the full brightness range of the monitor gets used, and
setting it to 10 doesn't mean it won't be.
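
To put numbers on that, here's an illustrative sketch (hypothetical values,
not KWin code): the reference luminance sets where SDR white and the HDR
reference level sit, and the headroom above it is what's left for highlights.
Lowering the reference increases the headroom, it doesn't cap the peak.

    # Illustrative sketch, not KWin code; the display values are made up.
    def headroom(display_max_nits: float, reference_nits: float) -> float:
        """Ratio of luminance available above the reference level."""
        return display_max_nits / reference_nits

    display_max = 1000.0  # e.g. an HDR1000 panel
    for ref in (100.0, 203.0):
        print(f"reference {ref:.0f} nits -> "
              f"headroom {headroom(display_max, ref):.1f}x, "
              f"peak still {display_max:.0f} nits")
    # reference 100 nits -> headroom 10.0x, peak still 1000 nits
    # reference 203 nits -> headroom 4.9x, peak still 1000 nits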

> But now that makes HDR get blown out
If you're talking about videos, that's something we can definitely still
improve. The tone mapping curve we have right now is usable, but we can do
better.
If you're talking about games, tone mapping them is more difficult because they
rarely provide correct HDR metadata. You should, however, be able to configure
them for whatever display settings you're using. If some game has artificial
limitations that prevent that, then you'll have to either turn the brightness
down or turn its HDR setting off.
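
To illustrate what such a curve does, here's a generic extended-Reinhard
tone mapping sketch (purely illustrative, this is not the curve KWin uses):
content peaks above what the display can do get rolled off smoothly instead
of clipped.

    # Generic extended-Reinhard tone mapping, for illustration only.
    # This is NOT the curve KWin actually uses.
    def tonemap(l_in_nits: float, content_max_nits: float,
                display_max_nits: float) -> float:
        """Roll highlights off so content_max_nits lands exactly on
        display_max_nits; lower luminances are compressed much less."""
        l = l_in_nits / display_max_nits
        l_white = content_max_nits / display_max_nits
        l_out = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
        return l_out * display_max_nits

    # e.g. video mastered to 4000 nits shown on a 1000-nit panel:
    for nits in (100, 500, 1000, 4000):
        print(f"{nits:>5} nits in -> "
              f"{tonemap(nits, 4000, 1000):6.1f} nits out")
    #   100 nits in ->   91.5 nits out
    #   500 nits in ->  343.8 nits out
    #  1000 nits in ->  531.2 nits out
    #  4000 nits in -> 1000.0 nits out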

> Is it possible to "shift" the SDR range along the larger HDR range relative 
> to that SDR brightness setting? Or maybe map SDR content to the HDR range in 
> some way?
No. SDR content gets mapped to the reference luminance, and nothing else is
possible without breaking tons of things.
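
Concretely, that mapping looks roughly like this (a minimal sketch; the plain
2.2 exponent stands in for the full sRGB EOTF):

    # Minimal sketch of "SDR gets mapped to the reference luminance".
    # The plain 2.2 exponent is a simplification of the sRGB EOTF.
    def sdr_to_nits(signal: float, reference_nits: float) -> float:
        """Map a non-linear SDR signal in [0, 1] to absolute luminance;
        signal 1.0 (SDR white) lands exactly on the reference."""
        return (signal ** 2.2) * reference_nits

    print(sdr_to_nits(1.0, 203.0))  # 203.0: SDR white sits at the reference
    print(sdr_to_nits(0.5, 203.0))  # ~44.2: everything below scales with it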
