https://bugs.kde.org/show_bug.cgi?id=499934

--- Comment #16 from TheFeelTrain <thefeeltr...@thefeeltrain.com> ---
(In reply to Zamundaaa from comment #14)

> That's not how that works. Neither on Android, nor iOS, nor MacOS, nor on
> any TV, nor on Windows laptops.

I'll admit I have no experience with HDR on mobile devices or macOS, but most
HDR TVs worth buying will try to follow the EOTF if you simply set the
brightness to max. It's not blinding because you're sitting more than 1 meter
away from the screen, and most of your time on a TV isn't spent looking at UI
elements the way it is on a monitor anyway. There are also a lot of TVs that
will run the UI at one brightness and then jump to max brightness once you're
actually watching HDR content. Some TVs and monitors will also lock you out of
controlling the brightness in HDR mode entirely.

And I don't think bringing iOS and Android into the conversation is even
relevant. Nobody cares about the brightness curves of their phone screen.
Anyone who wants a proper HDR viewing experience is not using their phone to
watch movies in the first place.

> Only Windows pretend that it's the case on desktop monitors, but that's not
> intentional but instead a serious design flaw that's kept for backwards
> compatibility. Even there it's not really true because monitors do their own
> things with the image.

The implication here is that you want to willingly break backwards
compatibility, which isn't exactly ideal. Even if you disagree with how desktop
HDR currently works, now you're making it even more confusing to the users who
are already used to it. This is a classic case of "the current standard sucks,
let's make a new, better standard" and now there are two competing standards.

I'm not going to sit here and say you need to copy what Windows does. But it
should at least be intuitive. Like I said before, nobody was confused about how
it worked prior to 6.3. Now you have a lot of people who were confused enough to
post about it. Even if the old implementation was more complicated, or nonsense,
or whatever, it resulted in a smoother, less confusing user experience.

At the very, *very* least there needs to be some explanation in the settings
that setting the brightness below 203 nits means the full range of your monitor
will not be utilized. The worst part of the change is that it's done without the
user's knowledge.

> Just set up the brightness to be what you're comfortable with. All content,
> HDR or not, will be adjusted to match, and there's nothing more to it than
> that.

There is more to it than that. It is not as simple as having one brightness
that applies in every scenario.

I paid extra for a full HDR1000 monitor with local dimming. I don't want my
desktop environment to cut that down to an HDR500 monitor just because I don't
want to be blinded when I browse the web or look at a spreadsheet. The problem
is *when* and *where* the brightness is happening, not the brightness curve as
a whole. I don't mind having the full 1000 nit highlights for things like fire,
explosions, lightning, etc. I don't understand why you think the comfort level
for desktop use maps 1:1 to the comfort level for watching a movie or playing a
game.
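
To put rough numbers on what I mean by "cut down to HDR500": this is only a
sketch of my assumption that HDR content gets scaled by (brightness setting /
203 nits) before it reaches the display. I don't know exactly what KWin does
internally, so treat the names and numbers as illustrative, not as its actual
behavior.

    PQ_REFERENCE_WHITE = 203.0  # nits, the reference white I'm assuming as the scale anchor

    def effective_peak(monitor_peak_nits: float, brightness_nits: float) -> float:
        # Peak luminance actually reachable if all content is scaled by
        # brightness_nits / 203 before being sent to the display.
        # This scaling model is my assumption, not confirmed KWin behavior.
        return monitor_peak_nits * (brightness_nits / PQ_REFERENCE_WHITE)

    print(effective_peak(1000, 100))  # ~493 nits: an HDR1000 panel acting like HDR500
    print(effective_peak(1000, 203))  # 1000 nits: full range only at 203 or above

If that model is even roughly right, a comfortable ~100 nit desktop brightness
silently costs half the panel's headroom, which is exactly the behavior I'm
objecting to.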

