On 18/04/2024 14:29, Roman Arzumanyan wrote:
Hi Diego,
Asking for my own education.
As far as you've explained, the 8-bit > 10-bit conversion happens within the
driver; that's understandable.
But how does it influence the output? Does it perform some sort of
proprietary SDR > HDR conversion under the hood that maps the ranges?
What's going to be the user-observable difference between these 2 scenarios?
1) 8-bit input > HEVC 8-bit profile > 8-bit HEVC output
2) 8-bit input > 10-bit up-conversion > HEVC 10-bit profile > 10-bit HEVC output
Better visual quality? Smaller compressed file size?
In other words, what's the purpose of this feature other than enabling a new
Video Codec SDK capability?
Video codecs tend to be more efficient with 10 bit, even if it's just 8-bit
content that's been up-converted to 10 bit.
I.e. yes, it can (though I'm not sure it's a given) produce smaller and/or
higher-quality output for the same input.
As for the exact reason, I can't explain it, but it's a well-known concept.
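
In case it helps to reproduce the comparison, here's a rough libavcodec sketch
of the two scenarios (just an illustration, not the patch itself). It assumes
hevc_nvenc with its usual "profile" private option and an NVENC-capable GPU,
and that for scenario 2 the caller up-converts the 8-bit frames to P010 (e.g.
via swscale) before submitting them; the feature being discussed would let the
driver do that step instead.

/* Sketch: open hevc_nvenc for the two scenarios from the question. */
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
#include <libavutil/pixfmt.h>

static AVCodecContext *open_hevc_nvenc(int width, int height, int ten_bit)
{
    const AVCodec *enc = avcodec_find_encoder_by_name("hevc_nvenc");
    if (!enc)
        return NULL;

    AVCodecContext *ctx = avcodec_alloc_context3(enc);
    if (!ctx)
        return NULL;

    ctx->width     = width;
    ctx->height    = height;
    ctx->time_base = (AVRational){1, 30};

    if (ten_bit) {
        /* Scenario 2: 10-bit input surfaces + main10 profile -> 10-bit HEVC. */
        ctx->pix_fmt = AV_PIX_FMT_P010LE;
        av_opt_set(ctx->priv_data, "profile", "main10", 0);
    } else {
        /* Scenario 1: 8-bit input surfaces + main profile -> 8-bit HEVC. */
        ctx->pix_fmt = AV_PIX_FMT_NV12;
        av_opt_set(ctx->priv_data, "profile", "main", 0);
    }

    if (avcodec_open2(ctx, enc, NULL) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    return ctx;
}

Encoding the same source through both contexts at the same rate-control
settings and comparing file size or quality metrics should show the effect
described above.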