Hi Jo,

On 18-11-26 at 03:34, johannes hanika wrote:
> heya,
> On Fri, Nov 23, 2018 at 5:24 PM Aurélien Pierre
> <rese...@aurelienpierre.com> wrote:
>> Hi everyone,
>>
>> my darktable is installed on Ubuntu Budgie (a fork of Gnome 3), but it was
>> the same when I used Gnome Shell.  I have a custom screen ICC profile
>> installed in gnome-color-manager, and loaded in darktable through colord.
>>
>> When I change the ICC profile in Gnome with darktable open, the colors of
>> the darkroom preview change too (no matter whether darktable uses the system
>> display profile or a built-in profile, like Adobe RGB): the contrast and
>> white point of the picture change, and so does the color of the UI.
> what do you mean "if darktable uses"? in which setting? if you set the
> output colour profile in the darkroom module, it'll change the export
> output, not the display profile for darkroom mode.
I'm talking about the display profile, the one set at the bottom of the
lighttable and darkroom views.
> the display profile we get from the system, via interaction with
> colord/xiccd or by querying the xatom variable.
>
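For reference, the xatom path boils down to reading the _ICC_PROFILE property
off the X root window. A minimal sketch (this is not darktable's actual code,
which goes through colord first; error handling trimmed):

#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
  Display *dpy = XOpenDisplay(NULL);
  if(!dpy) return 1;

  /* by convention the ICC blob is attached to the root window of the screen */
  Atom icc_atom = XInternAtom(dpy, "_ICC_PROFILE", True);
  if(icc_atom == None) { XCloseDisplay(dpy); return 1; }

  Atom type;
  int format;
  unsigned long nitems, bytes_left;
  unsigned char *blob = NULL;

  /* ICC blobs are small, so an 8 MiB cap (in 32-bit units) is plenty */
  if(XGetWindowProperty(dpy, DefaultRootWindow(dpy), icc_atom,
                        0, 8 * 1024 * 1024 / 4, False, AnyPropertyType,
                        &type, &format, &nitems, &bytes_left, &blob) == Success
     && blob && format == 8)
  {
    printf("display ICC blob: %lu bytes\n", nitems);
    /* hand this buffer to lcms2 via cmsOpenProfileFromMem() */
    XFree(blob);
  }
  XCloseDisplay(dpy);
  return 0;
}

(compile with -lX11; on a colord/xiccd setup the atom should mirror whatever
profile the desktop has assigned to that output)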
>> So that means that the OS is stacking another color transformation on top of 
>> darktable's one.
> yes, but what you observe above is that we query the icc profile from
> the system and apply it as display profile. the os is stacking the
> video luts (VCGT as you noted below). these are loaded to the GPU as
> part of your system's colour management setup.
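As a side check, whatever ramp is currently loaded in the GPU can be read back
with the XF86VidMode extension. A quick sketch, assuming an Xorg session and
libXxf86vm (again not darktable code, just a diagnostic):

#include <X11/Xlib.h>
#include <X11/extensions/xf86vmode.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
  Display *dpy = XOpenDisplay(NULL);
  if(!dpy) return 1;

  int size = 0, screen = DefaultScreen(dpy);
  if(!XF86VidModeGetGammaRampSize(dpy, screen, &size) || size <= 0)
  { XCloseDisplay(dpy); return 1; }

  /* one contiguous buffer holding the red, green and blue ramps */
  unsigned short *r = malloc(3 * size * sizeof *r);
  if(!r) { XCloseDisplay(dpy); return 1; }
  unsigned short *g = r + size, *b = r + 2 * size;

  if(XF86VidModeGetGammaRamp(dpy, screen, size, r, g, b))
    printf("ramp size %d, red[0]=%u red[last]=%u\n", size, r[0], r[size - 1]);

  free(r);
  XCloseDisplay(dpy);
  return 0;
}

(compile with -lX11 -lXxf86vm; an identity ramp would mean no VCGT is loaded)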
The VCGT curves are fine, but what puzzles me is the white balance change
that comes with them when I switch between D65 and D55 profiles.  I
expect the VCGT to embed gamma curves, but no white point correction.
So, as far as I understand, that could mean the TRC is applied by
darktable and then once again by the system.
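One way to verify would be to dump the VCGT and white point tags of the two
profiles with lcms2, roughly like this (a sketch; I assume matrix/TRC display
profiles and an lcms2 version that decodes the vcgt tag):

#include <lcms2.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
  if(argc < 2) return 1;
  cmsHPROFILE p = cmsOpenProfileFromFile(argv[1], "r");
  if(!p) return 1;

  /* VCGT: three per-channel curves that end up in the GPU gamma LUT */
  cmsToneCurve **vcgt = cmsReadTag(p, cmsSigVcgtTag);
  if(vcgt)
    for(int c = 0; c < 3; c++)
      printf("vcgt channel %d: estimated gamma %f\n", c,
             cmsEstimateGamma(vcgt[c], 0.01));

  /* media white point tag (how it relates to D50/D65 depends on the ICC
     version and the chromatic adaptation tag, so take it with a grain of salt) */
  cmsCIEXYZ *wp = cmsReadTag(p, cmsSigMediaWhitePointTag);
  if(wp) printf("white point XYZ: %f %f %f\n", wp->X, wp->Y, wp->Z);

  cmsCloseProfile(p);
  return 0;
}

If the two profiles show identical VCGT curves but different white points, the
white balance shift I see has to come from the matrix/TRC applied in the
application, not from the video LUT.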
>> From this article (2011), I get that Gnome expects apps to take care of
>> color management themselves:
>>
>> One of the things I tried to deliberately ignore designing colord was 
>> actually flipping pixel values. Colord is a very high level daemon that can 
>> say to apps like Krita or GIMP “Use this profile for this device” rather 
>> than them supplying a 140Mb memory buffer and an operation list. This means 
>> we can do the conversion on the CPU using lcms2 for some programs and using 
>> the GPU using a shader in things that use 3D. By not trying to wrap lcms we 
>> can let the application do the pixel conversion in the right layer in the 
>> right way.
> yes, that is correct.
>
>
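For the record, the "conversion on the CPU using lcms2" part the article
mentions boils down to something like this (a rough sketch, function name and
buffer layout made up for illustration, not what any of these apps actually
ship):

#include <lcms2.h>

/* convert a float RGB buffer from the working profile to the display profile */
void convert_for_display(const float *in, float *out, int width, int height,
                         cmsHPROFILE working, cmsHPROFILE display)
{
  cmsHTRANSFORM xform = cmsCreateTransform(
      working, TYPE_RGB_FLT,   /* source: working space, interleaved float RGB */
      display, TYPE_RGB_FLT,   /* destination: display profile */
      INTENT_PERCEPTUAL,
      cmsFLAGS_NOCACHE);       /* the 1-pixel cache is not thread safe */

  if(xform)
  {
    cmsDoTransform(xform, in, out, (cmsUInt32Number)(width * height));
    cmsDeleteTransform(xform);
  }
}

The GPU path the article alludes to is the same idea, just baked into a 3D LUT
or shader instead of calling cmsDoTransform on each buffer.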
>> Of course, the downside of this is that you have to patch applications to 
>> actually do the right thing. We can make this easier by doing some framework 
>> code for Clutter and Cairo, but sooner or later the application has to know 
>> about color management in one form or another. This is going to be my main 
>> focus for GNOME 3.4; now we have all the framework installed and working and 
>> we can say to application authors “It's already installed, so don't worry 
>> about the additional dependency, just commit my patch and it'll just work”.
>>
>> But gnome-color-manager has no documentation, and even the Gnome color dev 
>> documentation is pretty useless (a lot of "how to", no "what's going on", 
>> but they found time to design a cheesy kindergarten theme).
>>
>> Looking at the GDK pixbuf doc, they don't have tags to explicitly say "hey,
>> that's already color-corrected, so bug off". The Wikipedia entry on Linux
>> color management is as helpful and factual as a marketing director's
>> motivational speech (let's increase the leverage of color management by
>> ensuring the quality of good devices, with a pro-active method to supervise
>> critical elements in a proficient way — sure!).
>>
>> As of now, I have seen no block diagram to describe the full color pipe in 
>> Linux, nor any way to ensure the quality of the transform.
>>
>> From the info I have gathered, the pipe I have put together is as follows:
>>
>> || darktable pipe -> LCMS/(internal cmatrix color correction + TRC)
>>      -> Cairo surface -> GDK pixbuf ||
>>   -> Mutter compositor -> (OS color correction? TRC?)
>>   -> Xorg -> Nvidia/Intel GPU driver -> (color correction? VCGT?) ||
>>   -> HDMI DAC (gamma 2.2) -> screen
>>
>> So my question is: does anyone have any idea of what's going on with color
>> on Linux, or are we stacking ICC on top of shit just to pretend it's
>> color-managed magically, somehow?
> that graph seems to make sense to me. we do apply our custom cmatrix +
> TRC in our own code because traditionally it was somewhat faster than
> using lcms2. also there was the issue of clipping at [0,1] (which was
> resolved in lcms2 as well at some point). in addition to that there's
> the VCGT which has to be set up. also note that we don't make up the
> cmatrix ourselves but it is supplied to us by the desktop. i'd like to
> stress that icc is really just a storage format: it's some blob
> encoding from which we grab a matrix and shaper curve.
>
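Right, that matches my mental model. For reference, grabbing that matrix and
shaper curve from the blob with lcms2 looks roughly like this (just a sketch,
not darktable's actual cmatrix extraction):

#include <lcms2.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
  if(argc < 2) return 1;
  cmsHPROFILE p = cmsOpenProfileFromFile(argv[1], "r");
  if(!p) return 1;

  /* the colorant tags give the columns of the RGB -> XYZ matrix */
  cmsCIEXYZ *r = cmsReadTag(p, cmsSigRedColorantTag);
  cmsCIEXYZ *g = cmsReadTag(p, cmsSigGreenColorantTag);
  cmsCIEXYZ *b = cmsReadTag(p, cmsSigBlueColorantTag);
  cmsToneCurve *trc = cmsReadTag(p, cmsSigRedTRCTag);

  if(r && g && b)
    printf("RGB->XYZ:\n %f %f %f\n %f %f %f\n %f %f %f\n",
           r->X, g->X, b->X,  r->Y, g->Y, b->Y,  r->Z, g->Z, b->Z);

  if(trc)
    printf("red TRC, estimated gamma %f\n", cmsEstimateGamma(trc, 0.01));

  cmsCloseProfile(p);
  return 0;
}

(this only works for matrix/shaper profiles, of course; LUT-based profiles
don't carry those tags)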
> HDMI still uses a hardcoded gamma of 2.2? i didn't know that.
That is what I have been told; I have no other proof.
> and yes, it takes some diligence to set up all the components that are
> involved in the pipeline. we have "darktable-cmstest" to perform a few
> high level sanity checks because of that. also be aware that some
> monitors supply edid profiles which might make it into colord but are
> in fact completely bogus. gnome colour manager is maintaining a list
> of such profiles if i remember correctly.

Oh God, that's a lot of fun ahead.

Thanks for your answer,

Aurélien.

>
> cheers,
>  jo
>
>
>> Thanks,
>>
>> Aurélien.
>>
>>
___________________________________________________________________________
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org
