hi,

i was playing with the noise reduction presets again and tried the large
neighbourhood search window. on my shots i could very rarely spot a
difference at all when increasing K above 7, and even less so going
above 10. the image you posted earlier did show quite a substantial
improvement, however. would you be able to share that image so i can
evaluate on it? maybe i just haven't found the right test image yet,
or maybe it's camera dependent?

(and yes, automatic and adaptive would be better but if we can ship a
simple slider that can improve matters, maybe we should)
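(for reference, the runtime argument downthread is just search-window
area; a quick back-of-envelope sketch in python, nothing
darktable-specific:)

```python
# nlmeans runtime scales with the search window area (2K+1)^2,
# patch size and image size held constant.
def search_area(k):
    return (2 * k + 1) ** 2

for k in (7, 10, 25):
    print(k, search_area(k), round(search_area(k) / search_area(7), 2))
# K=10 is only ~2x the cost of K=7; K=25 is ~11.56x
```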

cheers,
 jo



On Mon, Jan 29, 2018 at 2:05 AM, rawfiner <rawfi...@gmail.com> wrote:
> Hi
>
> Yes, the patch size is set to 1 from the GUI, so it is not a bilateral
> filter, and I guess it corresponds to a patch window size of 3x3 in the
> code.
> The runtime difference is close to the expected quadratic slowdown:
> 1.460 s (8.379 s CPU) for K=7 versus 12.794 s (85.972 s CPU) for K=25,
> which is about a 10.26x slowdown in CPU time.
>
> If you want to make up your mind about it, I have pushed a branch here
> that integrates the K parameter into the GUI:
> https://github.com/rawfiner/darktable.git
> The branch is denoise-profile-GUI-K
>
> I think it may be worth seeing whether an automated approach to choosing
> K could work, so that the parameter does not need to be exposed in the GUI.
> I may try to implement the approach of Kervrann and Boulanger (the
> reference from the darktable blog post) to see how it performs.
>
> cheers,
> rawfiner
>
>
> 2018-01-27 13:50 GMT+01:00 johannes hanika <hana...@gmail.com>:
>>
>> heya,
>>
>> thanks for the reference! interesting interpretation of how the blotches
>> form. not sure i'm entirely convinced by that argument.
>> your image does look convincing though. let me get this right: you
>> ran with radius 1, which means patch window size 3x3? not 1x1, which
>> would effectively be a bilateral filter?
>>
>> also, what was the run time difference? is it near the expected
>> quadratic slowdown from K=7 (i.e. 15x15) to K=25 (51x51), so about
>> 11.56x slower with the large window size? (test with darktable -d perf)
>>
>> since nlmeans isn't the fastest thing, even with this coalesced way of
>> implementing it, we should certainly keep an eye on this.
>>
>> that being said, if we can often get much better results we should
>> totally expose this in the gui, maybe with a big warning that it
>> severely impacts speed.
>>
>> cheers,
>>  jo
>>
>> On Sat, Jan 27, 2018 at 7:34 AM, rawfiner <rawfi...@gmail.com> wrote:
>> > Thank you for your answer
>> > I fully agree that the GUI should not become overcomplicated.
>> >
>> > As far as I understand, the pixels within a small zone may suffer from
>> > correlated noise, and there is a risk of noise-to-noise matching.
>> > That's why this paper suggests not taking pixels that are too close to
>> > the zone we are correcting, but taking them a little farther away (see
>> > the caption of Figure 2 for a quick explanation):
>> >
>> >
>> > https://pdfs.semanticscholar.org/c458/71830cf535ebe6c2b7656f6a205033761fc0.pdf
>> > (in case you ask, unfortunately there is a patent associated with this
>> > approach, so we cannot implement it)
>> >
>> > Increasing the neighborhood parameter proportionally reduces the
>> > problem of correlation between surrounding pixels, and decreases the
>> > size of the visible spots.
>> > See for example the two attached pictures: one with size 1, force 1,
>> > and K=7, and the other with size 1, force 1, and K=25.
>> >
>> > I think the best would probably be to adapt K automatically: it would
>> > not affect the GUI, and we may have different levels of noise in
>> > different parts of an image.
>> > In this post
>> > (https://www.darktable.org/2012/12/profiling-sensor-and-photon-noise/),
>> > this paper is cited:
>> >
>> > [4] Charles Kervrann and Jerome Boulanger: Optimal spatial adaptation
>> > for patch-based image denoising. IEEE Trans. Image Process., vol. 15,
>> > no. 10, 2006
>> >
>> > As far as I understand, it gives a way to choose an adapted window
>> > size for each pixel, but I don't see anything related to that in the
>> > code.
>> >
>> > Maybe this paper is related to the TODOs in the code?
>> >
>> > Was it planned to implement such a variable window approach?
>> >
>> > Or, if it is already implemented, could you point me to where?
>> >
>> > Thank you
>> >
>> > rawfiner
>> >
>> >
>> >
>> >
>> > 2018-01-26 9:05 GMT+01:00 johannes hanika <hana...@gmail.com>:
>> >>
>> >> hi,
>> >>
>> >> if you want, absolutely do play around with K. in my tests it did not
>> >> lead to any better denoising. to my surprise a larger K often led to
>> >> worse results (for some reason often the relevance of discovered
>> >> patches decreases with distance from the current point). that's why K
>> >> is not exposed in the gui, no need for another irrelevant and cryptic
>> >> parameter. if you find a compelling case where this indeed leads to
>> >> better denoising we could rethink that.
>> >>
>> >> in general NLM is a 0-th order denoising scheme, meaning the prior is
>> >> piecewise constant (you claim the pixels you find are trying to
>> >> express /the same/ mean, so you average them). if you let that
>> >> algorithm do what it would really like to, it'll create unpleasant
>> >> blotches of constant areas. so for best results we need to tone it
>> >> down one way or another.
>> >>
>> >> cheers,
>> >>  jo
>> >>
>> >>
>> >>
>> >> On Fri, Jan 26, 2018 at 7:36 AM, rawfiner <rawfi...@gmail.com> wrote:
>> >> > Hi
>> >> >
>> >> > I am surprised to see that we cannot control the neighborhood
>> >> > parameter for the NLM algorithm (neither for denoise (non-local
>> >> > means) nor for denoise (profiled)) from the GUI.
>> >> > I see in the code (denoiseprofile.c) this TODO that I don't
>> >> > understand:
>> >> > "// TODO: fixed K to use adaptive size trading variance and bias!"
>> >> > And just a few lines after that: "// TODO: adaptive K tests here!"
>> >> > (K is the neighborhood parameter of the NLM algorithm.)
>> >> >
>> >> > In practice, I think that being able to change the neighborhood
>> >> > parameter allows better noise reduction for a given image.
>> >> > For example, choosing a bigger K reduces the spotted aspect that one
>> >> > can get on high-ISO images.
>> >> >
>> >> > Of course, increasing K increases computation time, but I think we
>> >> > could find an acceptable range that would still be useful.
>> >> >
>> >> >
>> >> > Is there any reason for not letting the user control the
>> >> > neighborhood parameter in the GUI?
>> >> > Also, do you understand the TODOs?
>> >> > I feel that we would probably get better denoising by fixing these,
>> >> > but I don't understand them.
>> >> >
>> >> > I can spend some time on these TODOs, or on adding the K parameter
>> >> > to the interface if you think it is worth it (I think so, but it is
>> >> > only my personal opinion). However, I have to understand what the
>> >> > TODOs mean first.
>> >> >
>> >> > Thank you for your help
>> >> >
>> >> > rawfiner
>> >> >
>> >> >
>> >> >
>> >> > ___________________________________________________________________________
>> >> > darktable developer mailing list
>> >> > to unsubscribe send a mail to
>> >> > darktable-dev+unsubscr...@lists.darktable.org
>> >>
>> >>
>> >>
>> >
>>
>>
>
