I just tested my min/max feature in PToptimizer.exe, and it seems there is a 
bug: it doesn't work as expected. It did work in earlier versions of 
"fastptoptimizer", though, so I think I'll be able to fix it soon.

Florian Königstein wrote on Saturday, February 19, 2022 at 10:35:52 UTC+1:

> In my version "fastptoptimizer" of libpano I have added options to define a 
> min and a max limit for each parameter, i.e. the optimizer won't move the 
> parameter beyond these limits. The min/max options define "hard" limits for 
> the parameter.
>
> You can alternatively define "soft" limits by setting a weight for some or 
> all parameters. The sum of squares that the optimizer tries to minimize is 
> then augmented by the weighted squared deviation of each such parameter 
> from its initial value (before optimization), i.e. the deviation is 
> squared and then multiplied by the weight for the parameter. A deviation 
> of the parameter from its initial value is thus penalized by an increase 
> of the sum of squares. The value 1 / sqrt(weight) (the inverse of the 
> square root of the weight) can be interpreted as a standard deviation (or 
> "soft" limit) around the initial parameter value.
>
> However, I have not yet implemented the "hard" and "soft" limits in my 
> Hugin branch (Hugin++) - at the moment they are only accessible via the 
> command line optimizer in "fastptoptimizer" (PToptimizer.exe). 
> PToptimizer.exe expects the filename of a script file. In this file the 
> "hard" limits can be defined in the 'v' lines as in the following example:
>
> v r0 min-10 max10
> v p0 min-20 max-10
> v y0 min30 max40
> v r1 min-20 max20
> ...
>
> In the following example I define weights for parameters.
> v r0 w4
> v p0 w1
> v y0 w0.25
> v r1 w0.01
>
> E.g. in the line
> v y0 w0.25
> the weight is 0.25, i.e. the "standard deviation" (soft limit) is 
> 1 / sqrt(0.25) = 2.
>
> If you have problems optimizing large panoramas, you may also like my 
> "weights for control points" feature that is accessible via Hugin++.
>
> [email protected] wrote on Sunday, February 13, 2022 at 23:12:45 UTC+1:
>
>> This may or may not be helpful -- but in the SkyFill program I found a 
>> version of Levenberg-Marquardt that will fit with constraints on the 
>> parameters. I *think* it is derived from the source code Hugin has 
>> (lmdif.c), so functionally it might be quite close to what is in Hugin. 
>> You can find it at:
>> http://cow.physics.wisc.edu/~craigm/idl/cmpfit.html
>>
>> It's set up so any parameter can have an upper or lower constraint (or 
>> both, or neither). So the user could request that the FOV (if it is a 
>> parameter in the fit) end up with a final fitted value between the lower 
>> and upper constraints. In the little bit of testing I have done, it seems 
>> to help avoid falling into a local minimum during the optimization.
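Per the cmpfit documentation, parameter bounds are declared through the `mp_par` array passed to `mpfit()`. The fragment below follows that documented interface but is not self-contained (it needs `mpfit.h` from cmpfit, and `NPAR`, `myfunc`, `m`, and `p` are placeholders):

```c
/* Fragment based on the cmpfit documentation; not compilable on its
 * own. Constrain parameter 0 (e.g. the FOV) to lie in [40.0, 120.0]: */
mp_par pars[NPAR];
memset(pars, 0, sizeof(pars));
pars[0].limited[0] = 1;      /* lower bound active */
pars[0].limits[0]  = 40.0;   /* lower limit */
pars[0].limited[1] = 1;      /* upper bound active */
pars[0].limits[1]  = 120.0;  /* upper limit */
/* ... then pass pars to mpfit(myfunc, m, NPAR, p, pars, 0, 0, &result); */
```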
>>
>> Jeff
>>
>> On Sunday, February 13, 2022 at 4:11:27 AM UTC-8 [email protected] 
>> wrote:
>>
>>> I think this problem is important; it ought to get some attention, and 
>>> it can be solved.
>>>
>>> But I think your suggested solution would not be effective. Either the 
>>> user would need to know enough when setting the limits that they might 
>>> as well just fix the value and not optimize the variable at all, or 
>>> automatically set limits would do more harm than good in other cases.
>>>
>>> Based entirely on use, without yet even looking at the relevant part of 
>>> the code, I'm guessing both points of a control point pair are projected 
>>> into their theoretical positions and compared there. That approach would 
>>> be inherently unstable and lead to finding wrong solutions that have 
>>> lower "error" than the right solution. That fits the observed behavior, 
>>> hence my guess.
>>>
>>> Projecting one point to its theoretical position and then back into the 
>>> other image (and computing the error there) would be inherently much 
>>> more stable. But given varying zoom levels across the input, that could 
>>> inappropriately weight control points. Effective zoom level can vary 
>>> within an image due to the original lens distortion, as well as between 
>>> images due to differing original zoom. But my guess is that this 
>>> weighting problem could be solved more easily than the original problem 
>>> could be fixed by other approaches.
>>>
>>> I might be 100% wrong about all of this.  I hope to find time to look at 
>>> that part of the code and really understand why optimize is so likely to 
>>> find wrong answers.  But in working on many other optimize problems in many 
>>> other domains, my experience tells me that stricter limits are likely to be 
>>> a bad cover up for a problem that needs to be fixed elsewhere.
>>>
>>>

-- 
A list of frequently asked questions is available at: 
http://wiki.panotools.org/Hugin_FAQ
--- 
You received this message because you are subscribed to the Google Groups 
"hugin and other free panoramic software" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/hugin-ptx/cf21c938-91e6-4e27-b8d6-a45467c9683an%40googlegroups.com.
