>> I hope you didn't enable any FOV optimization. I certainly think that is
>> fundamentally broken in hugin and isn't safe to mix into any kind of
>> controlled test.
> No, I didn't optimize FOV.
>
>> And in ill-conditioned mathematical problems like here small differences
>> in roundoff errors can lead to totally different results.
>
> I'm very familiar with that in other optimization problems. I'm still not
> ready to believe that is what is happening here.

One way to see whether the problem really is ill-conditioned in this case is the following: in the file levmar.c there are several calls to splm_SuiteSparseQR(), where the QR decomposition of the Jacobian matrix 'fjac' is performed. In the computed upper triangular matrix R you can look at the diagonal elements: since R is triangular, these are the eigenvalues of R, and their absolute values estimate the square roots of the eigenvalues of the Gramian matrix (transpose(fjac) . fjac), i.e. the singular values of fjac. The ratio of the largest to the smallest diagonal element (in absolute value) then gives an estimate of the condition number. Maybe I will make a test version of PTOptimizer.exe that prints out the condition number of the Jacobian matrix. Then you can compare the condition number for the case where only yaw, pitch and roll are optimized with the case where both rotations and translations are optimized.

> Before I pay too much attention to really understanding what they mean, I
> want to figure out how to get the code displayed on the (details pull-down
> of the) pop-up that gives summary stats for the second strategy. I'd also
> like a way to keep the summary stats from the first strategy around long
> enough to read it.
>
>> Without having checked it, I don't believe that B is met. I believe that
>> ftol is reached (return code 1).
>
> How do you know the return code?

If you want to see the return code, you could do the following: first go to the "Optimize" tab in Hugin and click on "generate script before optimizing". Copy the script into a text file and save it to disk (with the extension .pto). This file can be read by PTOptimizer.exe. Since the latter is a pure console program, it's easier to print out some values or run it in debug mode. E.g. you could just insert a printf(...) after the call of lmdif_sparse().

>> I don't know exactly your idea about computing the partial derivatives,
>> but I think fastPTOptimizer does it quite well.
>> Before, I used the function splm_intern_fdif_jac() (see its description
>> in levmar.c), but then I used another method inside adjust.c that is at
>> least as fast as splm_intern_fdif_jac().
>
> Both of those are finite difference methods. I read through your code
> trying to understand why it might be better than having the levmar code do
> it (why it was worth writing that complicated code). I can see a few ways
> that accidental extra work is avoided in your code (by knowing the
> relationship between images and CPs) that would be harder to avoid in more
> general code. I didn't understand the version you're not using well enough
> to understand if it does some of that extra work.

In fact, I first used the function splm_intern_fdif_jac(). Then I programmed analytical calculation of the derivatives via automatic differentiation. For that I couldn't use splm_intern_fdif_jac(), of course; I had to write a function that calculates only the derivatives that are necessary (the function calculateJacobian()). Then I saw that there's no advantage in using autodiff, but I kept using calculateJacobian() with finite differences. It seems that the optimization runs about two times faster if you use calculateJacobian() instead of splm_intern_fdif_jac().

--
A list of frequently asked questions is available at: http://wiki.panotools.org/Hugin_FAQ

---
You received this message because you are subscribed to the Google Groups "hugin and other free panoramic software" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/hugin-ptx/ae6ac826-66a2-43a7-b29c-bfe52cdd10c1n%40googlegroups.com.
