On 08/09/2011 09:15, Phil Steitz wrote:
On 9/7/11 6:34 AM, Gilles Sadowski wrote:
Hi.
In class "AbstractLeastSquaresOptimizer" (in "o.a.c.m.optimization.general"),
the method "getCovariances()" uses "LUDecompositionImpl" to compute the
inverse of a matrix.
In my application, this leads to a "SingularMatrixException". If I change
"LUDecompositionImpl" to "QRDecompositionImpl", no exception is raised.
Also, keeping "LUDecompositionImpl" but passing a much lower singularity
threshold does not raise the exception either.
Thus, I wonder whether there was a reason for using "LU", and if not,
whether I could change the decomposition solver to "QR" (as this is a
cleaner solution than guessing a good value for the threshold).
There is no reason for LU decomposition, and QR decomposition is
known to be more stable. So I would also consider switching to this
algorithm to be a cleaner solution.
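
For reference, a minimal sketch of the two calls being compared, using the
Commons Math 2.x linear API (the actual body of getCovariances() is not
quoted here; the helper and class names below are just for illustration):

import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.LUDecompositionImpl;
import org.apache.commons.math.linear.QRDecompositionImpl;
import org.apache.commons.math.linear.RealMatrix;

public class CovarianceInversionSketch {

    // Inversion as getCovariances() currently performs it: LU with the
    // default singularity threshold.  This is the call that produced the
    // SingularMatrixException reported above.
    static RealMatrix invertWithLU(RealMatrix jTj) {
        return new LUDecompositionImpl(jTj).getSolver().getInverse();
    }

    // The proposed change: the same inversion, but through a QR decomposition.
    static RealMatrix invertWithQR(RealMatrix jTj) {
        return new QRDecompositionImpl(jTj).getSolver().getInverse();
    }

    public static void main(String[] args) {
        // A small well-conditioned matrix; both decompositions invert it.
        RealMatrix m = new Array2DRowRealMatrix(new double[][] {
            { 4.0, 1.0 },
            { 1.0, 3.0 }
        });
        System.out.println(invertWithLU(m));
        System.out.println(invertWithQR(m));
    }
}

The switch under discussion would essentially replace the first call with the
second.
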
Fine. I'll open a JIRA issue.
A unit test "testNonInvertible" in "LevenbergMarquardtOptimizerTest" fails
with the change to "QRDecomposition" because no "SingularMatrixException"
is raised anymore.
What was the purpose of that test?
The purpose was to check that impossible problems were detected properly.
My question should have been clearer: Was the previous behaviour correct
(i.e. an exception *must* be raised somehow)?
Yes. The getCovariances method is trying to invert a singular
matrix. SingularMatrixException is appropriate there. The problem
is that the QR decomp method has no threshold defined for
singularity, doing an exact check against 0 on the elements of
rDiag. Many (most?) singular matrices, or matrices close enough to
singular that inversion gives meaningless results, will not produce
exact 0's there, due to rounding in the computations.
That's why the LU impl defines a threshold. For the (exactly)
singular matrix in the test case above, LU does not get exact 0's
either on pivots, but the default singularity threshold (correctly)
detects the singularity. IMO, QR should do the same thing;
otherwise we will happily return meaningless results, as happens with
the test case above. There is no question here, since the matrix is
exactly singular. In cases where matrices are merely near-singular, we are
doing users a favor by throwing, since inversion results are going
to be unstable.
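
To make the contrast concrete, here is a small illustration of the
thresholded check versus the exact comparison, using a toy singular matrix
(not the one from testNonInvertible); which booleans actually get printed
depends on how rounding plays out in each decomposition:

import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.LUDecompositionImpl;
import org.apache.commons.math.linear.QRDecompositionImpl;
import org.apache.commons.math.linear.RealMatrix;

public class SingularityDetectionSketch {
    public static void main(String[] args) {
        // An exactly singular matrix: the second row is twice the first.
        RealMatrix singular = new Array2DRowRealMatrix(new double[][] {
            { 1.0, 2.0 },
            { 2.0, 4.0 }
        });

        // LU with the default singularity threshold: the pivots need not come
        // out as exact 0's, but sufficiently small pivots are flagged as singular.
        boolean luDefault =
            new LUDecompositionImpl(singular).getSolver().isNonSingular();

        // LU with a tiny explicit threshold behaves much like an exact check,
        // which is why lowering the threshold made the exception disappear.
        boolean luTiny =
            new LUDecompositionImpl(singular, 1.0e-30).getSolver().isNonSingular();

        // QR compares the diagonal of R against 0 exactly, so rounding can
        // make a singular matrix look invertible to it.
        boolean qr =
            new QRDecompositionImpl(singular).getSolver().isNonSingular();

        System.out.println("LU, default threshold, non-singular? " + luDefault);
        System.out.println("LU, 1e-30 threshold,   non-singular? " + luTiny);
        System.out.println("QR, exact zero check,  non-singular? " + qr);
    }
}
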
A possibly more robust option here is to use Cholesky decomposition,
which is known to be stable for symmetric positive definite
matrices, and the covariance matrix being inverted here should be
one. The exceptions thrown will be different, but they will give
more specific information about what is wrong with the covariance
matrix.
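
A hedged sketch of what that could look like with the 2.x API
(invertWithCholesky is a made-up helper name, not an existing method):

import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.CholeskyDecompositionImpl;
import org.apache.commons.math.linear.NotPositiveDefiniteMatrixException;
import org.apache.commons.math.linear.NotSymmetricMatrixException;
import org.apache.commons.math.linear.RealMatrix;

public class CholeskyInversionSketch {

    // Inverts a matrix that is expected to be symmetric positive definite.
    // Instead of a generic SingularMatrixException, the caller is told
    // whether symmetry or positive definiteness was violated.
    static RealMatrix invertWithCholesky(RealMatrix m)
            throws NotSymmetricMatrixException, NotPositiveDefiniteMatrixException {
        return new CholeskyDecompositionImpl(m).getSolver().getInverse();
    }

    public static void main(String[] args) {
        RealMatrix spd = new Array2DRowRealMatrix(new double[][] {
            { 4.0, 1.0 },
            { 1.0, 3.0 }
        });
        try {
            System.out.println(invertWithCholesky(spd));
        } catch (NotSymmetricMatrixException e) {
            System.out.println("not symmetric: " + e.getMessage());
        } catch (NotPositiveDefiniteMatrixException e) {
            System.out.println("not positive definite: " + e.getMessage());
        }
    }
}
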
Luc - are there other reasons that QR would be better for cov
matrices? I would have to play with a bunch of examples, but I
suspect with the defaults, Cholesky may do the best job detecting
singular problems.
I'm not sure about Cholesky, but I have always thought that QR, at least,
was better than LU for near-singular matrices, with only a factor of 2
overhead in the number of operations (though operation count is not the
main bottleneck on modern computers; cache behavior matters more).
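
(For reference, the standard dense-matrix operation counts behind that
factor of 2, for an n-by-n matrix:

  LU factorization:  ~ (2/3) n^3 + O(n^2) flops
  Householder QR:    ~ (4/3) n^3 + O(n^2) flops

so QR costs roughly twice as much as LU for large n.)
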
Luc
Phil
The switch to "QR" seems to imply that a previously impossible problem is
now quite possible. So, is the problem really impossible or was the test
actually testing a fragile implementation of "getCovariances()"?
Regards,
Gilles