Hello.

Do you know the rationale behind the (very) small values of
 DEFAULT_RELATIVE_THRESHOLD (set to 100 * Precision.EPSILON)
and
 DEFAULT_ABSOLUTE_THRESHOLD (set to 100 * Precision.SAFE_MIN)
in "AbstractConvergenceChecker"?
[I created this class as part of a refactoring, but the values were carried
over from whatever class contained that functionality before.]
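
For reference, the test that these thresholds control is essentially the
following (a sketch modelled on the "converged" method of the simple
value checkers, not the exact library code; the variable names are mine):

  final double difference = Math.abs(previous - current);
  final double size = Math.max(Math.abs(previous), Math.abs(current));
  // Converged when the change is small relative to the magnitude of
  // the values, or below the absolute floor.
  final boolean converged =
      difference <= size * relativeThreshold
      || difference <= absoluteThreshold;

With the current defaults, the relative branch requires agreement to
about 14 significant digits (100 * Precision.EPSILON is about 1e-14),
and the absolute floor (100 * Precision.SAFE_MIN, about 2e-306) can
essentially never trigger for non-zero values.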

With those values, the "GaussNewtonOptimizer" fails to find the solution,
but if one relaxes either threshold, to
 1e3 * Precision.EPSILON (for the relative threshold)
or
 1e281 * Precision.SAFE_MIN (for the absolute threshold),
the solution is found in 4 evaluations!
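
For anyone who wants to reproduce this, the relaxed thresholds can be
passed to the optimizer along these lines (a sketch assuming the math3
package layout and the two-argument "SimpleVectorValueChecker"
constructor; "CheckerDemo" is just a made-up name):

  import org.apache.commons.math3.optimization.SimpleVectorValueChecker;
  import org.apache.commons.math3.optimization.general.GaussNewtonOptimizer;
  import org.apache.commons.math3.util.Precision;

  public class CheckerDemo {
      public static void main(String[] args) {
          // Relaxed thresholds, as quoted above.
          final GaussNewtonOptimizer optim = new GaussNewtonOptimizer(
              new SimpleVectorValueChecker(1e3 * Precision.EPSILON,
                                           1e281 * Precision.SAFE_MIN));
          // ... set up the least-squares problem and call "optimize" ...
      }
  }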

As I wrote on the user ML, the thresholds were too stringent.
Are the current values really suitable as defaults (in other parts of the
library, similar defaults are much larger)?
[I even think that there shouldn't be any defaults, so that users are
actually aware that the thresholds are problem-dependent and sometimes even
optimizer-dependent. My preference would thus be to deprecate the default
constructor in all the checker classes.]
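
Concretely, the deprecation I have in mind would look like the
following sketch (the Javadoc wording is of course up for discussion,
and the delegation to the two-argument constructor is an assumption
about how the class is laid out):

  /**
   * Build an instance with default thresholds.
   *
   * @deprecated Convergence thresholds are problem-dependent (and
   * sometimes optimizer-dependent); use
   * {@link #AbstractConvergenceChecker(double,double)} instead.
   */
  @Deprecated
  public AbstractConvergenceChecker() {
      this(DEFAULT_RELATIVE_THRESHOLD, DEFAULT_ABSOLUTE_THRESHOLD);
  }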


Best regards,
Gilles
