On Thu, Jan 17, 2013 at 09:04:55PM +0100, Thomas Neidhart wrote:
> On 01/17/2013 08:24 PM, Gilles Sadowski wrote:
> >>> [...]
> >>>
> >>> If we do not have the resources to dig further into the real problem,
> >>> we can circumscribe it by indicating which interval of epsilon values
> >>> is considered trustworthy (and declining all responsibility if users
> >>> try to push the implementation too far).
> >>
> >> Well, I do disagree.
> >> I do not want to have a situation where users get different results with
> >> each run. I want to have deterministic behavior for deterministic
> >> algorithms.
> > 
> > Do you suggest that the JVM algorithm is not deterministic? ;-)
> 
> The iteration order of HashSet entries is based on the hash code of the
> entries and the number of buckets in the underlying HashMap. As the hash
> code of a LinearConstraint is derived from its values (including an enum
> - Relationship) it may change, and in fact does (the enum has no
> overridden hashCode method, and Java 6+ does not seem to guarantee the
> same hashCode for the same enum every time contrary to what Java 5 seems
> to do).
> 
> So every time you run, your iteration order could be different (and you
> can see it with the test case).

It's fair to add "hashCode" and "equals" methods to class "Relationship",
and if the side-effect disappears as a consequence, then good for you!
I'm just saying that it will not solve the real problem (not any more than
replacing "Set" with "List" would).
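
For the record, here is a minimal, self-contained sketch of the run-to-run
variation (class and field names are invented for illustration; this is not
the actual CM code). The point is only that mixing an enum's identity-based
hashCode into an object's hash code makes HashSet iteration order depend on
the particular JVM run:

    import java.util.HashSet;
    import java.util.Set;

    public class IterationOrderDemo {

        // Stand-in for the real enum; Enum.hashCode() is final and
        // identity-based, so it usually differs between JVM runs.
        enum Rel { LEQ, EQ, GEQ }

        static final class Constraint {
            final double value;
            final Rel rel;

            Constraint(double value, Rel rel) {
                this.value = value;
                this.rel = rel;
            }

            @Override
            public int hashCode() {
                // Mixing in rel.hashCode() makes the result depend on the
                // enum's identity hash, hence on the particular JVM run.
                long bits = Double.doubleToLongBits(value);
                return 31 * (int) (bits ^ (bits >>> 32)) + rel.hashCode();
            }

            @Override
            public boolean equals(Object o) {
                if (!(o instanceof Constraint)) {
                    return false;
                }
                Constraint c = (Constraint) o;
                return value == c.value && rel == c.rel;
            }

            @Override
            public String toString() {
                return value + " " + rel;
            }
        }

        public static void main(String[] args) {
            Set<Constraint> set = new HashSet<Constraint>();
            set.add(new Constraint(1.0, Rel.LEQ));
            set.add(new Constraint(2.0, Rel.GEQ));
            set.add(new Constraint(3.0, Rel.EQ));
            // The bucket each entry lands in depends on rel.hashCode(),
            // so the printed order can differ from one run to the next.
            System.out.println(set);
        }
    }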

> 
> >> Yes, there are numerical stability issues, which in this case can be
> >> treated by relaxing the convergence criteria as the user did.
> > 
> > Then, I propose that a lower bound of the convergence criterion be set as
> > a precondition, and the algorithm will be deterministic in a more obvious
> > way.
> > 
> >> Otoh, we should have thorough testing where a permutation of the linear
> >> constraints could be a useful addition to detect more such issues. For
> >> such a flexible tool like the SimplexSolver, there will always be
> >> problems where the solution is difficult to find, and will require some
> >> tuning. Even octave (using the very good and mature glpk library) did
> >> not find the optimal solution in this case.
> > 
> > This even further convinces me that we should not change an implementation
> > detail: such details simply must not have visible side-effects.
> > It has a visible effect here because the precondition is not satisfied.
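
As for the permutation testing suggested above, a rough sketch of what such
a test could look like (assuming the "org.apache.commons.math3.optimization.linear"
API and JUnit 4; the sample problem and the number of shuffles are made up):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    import org.apache.commons.math3.optimization.GoalType;
    import org.apache.commons.math3.optimization.PointValuePair;
    import org.apache.commons.math3.optimization.linear.LinearConstraint;
    import org.apache.commons.math3.optimization.linear.LinearObjectiveFunction;
    import org.apache.commons.math3.optimization.linear.Relationship;
    import org.apache.commons.math3.optimization.linear.SimplexSolver;

    import org.junit.Assert;
    import org.junit.Test;

    public class SimplexSolverPermutationTest {

        @Test
        public void testSolutionIndependentOfConstraintOrder() {
            // Arbitrary sample problem: maximize 2x + 3y subject to
            // x + y <= 10, x <= 6, y <= 7, x >= 0, y >= 0.
            LinearObjectiveFunction f =
                new LinearObjectiveFunction(new double[] { 2, 3 }, 0);
            List<LinearConstraint> constraints =
                new ArrayList<LinearConstraint>();
            constraints.add(new LinearConstraint(
                new double[] { 1, 1 }, Relationship.LEQ, 10));
            constraints.add(new LinearConstraint(
                new double[] { 1, 0 }, Relationship.LEQ, 6));
            constraints.add(new LinearConstraint(
                new double[] { 0, 1 }, Relationship.LEQ, 7));

            PointValuePair reference = new SimplexSolver()
                .optimize(f, constraints, GoalType.MAXIMIZE, true);

            // Re-solve the same problem with the constraints in several
            // shuffled orders and check that the optimum does not move.
            for (int i = 0; i < 10; i++) {
                List<LinearConstraint> shuffled =
                    new ArrayList<LinearConstraint>(constraints);
                Collections.shuffle(shuffled);
                PointValuePair solution = new SimplexSolver()
                    .optimize(f, shuffled, GoalType.MAXIMIZE, true);
                Assert.assertEquals(reference.getValue(),
                                    solution.getValue(), 1e-10);
                Assert.assertArrayEquals(reference.getPoint(),
                                         solution.getPoint(), 1e-10);
            }
        }
    }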
> 
> I do not fully understand: which precondition are we talking about here?

The real protection would be to forbid epsilon values that lead to
instability. Maybe the lower bound depends on the problem (?).
If so, then maybe the algorithm can, at some point, perform an
auto-diagnostic and detect that the solution, even if found, may be
fragile. If none of this is possible, then just mention in the Javadoc that
the stability critically depends on the choice of epsilon (up to raising an
exception if it is set too low, since the failure will show up when people
add the constraints in the "wrong" order, even with the above fix).
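
Something along these lines, as a sketch only (the bound, the class and the
exception type are placeholders, not a worked-out criterion or the actual
solver):

    public class EpsilonGuardedSolver {

        /** Smallest epsilon considered trustworthy (placeholder value). */
        private static final double EPSILON_LOWER_BOUND = 1e-12;

        private final double epsilon;

        public EpsilonGuardedSolver(final double epsilon) {
            if (epsilon < EPSILON_LOWER_BOUND) {
                // Fail fast instead of silently producing a fragile solution.
                throw new IllegalArgumentException(
                    "epsilon=" + epsilon + " is below the supported"
                    + " lower bound " + EPSILON_LOWER_BOUND);
            }
            this.epsilon = epsilon;
        }
    }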


Regards,
Gilles
