On 08/27/2013 12:35 PM, Konstantin Berlin wrote:
> Sorry, I think there was a misunderstanding. I was suggesting, like
> you are pointing out, that you make a Newton step function explicit
> inside the optimizer. I think it is a good idea for this function to be
> explicit in all optimizers, so it can later be overridden. At the
> least, it should be propagated back into all the base abstract classes;
> I'm not sure about interfaces. If this is not done, it would be fairly
> difficult to extend these optimizers to use bound constraints, etc.
> This is a small step that can make a large difference.

I guess so. :) I like the explicit function idea. There is already an
enum (GaussNewtonOptimizer.Decomposition) to select the decomposition
algorithm. How about you try switching the method

DecompositionSolver getSolver(final RealMatrix matrix)

to something like

RealVector newtonStep(RealMatrix jacobian, RealVector residuals)

If it works well there, we can think about adding other decompositions,
such as Cholesky and SVD.
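
For concreteness, a first cut of that hook could look roughly like the
following. This is only a sketch (method name and visibility are still
open); the LU-based variant of the current solver would essentially
amount to solving the normal equations:

// sketch only: solve the normal equations (J^T J) p = J^T r for the
// Gauss-Newton step p, using classes from org.apache.commons.math3.linear
protected RealVector newtonStep(RealMatrix jacobian, RealVector residuals) {
    final RealMatrix jTj = jacobian.transpose().multiply(jacobian);
    final RealVector jTr = jacobian.transpose().operate(residuals);
    return new LUDecomposition(jTj).getSolver().solve(jTr);
}

The QR, Cholesky and SVD variants would then just be different
implementations of the same method.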

>
> Thanks,
> Konstantin
>
> P.S. I also think that the start point should be a parameter to the
> optimize function rather than a property of the least squares problem,
> since, depending on the specific optimizer, it might not be needed
> (e.g., some implementations of quadratic programming). It would also
> make the notation for multiple starts easier.
>
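
Just to make sure we mean the same thing, the two shapes being compared
would be roughly the following (signatures are only illustrative):

// start point supplied per call, as you suggest
Optimum optimize(LeastSquaresProblem problem, RealVector start);

// current shape, where the start point lives in the problem
Optimum optimize(LeastSquaresProblem problem);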

My thought for multiple starts would be to require a builder:

List<Optimum> multiStart(LSOptimizer optimizer, LSBuilder builder,
                         List<RealVector> starts) {
    List<Optimum> ret = new ArrayList<>(starts.size());
    for (RealVector start : starts) {
        // rebuild the problem with each start point and optimize it
        ret.add(optimizer.optimize(builder.start(start).build()));
    }
    return ret;
}


or to use the composition pattern that is used already for applying
weights and counting evaluations in LSFactory.
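
In code, the composition version could be something like this (assuming
a delegating wrapper along the lines of LeastSquaresAdapter; the class
name here is made up):

// sketch: wrap an existing problem, override only the start point, and
// delegate everything else to the wrapped LeastSquaresProblem
class StartPoint extends LeastSquaresAdapter {
    private final RealVector start;

    StartPoint(LeastSquaresProblem problem, RealVector start) {
        super(problem);
        this.start = start;
    }

    @Override
    public RealVector getStart() {
        return start;
    }
}

multiStart() would then just call
optimizer.optimize(new StartPoint(problem, start)) for each start,
without needing a builder at all.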

Regards,
Evan

