Hi.
> [...]
> >
> > So I suggest we disconnect differentiation from optimization, but in a
> > way that would let users decide how they provide the differentials. This
> > means I would not like to reintroduce the former interfaces.
> >
> > What about having the optimize() methods take two function arguments:
> > a MultivariateFunction for the value and a MultivariateVectorFunction
> > for the gradient? It would be up to the user to select how they compute
> > each function. They could use direct coding, they could use a finite
> > differences wrapper around the first function to implement the second,
> > they could use our differentiation framework and put one wrapper to
> > extract the value and another one to extract the gradient.
>
> That would also be a nice example of usage of the differentiation
> framework.
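
For illustration, the two wrappers could look something like the sketch
below (rough and untested; I'm assuming the current signatures of
MultivariateFunction, MultivariateVectorFunction and
MultivariateDifferentiableFunction, and the adapter class name is made
up):

  import org.apache.commons.math3.analysis.MultivariateFunction;
  import org.apache.commons.math3.analysis.MultivariateVectorFunction;
  import org.apache.commons.math3.analysis.differentiation.DerivativeStructure;
  import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableFunction;

  /** Sketch: adapt one differentiable function into the two
   *  arguments of the proposed optimize(value, gradient) method. */
  public class DifferentiableAdapter {
      private final MultivariateDifferentiableFunction f;

      public DifferentiableAdapter(MultivariateDifferentiableFunction f) {
          this.f = f;
      }

      /** Wrapper that extracts the value. */
      public MultivariateFunction valueFunction() {
          return new MultivariateFunction() {
              public double value(double[] point) {
                  return f.value(toDerivativeStructure(point)).getValue();
              }
          };
      }

      /** Wrapper that extracts the gradient. */
      public MultivariateVectorFunction gradientFunction() {
          return new MultivariateVectorFunction() {
              public double[] value(double[] point) {
                  final DerivativeStructure ds = f.value(toDerivativeStructure(point));
                  final double[] gradient = new double[point.length];
                  final int[] orders = new int[point.length];
                  for (int i = 0; i < point.length; i++) {
                      orders[i] = 1; // first derivative w.r.t. variable i
                      gradient[i] = ds.getPartialDerivative(orders);
                      orders[i] = 0;
                  }
                  return gradient;
              }
          };
      }

      /** Mark each variable as a free parameter, to differentiation order 1. */
      private DerivativeStructure[] toDerivativeStructure(double[] point) {
          final DerivativeStructure[] ds = new DerivativeStructure[point.length];
          for (int i = 0; i < point.length; i++) {
              ds[i] = new DerivativeStructure(point.length, 1, i, point[i]);
          }
          return ds;
      }
  }

Both arguments of optimize(...) would then come from the same
user-provided differentiable function.
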
>
> > What do you think?
>
> Yes. Now I get it (I think). IIUC, we could even introduce the new
> interface before 3.1, and deprecate the older ones (as you wanted to do
> anyway, IIRC).
>
> I'll give it a try in the next days. OK?
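As for the finite-differences wrapper mentioned above, I picture
something along these lines (again a rough, untested sketch; the class
name is made up and the fixed-step central differences are simplistic):

  import org.apache.commons.math3.analysis.MultivariateFunction;
  import org.apache.commons.math3.analysis.MultivariateVectorFunction;

  /** Sketch: implement the gradient argument of optimize() by
   *  central finite differences on the value function. */
  public class FiniteDifferencesGradient implements MultivariateVectorFunction {
      private final MultivariateFunction f;
      private final double h; // differentiation step size

      public FiniteDifferencesGradient(MultivariateFunction f, double h) {
          this.f = f;
          this.h = h;
      }

      public double[] value(double[] point) {
          final double[] gradient = new double[point.length];
          final double[] p = point.clone();
          for (int i = 0; i < point.length; i++) {
              final double save = p[i];
              p[i] = save + h;
              final double fPlus = f.value(p);
              p[i] = save - h;
              final double fMinus = f.value(p);
              p[i] = save;
              gradient[i] = (fPlus - fMinus) / (2 * h);
          }
          return gradient;
      }
  }

So a user with only a MultivariateFunction could still call the
gradient-based optimizers.
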
Shall we also introduce entirely new packages?
optim
optim.scalar.noderiv
optim.scalar.noderiv.PowellOptimizer
optim.scalar.noderiv.SimplexOptimizer
optim.scalar.noderiv.CMAESOptimizer
optim.scalar.noderiv.BOBYQAOptimizer
optim.scalar.gradient
optim.scalar.gradient.NonLinearConjugateGradientOptimizer
optim.vector
optim.vector.jacobian
optim.vector.jacobian.AbstractLeastSquaresOptimizer
optim.vector.jacobian.LevenbergMarquardtOptimizer
optim.vector.jacobian.GaussNewtonOptimizer
optim.scalar.univariate.noderiv
optim.scalar.univariate.noderiv.BrentOptimizer
Regards,
Gilles