Hello Alex,
I checked the architecture of the MultivariateOptimizer family to compare it
to what I have done so far. I think that what I have done can be refactored to
fit in the general framework as an extension of GradientMultivariateOptimizer,
even though there are some differences:

   - There is no need to provide an explicit gradient function, as it can be
     computed with finite differences (a secant approximation to each partial
     derivative); see the first sketch after this list.

   - The Hessian matrix also needs to be computed. It is computed with finite
     differences in the same way; see the second sketch.

   - My approach uses an extension of MultivariateFunction that has methods
     providing the gradient and Hessian. But that is perhaps not such a good
     idea: it would be better to provide generic gradient and Hessian classes
     that are operators applied to a plain MultivariateFunction, which is what
     the sketches below illustrate.
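To make this concrete, here is a minimal sketch of the kind of gradient
"operator" I have in mind, applied to a plain MultivariateFunction. The
package names assume the math4 legacy layout, and the fixed step size is a
simplification, so treat this as an illustration rather than a final design:

    import org.apache.commons.math4.legacy.analysis.MultivariateFunction;
    import org.apache.commons.math4.legacy.analysis.MultivariateVectorFunction;

    /** Central-difference (secant) approximation to the gradient of f. */
    public class FiniteDifferenceGradient implements MultivariateVectorFunction {
        private final MultivariateFunction f;
        private final double h; // finite-difference step, fixed for simplicity

        public FiniteDifferenceGradient(MultivariateFunction f, double h) {
            this.f = f;
            this.h = h;
        }

        @Override
        public double[] value(double[] x) {
            final double[] grad = new double[x.length];
            for (int i = 0; i < x.length; i++) {
                final double[] plus = x.clone();
                final double[] minus = x.clone();
                plus[i] += h;
                minus[i] -= h;
                // Secant approximation to the i-th partial derivative.
                grad[i] = (f.value(plus) - f.value(minus)) / (2 * h);
            }
            return grad;
        }
    }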
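The Hessian operator follows the same pattern. Note that for i == j the
formula below reduces to the usual central second difference with step 2h
(same caveats as above about package names and step size):

    import org.apache.commons.math4.legacy.analysis.MultivariateFunction;
    import org.apache.commons.math4.legacy.analysis.MultivariateMatrixFunction;

    /** Central-difference approximation to the Hessian of f. */
    public class FiniteDifferenceHessian implements MultivariateMatrixFunction {
        private final MultivariateFunction f;
        private final double h; // finite-difference step, fixed for simplicity

        public FiniteDifferenceHessian(MultivariateFunction f, double h) {
            this.f = f;
            this.h = h;
        }

        @Override
        public double[][] value(double[] x) {
            final int n = x.length;
            final double[][] hess = new double[n][n];
            for (int i = 0; i < n; i++) {
                for (int j = i; j < n; j++) {
                    final double fpp = shiftedValue(x, i, +h, j, +h);
                    final double fpm = shiftedValue(x, i, +h, j, -h);
                    final double fmp = shiftedValue(x, i, -h, j, +h);
                    final double fmm = shiftedValue(x, i, -h, j, -h);
                    hess[i][j] = (fpp - fpm - fmp + fmm) / (4 * h * h);
                    hess[j][i] = hess[i][j]; // the Hessian is symmetric
                }
            }
            return hess;
        }

        // Evaluate f at x shifted by di along axis i and dj along axis j.
        private double shiftedValue(double[] x, int i, double di, int j, double dj) {
            final double[] p = x.clone();
            p[i] += di;
            p[j] += dj;
            return f.value(p);
        }
    }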
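Both operators can then be applied to any plain MultivariateFunction and
handed to a Newton-type step. A hypothetical usage, on the Rosenbrock
function:

    MultivariateFunction rosenbrock = x -> {
        final double a = 1 - x[0];
        final double b = x[1] - x[0] * x[0];
        return a * a + 100 * b * b;
    };
    MultivariateVectorFunction grad = new FiniteDifferenceGradient(rosenbrock, 1e-6);
    MultivariateMatrixFunction hess = new FiniteDifferenceHessian(rosenbrock, 1e-4);
    double[] g = grad.value(new double[] {0, 0}); // approximately {-2, 0}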


I have also already written tests covering at least all nominal cases and a
few error cases. Now that I know where to put my code, I will integrate it
into my ACM clone (with all tests) and try to refactor it until it fits
seamlessly into the ACM classes and concepts.

This may take a while.

Yours truly,
François Laferrière

    On Thursday, 20 October 2022 at 20:18:46 UTC+2, Alex Herbert
<alex.d.herb...@gmail.com> wrote:
 
 Hi,

Thanks for the interest in Commons Math.

Currently all the optimisation code is in commons-math-legacy. I think
the gradient based methods would fit in:

org.apache.commons.math4.legacy.optim.nonlinear.scalar.gradient

Can your implementations be adapted to work with the existing
interfaces? The decision to move the entire 'optim' package to a new
module allows a redesign of interfaces. The old and new can coexist
but ideally we would want to support only one optimisation
architecture. Have a look at the current classes and let us know what
you think.

Regards,

Alex



On Thu, 20 Oct 2022 at 15:36, François Laferrière
<francoislaferrie...@yahoo.fr.invalid> wrote:
>
> Hello,
> Sorry, previous message was a mess....
> Based on Apache Commons Math, I have implemented some commonplace
> optimization algorithms that could be integrated in ACM. This includes:
>
>    - Gradient Descent (en.wikipedia.org/wiki/Gradient_descent)
>
>    - Newton-Raphson (https://en.wikipedia.org/wiki/Newton's_method_in_optimization)
>
>    - BFGS (https://en.wikipedia.org/wiki/Broyden–Fletcher–Goldfarb–Shanno_algorithm)
>
> They are implemented in such a way that other algorithms of the same family 
> (Newton) can be implemented easily from existing building blocks.
> I cloned http://gitbox.apache.org/repos/asf/commons-math.git but I am a bit
> lost in the module structure. Should I put my code in one existing
> commons-math4-* module (if so, which one?) or should I create a new module
> (for instance commons-math-optimization)?
> Many thanks in advance
> François Laferrière
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@commons.apache.org
For additional commands, e-mail: dev-h...@commons.apache.org

  
