On 16/08/2013 18:55, Ajo Fod wrote:
> The algorithm computes the Hessian using an update rule. My question is:
> what if you can compute the Hessian analytically?
> 
> Hessian: http://en.wikipedia.org/wiki/Hessian_matrix
> Gradient: http://en.wikipedia.org/wiki/Gradient

We do have support for generating second derivatives in the analysis
package, but the optimizers don't use them yet.
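
For instance, here is a minimal sketch (my own toy function f(x, y) =
x^2 * y, not an example from the documentation) of how
DerivativeStructure in org.apache.commons.math3.analysis.differentiation
can produce exact second derivatives:

    import org.apache.commons.math3.analysis.differentiation.DerivativeStructure;

    public class HessianSketch {
        public static void main(String[] args) {
            // 2 free parameters (x, y), derivatives carried up to order 2;
            // the third argument is the variable index, the fourth its value.
            DerivativeStructure x = new DerivativeStructure(2, 2, 0, 3.0);
            DerivativeStructure y = new DerivativeStructure(2, 2, 1, 2.0);

            // f(x, y) = x^2 * y, with all partials propagated automatically
            DerivativeStructure f = x.pow(2).multiply(y);

            // Exact Hessian entries at (3, 2):
            double fxx = f.getPartialDerivative(2, 0); // d2f/dx2  = 2y = 4
            double fxy = f.getPartialDerivative(1, 1); // d2f/dxdy = 2x = 6
            double fyy = f.getPartialDerivative(0, 2); // d2f/dy2  = 0
            System.out.println(fxx + " " + fxy + " " + fyy);
        }
    }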

best regards,
Luc

> 
> Cheers,
> -Ajo
> 
> 
> On Fri, Aug 16, 2013 at 9:39 AM, Luc Maisonobe <luc.maison...@free.fr> wrote:
> 
>> On 15/08/2013 22:59, Ajo Fod wrote:
>>> Hello,
>>>
>>> Isn't there an advantage to being able to compute the Jacobian of the
>>> gradient precisely at a point?
>>>
>>> If so, is there a class that uses the Jacobian instead of estimating the
>>> Jacobian from the last few iterations as
>> NonLinearConjugateGradientOptimizer
>>> does?
>>
>> I'm not sure what you really mean, but you can always pass an
>> ObjectiveFunctionGradient holding any MultivariateVectorFunction to be
>> used by the algorithm.
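>>
>> As a rough sketch (assuming commons-math3 and a made-up quadratic
>> objective of my own, not an example from the documentation), that
>> looks like:
>>
>>     import org.apache.commons.math3.analysis.MultivariateFunction;
>>     import org.apache.commons.math3.analysis.MultivariateVectorFunction;
>>     import org.apache.commons.math3.optim.InitialGuess;
>>     import org.apache.commons.math3.optim.MaxEval;
>>     import org.apache.commons.math3.optim.PointValuePair;
>>     import org.apache.commons.math3.optim.SimpleValueChecker;
>>     import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
>>     import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunction;
>>     import org.apache.commons.math3.optim.nonlinear.scalar.ObjectiveFunctionGradient;
>>     import org.apache.commons.math3.optim.nonlinear.scalar.gradient.NonLinearConjugateGradientOptimizer;
>>
>>     public class GradientSketch {
>>         public static void main(String[] args) {
>>             // Toy objective: f(p) = (p0 - 1)^2 + (p1 + 2)^2, minimum at (1, -2)
>>             MultivariateFunction f = p ->
>>                 (p[0] - 1) * (p[0] - 1) + (p[1] + 2) * (p[1] + 2);
>>
>>             // Its exact gradient, supplied analytically to the optimizer
>>             MultivariateVectorFunction grad = p ->
>>                 new double[] { 2 * (p[0] - 1), 2 * (p[1] + 2) };
>>
>>             NonLinearConjugateGradientOptimizer optimizer =
>>                 new NonLinearConjugateGradientOptimizer(
>>                     NonLinearConjugateGradientOptimizer.Formula.POLAK_RIBIERE,
>>                     new SimpleValueChecker(1e-10, 1e-10));
>>
>>             PointValuePair result = optimizer.optimize(
>>                 new MaxEval(200),
>>                 new ObjectiveFunction(f),
>>                 new ObjectiveFunctionGradient(grad),
>>                 GoalType.MINIMIZE,
>>                 new InitialGuess(new double[] { 0.0, 0.0 }));
>>
>>             System.out.println(result.getPoint()[0] + ", " + result.getPoint()[1]);
>>         }
>>     }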
>>
>> Luc
>>
>>>
>>> Thanks,
>>> -Ajo
>>>
>>
>>
> 

