[ https://issues.apache.org/jira/browse/IGNITE-9413?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16604377#comment-16604377 ]

Alexey Platonov commented on IGNITE-9413:
-----------------------------------------

I think this task should be deferred, for two reasons:
1) No GDB implementation in popular libraries includes a learning-rate optimizer; 
users typically select the learning rate as a hyperparameter via grid search;
2) A rate optimizer introduces new hyperparameters of its own, complicating the 
use and tuning of GDB. It is often easier to specify a rough value for the 
convergence-precision parameter, select the gradient step size, and later 
tighten the convergence precision.
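To make point 1 concrete, here is a minimal sketch (not Ignite code; all names are hypothetical) of picking a learning rate by grid search, the approach the comment describes, using plain gradient descent on a toy one-dimensional squared-error objective:

```python
# Hypothetical illustration: grid search over the learning rate,
# as users of popular GDB libraries typically do, on a toy problem.

def final_loss(rate, steps=50):
    """Run gradient descent on f(w) = (w - 3)^2 and return the final loss."""
    w = 0.0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)  # df/dw
        w -= rate * grad
    return (w - 3.0) ** 2

# Try a fixed grid of candidate rates and keep the one with the lowest loss.
candidates = [0.001, 0.01, 0.1, 0.5]
best_rate = min(candidates, key=final_loss)
print(best_rate)  # -> 0.5 on this toy objective
```

The same pattern applies to a real GDB trainer: the grid loop stays outside the training loop, and no extra hyperparameters are introduced inside the optimizer itself.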

For these reasons this task is low priority, and it needs more investigation 
before we commit to an approach.

> [ML] Learning rate optimization for GDB.
> ----------------------------------------
>
>                 Key: IGNITE-9413
>                 URL: https://issues.apache.org/jira/browse/IGNITE-9413
>             Project: Ignite
>          Issue Type: Improvement
>          Components: ml
>            Reporter: Yury Babak
>            Assignee: Alexey Platonov
>            Priority: Major
>             Fix For: 2.7
>
>
> We need to support learning rate optimization while training for MSE-loss and 
> Log-loss



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
