[ https://issues.apache.org/jira/browse/FLINK-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14532398#comment-14532398 ]

ASF GitHub Bot commented on FLINK-1979:
---------------------------------------

GitHub user jojo19893 opened a pull request:

    https://github.com/apache/flink/pull/656

    Lossfunctions

    We added the logistic loss and hinge loss functions to the
    optimization framework.
    See the following page for the implemented functions:
    https://github.com/JohnLangford/vowpal_wabbit/wiki/Loss-functions
    
    Jira Issue: 
    https://issues.apache.org/jira/browse/FLINK-1979
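
    For orientation, here is a minimal Scala sketch of the two losses as
    defined on that wiki page, assuming labels in {-1, +1}; the names are
    illustrative, not the actual classes in this PR:

        // Hedged sketch only; labels are assumed to be -1.0 or +1.0,
        // and these names are hypothetical.
        object LossSketch {
          // Logistic loss: log(1 + exp(-y * p))
          def logisticLoss(p: Double, y: Double): Double =
            math.log(1 + math.exp(-y * p))

          // Hinge loss: max(0, 1 - y * p)
          def hingeLoss(p: Double, y: Double): Double =
            math.max(0.0, 1 - y * p)
        }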

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jojo19893/flink master

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/656.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #656
    
----
commit 4431e1d2ed0ebfd230ae997bcbf412c965108034
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-04-21T08:59:34Z

    [FLINK-1807] [ml] Adds optimization framework and SGD solver.
    
    Added Stochastic Gradient Descent initial version and some tests.
    
    Added L1, L2 regularization.
    
    Added tests for regularization, fixed parameter setting.
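
    As a rough illustration of the kind of update such a solver takes,
    a plain, unregularized SGD step (a sketch; names are hypothetical):

        // Hypothetical sketch of a single unregularized SGD step.
        def sgdStep(weights: Array[Double], gradient: Array[Double],
                    learningRate: Double): Array[Double] =
          weights.zip(gradient).map { case (w, g) =>
            w - learningRate * g // move against the gradient
          }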

commit e51c63583514d51f727816944df01a0e6e8461eb
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-04-27T14:16:18Z

    Added documentation, some minor fixes.

commit 4a2235c4606b03fbee8854dd02dea7faa27ace9b
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-04-28T13:43:17Z

    Added license to doc file

commit afb281c0273af944fabdca96d833d95a094e7944
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-04-29T12:36:12Z

    Style fixes

commit 9a810b70f55fc62164d873678437f995b44f4d8e
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-05-04T08:50:49Z

    Refactored the way regularization is applied.
    
    We are now using pattern matching to determine if a regularization
    type is differentiable or not. If it is (L2), we apply the
    regularization at the gradient calculation step, before taking the
    update step. If it isn't (L1), the regularization is applied after
    the gradient descent step has been taken. This sets us up nicely
    for the L-BFGS algorithm, where we can calculate the regularized
    loss and gradient required if we are using L2.
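
    A hedged sketch of that dispatch, simplified to scalar weights; the
    types and names here are illustrative, not the actual Flink ML API:

        // Hypothetical sketch of the described pattern matching.
        sealed trait Regularization
        case class L1(lambda: Double) extends Regularization
        case class L2(lambda: Double) extends Regularization

        def regularizedStep(w: Double, g: Double, eta: Double,
                            reg: Regularization): Double = reg match {
          case L2(lambda) =>
            // Differentiable: fold the penalty into the gradient.
            w - eta * (g + lambda * w)
          case L1(lambda) =>
            // Not differentiable at zero: take the plain step first,
            // then shrink via soft-thresholding (a proximal step).
            val step = w - eta * g
            math.signum(step) *
              math.max(0.0, math.abs(step) - eta * lambda)
        }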

commit 9c71e1a18f4011fdaec53945308c230ab6a97752
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-05-05T08:50:52Z

    Added an option to provide a UDF for the prediction function, moved
    SGD regularization to the update step.
    
    Incorporated the rest of Till's comments.

commit 3a0ef8588290c17e83bc0ffa86f1e54d10bf39e0
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-05-05T12:16:50Z

    Style hotfix

commit 8314594d547557886630f3076c8f0a72bb478fac
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-05-05T12:35:53Z

    Regularization test check fix

commit b8ec680d7833669e19f46c6c69f29b76b82d18f5
Author: Theodore Vasiloudis <t...@sics.se>
Date:   2015-05-06T14:34:08Z

    Added a prediction function class to allow non-linear optimization
    in the future.
    
    Small refactoring to allow calculating the regularized loss
    separately from the regularized gradient.
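
    A minimal sketch of what such a pluggable prediction function could
    look like (hypothetical names; the linear case is shown):

        // Hypothetical: prediction is computed by a swappable class,
        // so non-linear models can plug in later.
        trait PredictionFunction {
          def predict(features: Array[Double],
                      weights: Array[Double]): Double
        }

        // Linear case: the dot product of weights and features.
        object LinearPrediction extends PredictionFunction {
          def predict(features: Array[Double],
                      weights: Array[Double]): Double =
            features.zip(weights).map { case (x, w) => x * w }.sum
        }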

commit 63115fbeff237642e1be87f143cb5042a4aeeff7
Author: Johannes Günther <jguenth1>
Date:   2015-05-07T09:46:24Z

    Implemented Hinge Loss

commit aca48e33b4cb9c90006a77caddc5fa8f8c057217
Author: mguldner <mathieuguldner....@gmail.com>
Date:   2015-05-07T09:49:12Z

    Add LogisticLoss Function

----


> Implement Loss Functions
> ------------------------
>
>                 Key: FLINK-1979
>                 URL: https://issues.apache.org/jira/browse/FLINK-1979
>             Project: Flink
>          Issue Type: Improvement
>          Components: Machine Learning Library
>            Reporter: Johannes Günther
>            Assignee: Johannes Günther
>            Priority: Minor
>              Labels: ML
>
> For convex optimization problems, optimizer methods like SGD rely on a 
> pluggable implementation of a loss function and its first derivative.
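
A minimal sketch of such a pluggable interface, loss value plus first
derivative (hypothetical, not the actual Flink ML trait):

    // Hypothetical interface: each loss exposes its value and its
    // first derivative with respect to the prediction.
    trait LossFunction {
      def loss(prediction: Double, label: Double): Double
      def derivative(prediction: Double, label: Double): Double
    }

    // Example instance: hinge loss with its subgradient.
    object Hinge extends LossFunction {
      def loss(p: Double, y: Double): Double = math.max(0.0, 1 - y * p)
      // Subgradient: -y when the margin is violated, 0 otherwise.
      def derivative(p: Double, y: Double): Double =
        if (y * p < 1) -y else 0.0
    }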



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
