[ https://issues.apache.org/jira/browse/FLINK-2014?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15484612#comment-15484612 ]

Victor Chen commented on FLINK-2014:
------------------------------------

To do LASSO, I'm thinking of implementing Greedy Parallel Coordinate Descent 
(http://www.caam.rice.edu/~optimization/disparse/parallel_and_distributed_sparse_optimization.pdf).
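The core coordinate-descent update that GPCD builds on is the standard soft-thresholding step for the LASSO objective (1/2n)||y - Xw||^2 + lambda*||w||_1. A minimal sequential sketch of that update (the greedy coordinate selection and parallelization are what the paper adds on top; function names here are illustrative, not from any Flink API):

```python
def soft_threshold(rho, lam):
    # Proximal operator of the L1 penalty.
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

def lasso_cd(X, y, lam, n_iters=100):
    # Cyclic coordinate descent for LASSO; X is a list of rows.
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(n_iters):
        for j in range(d):
            rho, z = 0.0, 0.0
            for i in range(n):
                # Partial residual excluding feature j.
                pred = sum(X[i][k] * w[k] for k in range(d) if k != j)
                rho += X[i][j] * (y[i] - pred)
                z += X[i][j] ** 2
            w[j] = soft_threshold(rho / n, lam) / (z / n) if z > 0 else 0.0
    return w
```

Weakly correlated or irrelevant coordinates get driven exactly to zero by the threshold, which is the sparsity property that makes greedy coordinate selection effective in the parallel variant.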

Similar to PR#1102 (https://github.com/apache/flink/pull/1102), I'm interested 
in performing training with SSP. Would it be too risky to base my algorithm on 
SSP, given that the PR hasn't been merged yet? Are there any "primitives" in 
Flink I could use for non-SSP training (e.g. synchronous or asynchronous 
training)? I'm planning to have a minimum viable "product" by the end of this 
week.
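For reference, the condition SSP enforces is just bounded staleness between workers: a worker at clock c may proceed only while the slowest worker is within s clocks behind it. A conceptual sketch of that check (plain Python, not Flink's API; the function name is illustrative):

```python
def may_proceed(worker_clock, all_clocks, staleness):
    # SSP bounded-staleness condition: the fastest worker may run at most
    # `staleness` clocks ahead of the slowest one; otherwise it must wait.
    return worker_clock - min(all_clocks) <= staleness
```

Fully synchronous training is the special case staleness = 0, and fully asynchronous training is staleness = infinity, which is why SSP is often described as interpolating between the two.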

Also, I know Theodore is working on FLINK-2013 (Create GLM framework). Should I 
extend the GLM interface from your branch? Since I plan to finish by the end of 
this week and I don't know how much time it'll take, I propose to code a 
LASSO-specific algorithm, then within 2 weeks or so, generalize to the GLM 
framework.

> Add LASSO regression
> --------------------
>
>                 Key: FLINK-2014
>                 URL: https://issues.apache.org/jira/browse/FLINK-2014
>             Project: Flink
>          Issue Type: New Feature
>          Components: Machine Learning Library
>            Reporter: Theodore Vasiloudis
>            Assignee: Theodore Vasiloudis
>            Priority: Minor
>
> LASSO is a linear model that uses L1 regularization.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)