Re: Constraint Solver for Spark

2014-07-07 Thread Xiangrui Meng
Hey Deb, If your goal is to solve the subproblems in ALS, exploiting sparsity doesn't give you much benefit because the data is small and dense. Porting either ECOS's or PDCO's implementation, but using a dense representation, should be sufficient. Feel free to open a JIRA and we can move our discussio

Re: Constraint Solver for Spark

2014-07-04 Thread Debasish Das
I looked further and realized that ECOS uses a MEX file while PDCO uses pure MATLAB code, so the out-of-the-box runtime comparison is not fair. I am trying to generate a C port of PDCO. Like ECOS, PDCO also makes use of the sparse matrix support from Tim Davis. Thanks. Deb

Re: Constraint Solver for Spark

2014-07-03 Thread Debasish Das
Hi Xiangrui, I did some out-of-the-box comparisons with ECOS and PDCO from SOL. Both of them seem to be running at par, but I will do a more detailed analysis. I used pdco's testQP randomized problem generation: pdcotestQP(m, n) means m constraints and n variables. For ECOS runtime reference, here is t

Re: Constraint Solver for Spark

2014-07-02 Thread Xiangrui Meng
Hi Deb, KNITRO and MOSEK are both commercial. If you are looking for open-source ones, you can take a look at PDCO from SOL: http://web.stanford.edu/group/SOL/software/pdco/ Each subproblem is really just a small QP. ADMM is designed for the cases when data is distributively stored or the object
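
A minimal sketch of the "small QP" each subproblem reduces to, assuming the quadratic is built from A = Y^T Y + lambda*I and b = Y^T r, and using a simple projected-gradient loop for a non-negativity constraint; the step size, iteration count, and names below are illustrative and are not taken from ECOS, PDCO, or MLlib:

    import breeze.linalg.{DenseMatrix, DenseVector}

    object SmallQPSketch {
      // min 0.5 * x^T A x - b^T x  subject to x >= 0,
      // solved by plain projected gradient; fine for a rank-sized dense problem.
      def projectedGradient(A: DenseMatrix[Double], b: DenseVector[Double],
                            step: Double = 1e-3, iters: Int = 1000): DenseVector[Double] = {
        var x = DenseVector.zeros[Double](b.length)
        var i = 0
        while (i < iters) {
          val grad = A * x - b                          // gradient of the quadratic
          x = (x - grad * step).map(math.max(_, 0.0))   // step, then project onto x >= 0
          i += 1
        }
        x
      }
    }

An IPM such as ECOS or PDCO would take the same (A, b) inputs, just with a different algorithm and support for more general constraints.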

Re: Constraint Solver for Spark

2014-07-01 Thread Debasish Das
Hi Xiangrui, Could you please point to the IPM solver that you have positive results with? I was planning to compare with CVX, KNITRO from Professor Nocedal, and MOSEK probably...I don't have a CPLEX license so I won't be able to do that comparison... My experiments so far tell me that ADMM-based

Re: Constraint Solver for Spark

2014-06-11 Thread Xiangrui Meng
Your idea is close to what implicit feedback does. You can check the paper, which is short and concise. In the ALS setting, all subproblems are independent in each iteration. This is part of the reason why ALS is scalable. If you have some global constraints that make the subproblems no longer decou

Re: Constraint Solver for Spark

2014-06-11 Thread Debasish Das
I got it...the ALS formulation is solving the matrix completion problem. To convert the problem to matrix factorization, or to take user feedback (missing entries mean the user hates the site?), we should put 0 in the missing entries (or maybe -1)...in that case we have to use computeYtY and accumula
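
A minimal sketch of the computeYtY point being made here, assuming missing entries are treated as explicit zeros: in that case every user shares the same Gram matrix, so Y^T Y can be accumulated once over all item factors and reused. The object and function names are illustrative, not the actual MLlib helpers:

    import breeze.linalg.{DenseMatrix, DenseVector}

    object YtYSketch {
      // With missing entries counted as 0, every user's Gram matrix is the
      // same Y^T Y, so it is computed once and shared across all subproblems.
      def computeYtY(itemFactors: Array[DenseVector[Double]]): DenseMatrix[Double] = {
        val rank = itemFactors(0).length
        val yty = DenseMatrix.zeros[Double](rank, rank)
        for (y <- itemFactors) yty += y * y.t   // accumulate outer products
        yty
      }
    }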

Re: Constraint Solver for Spark

2014-06-11 Thread Xiangrui Meng
For explicit feedback, ALS uses only the observed ratings for computation, so the XtX matrices are not the same. -Xiangrui On Tue, Jun 10, 2014 at 8:58 PM, Debasish Das wrote: > Sorry last one went out by mistake: > > Is not for users (0 to numUsers), fullXtX is same ? In the ALS formulation > this is W^TW or H^
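
A minimal sketch of the explicit-feedback case being described: only the items a user actually rated contribute to that user's normal equations, which is why userXtX(index) differs from user to user. The Rating type and function name here are hypothetical, not MLlib's:

    import breeze.linalg.{DenseMatrix, DenseVector}

    object ExplicitXtXSketch {
      case class Rating(item: Int, value: Double)

      // Per-user normal equations built only from that user's observed
      // ratings, so both XtX and Xty vary across users.
      def userNormalEquations(itemFactors: Array[DenseVector[Double]],
                              observed: Seq[Rating],
                              lambda: Double): (DenseMatrix[Double], DenseVector[Double]) = {
        val rank = itemFactors(0).length
        val xtx = DenseMatrix.eye[Double](rank) * lambda   // regularization term
        val xty = DenseVector.zeros[Double](rank)
        for (Rating(j, r) <- observed) {
          val y = itemFactors(j)
          xtx += y * y.t   // only rated items enter the Gram matrix
          xty += y * r
        }
        (xtx, xty)
      }
    }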

Re: Constraint Solver for Spark

2014-06-10 Thread Debasish Das
Sorry, the last one went out by mistake: Is it not the case that for users (0 to numUsers), fullXtX is the same? In the ALS formulation this is W^T W or H^T H, which should be the same for all users? Why are we reading userXtX(index) and adding it to fullXtX in the loop over all numUsers? // Solve the least-squares proble

Re: Constraint Solver for Spark

2014-06-10 Thread Debasish Das
Hi, I am a bit confused with the code here: // Solve the least-squares problem for each user and return the new feature vectors Array.range(0, numUsers).map { index => // Compute the full XtX matrix from the lower-triangular part we got above fillFullMatrix(userXtX(index), fullXt
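
A minimal sketch of the lower-triangular bookkeeping in that snippet, assuming userXtX(index) stores a symmetric matrix packed column-major as its lower triangle; the names below are illustrative, not the actual fillFullMatrix implementation:

    object LowerTriangularSketch {
      // Expand a packed lower triangle (column-major, rank*(rank+1)/2 entries)
      // into a full rank x rank symmetric matrix stored row by row.
      def fillFull(packed: Array[Double], rank: Int): Array[Double] = {
        val full = new Array[Double](rank * rank)
        var pos = 0
        for (j <- 0 until rank; i <- j until rank) {
          full(i * rank + j) = packed(pos)   // lower-triangular entry
          full(j * rank + i) = packed(pos)   // mirror into the upper triangle
          pos += 1
        }
        full
      }
    }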

Re: Constraint Solver for Spark

2014-06-06 Thread Debasish Das
Hi Xiangrui, It's not a linear constraint; it is a quadratic inequality with slack, a first-order Taylor approximation of the off-diagonal cross terms, and a cyclic coordinate descent, which we think will yield orthogonality...It's still a work in progress... Also we want to put an L1 constraint as a set of line

Re: Constraint Solver for Spark

2014-06-05 Thread Xiangrui Meng
I don't quite understand why putting linear constraints can promote orthogonality. For the interfaces, if the subproblem is determined by Y^T Y and Y^T b for each iteration, then the least squares solver, the non-negative least squares solver, or your convex solver is simply a function (A, b) -> x
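
A minimal sketch of the "(A, b) -> x" view described above, assuming each subproblem is fully captured by A = Y^T Y plus regularization and b = Y^T r; the type alias and names are illustrative, not actual MLlib APIs:

    import breeze.linalg.{DenseMatrix, DenseVector}

    object FunctionalSolverSketch {
      // Any subproblem solver is just a function from (A, b) to x.
      type SubproblemSolver =
        (DenseMatrix[Double], DenseVector[Double]) => DenseVector[Double]

      val leastSquares: SubproblemSolver = (a, b) => a \ b   // unconstrained solve

      // A constrained solver (NNLS, an IPM, ...) plugs in with the same signature.
      def solveAll(subproblems: Array[(DenseMatrix[Double], DenseVector[Double])],
                   solver: SubproblemSolver): Array[DenseVector[Double]] =
        subproblems.map { case (a, b) => solver(a, b) }
    }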

Re: Constraint Solver for Spark

2014-06-05 Thread Debasish Das
Hi Xiangrui, For orthogonality properties in the factors we need a constraint solver other than the usual ones (L1, upper and lower bounds, L2, etc.). The interface of the constraint solver is standard and I can add it in mllib optimization, but I am not sure how I will call the GPL-licensed IPM solver

Re: Constraint Solver for Spark

2014-06-05 Thread Xiangrui Meng
Hi Deb, Why do you want to make those methods public? If you only need to replace the solver for the subproblems, you can try to make the solver pluggable. Now it supports least squares and non-negative least squares. You can define an interface for the subproblem solvers and maintain the IPM solver a
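
A minimal sketch of what "make the solver pluggable" could look like: a trait for the subproblem solver with the default least-squares implementation in the main code base, while a GPL-licensed IPM solver implements the same trait from a separate, optional package. The trait and object names below are assumptions, not actual MLlib APIs:

    import breeze.linalg.{DenseMatrix, DenseVector}

    // Pluggable interface: the subproblem is fully described by
    // ata = Y^T Y + lambda*I and atb = Y^T r.
    trait ALSSubproblemSolver extends Serializable {
      def solve(ata: DenseMatrix[Double], atb: DenseVector[Double]): DenseVector[Double]
    }

    // Default unconstrained solver kept in the main project.
    object LeastSquaresSolver extends ALSSubproblemSolver {
      def solve(ata: DenseMatrix[Double], atb: DenseVector[Double]): DenseVector[Double] =
        ata \ atb
    }

    // A GPL-licensed IPM solver would live in its own package and implement
    // the same trait, so the core project never links against GPL code.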