Another question: do you have negative or out-of-range user or product
ids? -Xiangrui

On Tue, Mar 11, 2014 at 8:00 PM, Debasish Das <debasish.da...@gmail.com> wrote:
> Nope...I did not test implicit feedback yet...will get into more detailed
> debugging and generate the test case hopefully next week...
> On Mar 11, 2014 7:02 PM, "Xiangrui Meng" <men...@gmail.com> wrote:
>
>> Hi Deb, did you use ALS with implicit feedback? -Xiangrui
>>
>> On Mon, Mar 10, 2014 at 1:17 PM, Xiangrui Meng <men...@gmail.com> wrote:
>> > Choosing lambda = 0.1 shouldn't lead to the error you got. This is
>> > probably a bug. Do you mind sharing a small amount of data that can
>> > reproduce the error? -Xiangrui
>> >
>> > On Fri, Mar 7, 2014 at 8:24 AM, Debasish Das <debasish.da...@gmail.com>
>> wrote:
>> >> Hi Xiangrui,
>> >>
>> >> I used lambda = 0.1...It is possible that 2 users ranked movies in a
>> >> very similar way...
>> >>
>> >> I agree that increasing lambda will make the error go away, but you will
>> >> agree this is not a real solution...lambda should be tuned based on
>> >> sparsity / other criteria, not used to make a linearly dependent Hessian
>> >> matrix linearly independent...
>> >>
>> >> Thanks.
>> >> Deb
>> >>
>> >>
>> >>
>> >>
>> >>
>> >> On Thu, Mar 6, 2014 at 7:20 PM, Xiangrui Meng <men...@gmail.com> wrote:
>> >>
>> >>> If the matrix is very ill-conditioned, then A^T A becomes numerically
>> >>> rank deficient. However, if you use a reasonably large positive
>> >>> regularization constant (lambda), "A^T A + lambda I" should still be
>> >>> positive definite. What was the regularization constant (lambda) you
>> >>> set? Could you test whether the error still happens when you use a
>> >>> large lambda?
>> >>>
>> >>> Best,
>> >>> Xiangrui
>> >>>
>>
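
To illustrate the point above about "A^T A + lambda I": below is a minimal
sketch (not from the original thread; it uses Breeze directly rather than
MLlib's internal ALS solver) showing that a rank-deficient Gram matrix
becomes positive definite once lambda * I is added.

  import breeze.linalg.{DenseMatrix, cholesky}

  // Two users with identical rating rows make the columns of A linearly
  // dependent, so the Gram matrix A^T A is only positive semi-definite.
  val A = DenseMatrix(
    (1.0, 2.0, 3.0),
    (1.0, 2.0, 3.0),
    (0.0, 1.0, 1.0))
  val gram = A.t * A
  // cholesky(gram) would fail here: gram is singular (rank 2 out of 3).

  // Adding lambda * I shifts every eigenvalue up by lambda, so the
  // regularized matrix is positive definite and the factorization succeeds.
  val lambda = 0.1
  val regularized = gram + (DenseMatrix.eye[Double](3) * lambda)
  val L = cholesky(regularized)  // lower-triangular factor, no error

This is the same shift a larger lambda performs inside ALS, which is why it
masks the error without addressing the underlying linear dependence.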
