Hi bearrito, this issue was fixed by Tor in
https://github.com/apache/spark/pull/407. You can either try the
master branch or wait for the 1.0 release. -Xiangrui
On Fri, Mar 28, 2014 at 12:19 AM, Xiangrui Meng wrote:
Hi bearrito,
This is a known issue
(https://spark-project.atlassian.net/browse/SPARK-1281) and it should
be easy to fix by switching to a hash partitioner.
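Roughly, the problem and the fix look like this (a sketch with hypothetical helper names, not the actual patch): a plain modulo in Scala/Java returns a negative result for negative keys, while a HashPartitioner-style non-negative modulo always lands in [0, numBlocks):

    // Plain modulo: a negative ID yields a negative block index,
    // which later surfaces as an ArrayIndexOutOfBoundsException.
    def naiveBlock(id: Int, numBlocks: Int): Int = id % numBlocks

    // Hash-partitioner-style fix: force the result into [0, numBlocks).
    def hashBlock(id: Int, numBlocks: Int): Int = {
      val mod = id.hashCode % numBlocks
      if (mod < 0) mod + numBlocks else mod
    }

    naiveBlock(-7, 4)  // -3, an invalid index
    hashBlock(-7, 4)   // 1, always valid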
CC'ed dev list in case someone volunteers to work on it.
Best,
Xiangrui
On Thu, Mar 27, 2014 at 8:38 PM, bearrito wrote:
Usage of negative product IDs causes the above exception.
The cause is the use of the product IDs as a mechanism to index into
the in- and out-block structures.
Specifically, on 0.9.0 it occurs at
org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$make
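For reference, a minimal reproduction along these lines (hypothetical values, not my actual data) triggers it on affected versions:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.recommendation.{ALS, Rating}

    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("als-negative-id"))
    // A single negative product ID is enough to trigger the exception.
    val ratings = sc.parallelize(Seq(
      Rating(1, -42, 5.0),
      Rating(2, 7, 3.0)))
    val model = ALS.train(ratings, 10, 5)  // rank = 10, iterations = 5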