Hi bearrito,

This is a known issue
(https://spark-project.atlassian.net/browse/SPARK-1281) and it should
be easy to fix by switching to a hash partitioner.
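
Roughly, the block for an id should come from a non-negative hash rather than
from the raw id itself. A minimal sketch of what I mean (illustrative names,
not the actual patch), mirroring what Spark's HashPartitioner does:

    // Map an arbitrary (possibly negative) id to a block in [0, numBlocks).
    def nonNegativeMod(x: Int, mod: Int): Int = {
      val raw = x % mod
      if (raw < 0) raw + mod else raw
    }

    class HashBlockPartitioner(numBlocks: Int) extends org.apache.spark.Partitioner {
      def numPartitions: Int = numBlocks
      def getPartition(key: Any): Int = nonNegativeMod(key.hashCode, numBlocks)
      override def equals(other: Any): Boolean = other match {
        case p: HashBlockPartitioner => p.numPartitions == numBlocks
        case _ => false
      }
    }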

CC'ed dev list in case someone volunteers to work on it.

Best,
Xiangrui

On Thu, Mar 27, 2014 at 8:38 PM, bearrito <j.barrett.straus...@gmail.com> wrote:
> Usage of negative product ids causes the above exception.
>
> The cause is the use of the product ids as a mechanism to index into the
> in and out block structures.
>
> Specifically, on 0.9.0 it occurs at
> org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$makeInLinkBlock$2.apply(ALS.scala:262)
>
> It seems reasonable to expect that product ids are positive, if a bit
> opinionated.  I ran across this because the hash function I was using on my
> product ids includes negative values in its range.
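
That matches how the remainder operator behaves on the JVM: % keeps the sign
of the dividend, so if the block index is derived with something like
productId % numBlocks, a negative id yields a negative index and the array
lookup throws. A tiny illustration (not the actual ALS code):

    val numBlocks = 4
    val productId = -7
    val blockIndex = productId % numBlocks   // -3, because % keeps the dividend's sign
    val inLinkBlocks = new Array[Int](numBlocks)
    inLinkBlocks(blockIndex)                 // throws ArrayIndexOutOfBoundsException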
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/ArrayIndexOutOfBoundsException-in-ALS-implicit-tp3400.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
