Using negative product IDs causes the above exception. The cause is that the product IDs are used directly to index into the in-link and out-link block structures.
Specifically, on Spark 0.9.0 it occurs at org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$makeInLinkBlock$2.apply(ALS.scala:262). It seems reasonable to expect that product IDs are positive, if a bit opinionated. I ran across this because the hash function I was using on my product IDs includes negative values in its range.
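
Not from the original post, but a minimal sketch of one possible workaround under the assumption that product keys are hashed to integer IDs: mask off the sign bit so the hashed IDs stay non-negative before building the Ratings passed to ALS.trainImplicit. The object name, field names, sample data, and parameter values below are illustrative only.

    import org.apache.spark.SparkContext
    import org.apache.spark.mllib.recommendation.{ALS, Rating}

    object NonNegativeProductIds {
      // Clear the sign bit so hashed IDs always land in [0, Int.MaxValue].
      def toNonNegativeId(key: String): Int = key.hashCode & 0x7fffffff

      def train(sc: SparkContext) = {
        // (userKey, productKey, confidence) triples -- illustrative data only.
        val raw = sc.parallelize(Seq(
          ("user-1", "product-A", 3.0),
          ("user-2", "product-B", 1.0)))

        val ratings = raw.map { case (user, product, confidence) =>
          Rating(toNonNegativeId(user), toNonNegativeId(product), confidence)
        }

        // Arguments: ratings, rank, iterations, lambda, alpha (values chosen arbitrarily here).
        ALS.trainImplicit(ratings, 10, 10, 0.01, 1.0)
      }
    }

Masking the sign bit halves the hash range (and so slightly raises the collision probability), but it keeps every ID within the bounds the block-indexing code expects.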