Dear All,

I'm new to the whole Spark framework, but I've already fallen in love
with it :). For a research project at the University of Zurich, I'm
trying to implement a matrix Centroid Decomposition in Spark, using
the Java API.

My problem occurs when I call JavaPairRDD.reduce:
"""
java.lang.NoSuchMethodError:
org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
"""

I read in a forum post that the issue might be that I'm using the
0.9.1 version from Maven Central
(http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-td4937.html).

I downloaded the Git version of Spark and compiled it with sbt, but I
don't really know how to force my Java project to use that build
instead of the Maven version.
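
From the build documentation, I believe something like the following
installs the locally built artifacts into the local Maven repository
(~/.m2) so that my project's pom can pick them up; the 1.0.0-SNAPSHOT
version is just my assumption of what the current master pom declares,
so please correct me if I got that wrong:

"""
# In the Spark source checkout: build and install into the local Maven repo
mvn -DskipTests clean install
"""

and then, in my project's pom.xml, point the dependency at that version:

"""
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</dependency>
"""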

Does anyone have advice on how best to achieve this? I'm using
IntelliJ as my IDE, and the project is set up with Maven.


Best Regards
-- 
Alessandro De Carli
Sonnmattstr. 121
CH-5242 Birr

Email: decarli....@gmail.com
Twitter: @a_d_c_
Tel: +41 76 305 75 00
Web: http://www.papers.ch
