Hi all,

I am trying to run Spark on a MapR cluster. I successfully ran several
custom applications on a previous non-MapR Hadoop cluster, but I can't get
them working on the MapR one. To be more specific, I am not able to read
from or write to MapR-FS (maprfs) without running into a Java serialization
error. Note that everything works fine when I run the app in local mode,
which makes me suspect a dependency problem.

The test application is built using sbt with the following dependency:
 - org.apache.spark spark-core 0.9.1
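
For reference, the relevant part of my build.sbt looks roughly like this
(the exact Scala version is from memory, so treat it as approximate):

    name := "test-app"
    scalaVersion := "2.10.3"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"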

In my test_app/lib directory I have:
 - hadoop-core-1.0.3-mapr-3.0.2.jar
 - json-20140107.jar
 - maprfs-1.0.3-mapr-3.0.2.jar

Finally, I add those jars with conf.setJars so that they are distributed
across the cluster.
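
Concretely, the driver setup looks something like this (the master URL and
app name here are placeholders, not my real values):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("test-app")
      .setMaster("spark://master:7077")  // placeholder master URL
      .setJars(Seq(
        "lib/hadoop-core-1.0.3-mapr-3.0.2.jar",
        "lib/json-20140107.jar",
        "lib/maprfs-1.0.3-mapr-3.0.2.jar"))
    val sc = new SparkContext(conf)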

Am I compiling against the wrong dependencies? Should I use a "MapR
version" of spark-core?

Regards, Nelson


