Hi Paolo,
The custom classes and jars are distributed across the Spark cluster via an
HTTP server on the master when the absolute path of the application fat jar is
specified in the spark-submit script. This is covered in the Advanced
Dependency Management section at https://spark.apache.org/docs/latest/submittin
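For concreteness, the kind of invocation I mean looks roughly like this (the
class name, master URL and jar path are placeholders for your own):

    ./bin/spark-submit \
      --class com.example.MyJob \
      --master spark://master-host:7077 \
      /absolute/path/to/myapp-assembly.jar

With the jar given as an absolute local path, spark-submit ships it out to the
executors, so the classes inside it should be visible on the workers.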
After trying a few permutations I was able to get rid of this error by changing
the datastax-cassandra-connector version to 1.1.0.alpha. Could it be related to
a bug in alpha4? But why would it manifest itself as a class loading error?
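For reference, the change itself is just the connector line in build.sbt; with
the alpha4 version that I suspect, it reads:

    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-alpha4"

Swapping the version string at the end of that line was the change.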
On 27 Oct 2014, at 12:10, Saket Kumar wrote:
Hello all,
I am trying to run a Spark job, but while running it I am getting missing class
errors. First I got a missing class error for Joda DateTime, so I added the
dependency to my build.sbt, rebuilt the assembly jar, and tried again. This
time I am getting the same error for java.util.Date.
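(For context, a joda-time dependency in build.sbt looks like the line below;
the coordinates are joda-time's published Maven ones and 2.4 is only an example
version. Listing the rebuilt assembly jar with jar tf is a quick way to confirm
the class really made it into the fat jar; the jar path here is a placeholder.)

    libraryDependencies += "joda-time" % "joda-time" % "2.4"

    sbt assembly
    jar tf target/scala-2.10/myapp-assembly.jar | grep joda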
Hello all,
I am trying to unit test the classes involved in my Spark job. I am trying to
mock out the Spark classes (like SparkContext and Broadcast) so that I can
unit test my classes in isolation. However, I have realised that these are
classes instead of traits. My first question is: why?
It is quit
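A minimal sketch of the mock-free route that is often taken here instead: run
the code under test against a local-mode SparkContext inside the test JVM.
ScalaTest is assumed, and the suite and test names below are made up for
illustration:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.FunSuite

    class WordCountSpec extends FunSuite {
      test("counts words with a local SparkContext") {
        // "local[2]" runs Spark with two threads inside the test JVM, no cluster needed
        val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("WordCountSpec"))
        try {
          val counts = sc.parallelize(Seq("spark spark", "cassandra"))
            .flatMap(_.split(" "))
            .countByValue()
          assert(counts("spark") == 2)
        } finally {
          sc.stop()   // release the context so other suites can create their own
        }
      }
    }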