Hi Andrew,
Thanks for the current doc.
> I'd almost gotten to the point where I thought that my custom code needed
> to be included in the SPARK_EXECUTOR_URI but that can't possibly be
> correct. The Spark workers that are launched on Mesos slaves should start
> with the Spark core jars and then the job's own jars should be shipped to
> them separately.
Here's the 1.0.0rc9 version of the docs:
https://people.apache.org/~pwendell/spark-1.0.0-rc9-docs/running-on-mesos.html
I refreshed them with the goal of steering users more towards prebuilt
packages than relying on compiling from source plus improving overall
formatting and clarity, but not otherwise changing the content.
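The quoted confusion above is exactly the kind of thing the rewrite tries
to address: SPARK_EXECUTOR_URI should point at a prebuilt Spark
distribution tarball that the Mesos slaves can download, not at your job
code. Roughly, from the driver it would look like this (the master host
and tarball path are placeholders, so adjust them to your setup):

  import org.apache.spark.{SparkConf, SparkContext}

  // The executor URI must be reachable from every Mesos slave
  // (e.g. HDFS or HTTP); the path below is only a placeholder.
  val conf = new SparkConf()
    .setMaster("mesos://master-host:5050")
    .setAppName("MyJob")
    .set("spark.executor.uri", "hdfs://namenode/path/to/spark-1.0.0.tar.gz")
  val sc = new SparkContext(conf)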
Hi Tobias,
On Wed, May 21, 2014 at 5:45 PM, Tobias Pfeiffer wrote:
> first, thanks for your explanations regarding the jar files!
No prob :-)
> On Thu, May 22, 2014 at 12:32 AM, Gerard Maas wrote:
> > I was discussing it with my fellow Sparkers here and I totally overlooked
> > the fact that you need the class files to de-serialize the closures (or
> > whatever) on the workers, so you always need the jar file delivered to
> > the workers in order for them to run your code.
Hi Gerard,
first, thanks for your explanations regarding the jar files!
On Thu, May 22, 2014 at 12:32 AM, Gerard Maas wrote:
> I was discussing it with my fellow Sparkers here and I totally overlooked
> the fact that you need the class files to de-serialize the closures (or
> whatever) on the workers, so you always need the jar file delivered to
> the workers in order for them to run your code.
Hi Tobias,
Regarding my comment on closure serialization:
I was discussing it with my fellow Sparkers here and I totally overlooked
the fact that you need the class files to de-serialize the closures (or
whatever) on the workers, so you always need the jar file delivered to the
workers in order for them to run your code.
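In practice that means handing your job jar to the SparkContext so that
Spark ships it to every worker. A rough sketch (the jar path, master and
app name are placeholders, not from this thread):

  import org.apache.spark.{SparkConf, SparkContext}

  // List the jar(s) containing your classes so the workers can load them
  // when they deserialize the closures; the path is a placeholder.
  val conf = new SparkConf()
    .setMaster("mesos://master-host:5050")
    .setAppName("MyJob")
    .setJars(Seq("target/scala-2.10/myjob_2.10-0.1.jar"))
  val sc = new SparkContext(conf)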
Hi Tobias,
For your simple example, I just used sbt package, but for more complex jobs
that have external dependencies, either:
- you should use sbt assembly [1] or mvn shade plugin [2] to build a "fat
jar" (aka jar-with-dependencies)
- or provide a list of jars, including your job jar along with its
dependencies, when you create the SparkContext (see the sketch below)
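For the fat-jar route, a minimal sbt-assembly setup would look roughly
like the following (the plugin version is from memory, so double-check it
against [1]):

  // project/assembly.sbt
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

  // build.sbt
  import AssemblyKeys._

  assemblySettings

  name := "myjob"

  scalaVersion := "2.10.4"

  // Spark is already on the workers, so mark it "provided" to keep it
  // out of the fat jar.
  libraryDependencies +=
    "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

Running "sbt assembly" then produces a single jar under target/ that
bundles your code together with its non-provided dependencies.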
Gerard,
thanks very much for your investigation! After hours of trial and
error, I am kind of happy to hear it is not just a broken setup on my
side that's causing the error.
Could you explain briefly how you created that simple jar file?
Thanks,
Tobias
On Wed, May 21, 2014 at 9:47 PM, Gerard Maas wrote:
Hi Tobias,
I was curious about this issue and tried to run your example on my local
Mesos. I was able to reproduce your issue using your current config:
[error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task
1.0:4 failed 4 times (most recent failure: Exception failure:
java.lang.