I don't know if it's anything you or the project is missing... that's
just a JDK bug.
FWIW I am on 1.7.0_51 and have not seen anything like that.

I don't think it's a protobuf issue -- you don't crash the JVM with
simple version incompatibilities :)
--
Sean Owen | Director, Data Science | London


On Thu, Apr 17, 2014 at 7:29 PM, Steven Cox <s...@renci.org> wrote:
> So I tried a fix found on the list...
>
>    "The issue was due to meos version mismatch as I am using latest mesos
> 0.17.0, but spark uses 0.13.0.
>     Fixed by updating the SparkBuild.scala to latest version."
>
> I changed this line in SparkBuild.scala
>         "org.apache.mesos"         % "mesos"            % "0.13.0",
> to
>         "org.apache.mesos"         % "mesos"            % "0.18.0",
>
> ...ran make-distribution.sh, repackaged and redeployed the tar.gz to HDFS.
>
> It still core dumps like this:
> https://gist.github.com/stevencox/11002498
>
> In this environment:
>   Ubuntu 13.10
>   Mesos 0.18.0
>   Spark 0.9.1
>   JDK 1.7.0_45
>   Scala 2.10.1
>
> What am I missing?
