Hi Dale, yes, that could work. But a new problem has come up: tasks are always lost:
14/07/31 16:46:29 INFO TaskSetManager: Starting task 0.0:1 as TID 20 on
executor 20140731-154806-1694607552-5050-4716-7: bigdata008 (PROCESS_LOCAL)
14/07/31 16:46:29 INFO TaskSetManager: Serialized task 0.0:1
Okay, I finally got this. The Mesos version in project/SparkBuild.scala needed to be set, and only 0.19.0 seems to work (out of 0.14.1, 0.14.2, and 0.19.0).
"org.apache.mesos" % "mesos" % "0.19.0",
was the line that worked.
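For reference, the dependency pin described above would look roughly like this in generic sbt form (a sketch only: in Spark 0.9.x the line sits inside a larger Seq of dependencies in project/SparkBuild.scala, and the surrounding settings are omitted here):

```scala
// Sketch of the sbt dependency pin (generic sbt form).
// Pin the Mesos Java binding to the version your Mesos cluster runs,
// since a mismatch between the JAR and libmesos can crash the JVM:
libraryDependencies ++= Seq(
  "org.apache.mesos" % "mesos" % "0.19.0"
)
```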
I'm having the exact same problem. I tried Mesos 0.19, 0.14.2, 0.14.1,
Hadoop 2.3.0, Spark 0.9.1.
# SIGSEGV (0xb) at pc=0x7fab70c55c4d, pid=31012, tid=140366980314880
#
# JRE version: 6.0_31-b31
# Java VM: OpenJDK 64-Bit Server VM (23.25-b01 mixed mode linux-amd64
compressed oops)
# Problema
I am using Spark 0.9.1, Mesos 0.19.0, and Tachyon 0.4.1. Is Spark 0.9.1
compatible with Mesos 0.19.0?
2014-06-17 15:50 GMT+08:00 qingyang li :
Hi Steven, have you resolved this problem? I encounter the same
problem, too.
2014-04-18 3:48 GMT+08:00 Sean Owen :
Oh dear I read this as a build problem. I can build with the latest
Java 7, including those versions of Spark and Mesos, no problem. I did
not deploy them.
Mesos does have some native libraries, so it might well be some kind
of compatibility issue at that level. Anything more in the error log
that
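Since the native-library angle keeps coming up, a quick way to check whether the JVM can even locate libmesos before launching Spark is a minimal sketch like the one below. This is plain JVM mechanics, not a Spark or Mesos API; the object name is made up for illustration, and loading can still succeed while versions mismatch:

```scala
// Hypothetical standalone check: can this JVM resolve the Mesos native
// library (libmesos.so on Linux) from java.library.path? A JAR/native
// version mismatch is a common cause of SIGSEGVs like the one above.
object CheckMesosNative {
  def main(args: Array[String]): Unit = {
    // Show where the JVM will look for native libraries.
    println("java.library.path = " + System.getProperty("java.library.path"))
    try {
      System.loadLibrary("mesos") // standard JNI lookup, nothing Spark-specific
      println("libmesos loaded OK")
    } catch {
      case e: UnsatisfiedLinkError =>
        println("could not load libmesos: " + e.getMessage)
    }
  }
}
```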
From: andy petrella [andy.petre...@gmail.com]
Sent: Thursday, April 17, 2014 3:40 PM
To: user@spark.apache.org
Subject: Re: Spark 0.9.1 core dumps on Mesos 0.18.0
If you can test it quickly, an option would be to try with the
> # If you would like to submit a bug report, please visit:
> # http://bugreport.sun.com/bugreport/crash.jsp
>
> Steve
From: andy petrella [andy.petre...@gmail.com]
Sent: Thursday, April 17, 2014 3:21 PM
To: user@spark.apache.org
Subject: Re: Spark 0.9.1 core dumps on Mesos 0.18.0
No of course, but I was guessing some native libs imported (to communicate
with Mesos) in the project that... could miserably crash the JVM.
Anyway, so you're telling us that with this Oracle version you don't have any
issues using Spark on Mesos 0.18.0. That's interesting, because AFAIR
my last t
I don't know if it's anything you or the project is missing... that's
just a JDK bug.
FWIW I am on 1.7.0_51 and have not seen anything like that.
I don't think it's a protobuf issue -- you don't crash the JVM with
simple version incompatibilities :)
--
Sean Owen | Director, Data Science | London
Heya,
I still have to try it myself (I'm trying to create GCE images with Spark
on Mesos 0.18.0), but I think your change is one of the required ones;
however, my gut feeling is that others will be required to get this working.
Actually, in my understanding, this core dump is due to protobuf
incom
So I tried a fix found on the list:
"The issue was due to a Mesos version mismatch, as I am using the latest Mesos
0.17.0, but Spark uses 0.13.0.
Fixed by updating the SparkBuild.scala to the latest version."
I changed this line in SparkBuild.scala:
"org.apache.mesos" % "mesos"