Nope, there is no "distribution" and no spark-submit at the start of my process. But I found the problem: the behavior when loading the Mesos native dependency changed, and the static initialization block inside org.apache.mesos.MesosSchedulerDriver needed a specific reference to libmesos-1.0.0.so.
So, just for the record, setting the environment variable MESOS_NATIVE_JAVA_LIBRARY="/<path to where your mesos libs are>/libmesos-1.0.0.so" fixed the whole thing.
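In concrete terms, it's a one-liner in the driver's environment (a sketch; the install prefix below is an assumption, point it at wherever your Mesos libraries actually live):

    # Assumed install prefix; adjust to your actual libmesos location.
    export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos-1.0.0.so

Launching the driver from a shell with this set lets MesosSchedulerDriver's static initializer resolve the native library.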
Thanks for the help!
@michael if you want to talk about the setup we're using, we can discuss it directly.
 





On Tue, Jan 10, 2017 9:31 PM, Michael Gummelt mgumm...@mesosphere.io wrote:
What do you mean your driver has all the dependencies packaged?  What are "all
the dependencies"?  Is the distribution you use to launch your driver built with
-Pmesos?

On Tue, Jan 10, 2017 at 12:18 PM, Olivier Girardot <o.girar...@lateral-thoughts.com> wrote:
Hi Michael,
I did so, but that's not exactly the problem. You see, my driver has all the dependencies packaged, and only the executors fetch the tgz via spark.executor.uri. The strange thing is that I can see the org.apache.mesos:mesos-1.0.0-shaded-protobuf dependency in my classpath, packaged in the final dist of my app… so everything should work in theory.
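For concreteness, here is roughly what that setup looks like (a sketch only; the master address, app name, and tarball URI are placeholders, not our real values):

    // Sketch; master host, app name, and executor URI are placeholders.
    val spark = org.apache.spark.sql.SparkSession.builder()
      .master("mesos://10.0.0.1:5050")
      .appName("my-app")
      .config("spark.executor.uri", "hdfs://namenode/dist/spark-2.1.0-bin-hadoop2.7.tgz")
      .getOrCreate()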

 





On Tue, Jan 10, 2017 7:22 PM, Michael Gummelt mgumm...@mesosphere.io wrote:
Just build with -Pmesos:
http://spark.apache.org/docs/latest/building-spark.html#building-with-mesos-support
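That page gives a command along these lines:

    # From the Spark build docs; -Pmesos compiles in Mesos scheduler support.
    ./build/mvn -Pmesos -DskipTests clean package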

On Tue, Jan 10, 2017 at 8:56 AM, Olivier Girardot <o.girar...@lateral-thoughts.com> wrote:
I had the same problem; I added spark-mesos as a dependency and now I get:
[2017-01-10 17:45:16,575] {bash_operator.py:77} INFO - Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.apache.mesos.MesosSchedulerDriver
[2017-01-10 17:45:16,576] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.cluster.mesos.MesosSchedulerUtils$class.createSchedulerDriver(MesosSchedulerUtils.scala:105)
[2017-01-10 17:45:16,576] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.cluster.mesos.MesosCoarseGrainedSchedulerBackend.createSchedulerDriver(MesosCoarseGrainedSchedulerBackend.scala:48)
[2017-01-10 17:45:16,576] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.cluster.mesos.MesosCoarseGrainedSchedulerBackend.start(MesosCoarseGrainedSchedulerBackend.scala:155)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
[2017-01-10 17:45:16,578] {bash_operator.py:77} INFO - at scala.Option.getOrElse(Option.scala:121)
[2017-01-10 17:45:16,578] {bash_operator.py:77} INFO - at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
Is there any other dependency to add for Spark 2.1.0?
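For reference, the dependency line in question, in sbt terms (a sketch assuming the standard coordinates of the module that was split out in Spark 2.1):

    // Spark 2.1 moved the Mesos scheduler backend into its own module.
    libraryDependencies += "org.apache.spark" %% "spark-mesos" % "2.1.0"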

 





On Tue, Jan 10, 2017 1:26 AM, Abhishek Bhandari abhi10...@gmail.com wrote:
Glad that you found it.
On Mon, Jan 9, 2017 at 3:29 PM, Richard Siebeling <rsiebel...@gmail.com> wrote:
Probably found it: it turns out that Mesos support must now be explicitly added while building Spark. I assumed I could reuse the old build command from Spark 2.0.0... I didn't see the two lines added in the documentation... Maybe these kinds of changes could be listed in the changelog, under changes of behaviour or changes in the build process or something like that.
Kind regards,
Richard

On 9 January 2017 at 22:55, Richard Siebeling <rsiebel...@gmail.com> wrote:
Hi,
I'm setting up Apache Spark 2.1.0 on Mesos and I am getting a "Could not parse Master URL: 'mesos://xx.xx.xxx.xxx:5050'" error. Mesos itself is running fine (both the master and the slave; it's a single-machine configuration).
I really don't understand why this is happening, since the same configuration with Spark 2.0.0 runs fine within Vagrant. Could someone please help?
Thanks in advance,
Richard
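For reference, a minimal sketch of the code path that hits this (the master address is the redacted one from the error above):

    // Minimal repro sketch; the master address is redacted as above.
    val spark = org.apache.spark.sql.SparkSession.builder()
      .master("mesos://xx.xx.xxx.xxx:5050")
      .appName("test")
      .getOrCreate()
    // On a Spark build without Mesos support, SparkContext does not
    // recognize the mesos:// scheme and fails with "Could not parse Master URL".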






-- 
Abhishek J Bhandari
Mobile No. +1 510 493 6205 (USA)
Mobile No. +91 96387 93021 (IND)
R & D Department
Valent Software Inc., CA
Email: abhis...@valent-software.com

 

Olivier Girardot | Associé
o.girar...@lateral-thoughts.com
+33 6 24 09 17 94
 


-- 
Michael Gummelt
Software Engineer
Mesosphere




