Hi Bcjaes,
Sorry, I didn't see the previous thread, so I'm not sure what issues you are
running into.
In cluster mode the driver logs and results are all available through the Mesos
UI; you need to look at terminated frameworks if it's a job that has already
finished.
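For reference, a cluster-mode submission against the dispatcher looks roughly
like this (host names and the jar URL are placeholders, not taken from this
thread):

    ./bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master mesos://dispatcher-host:7077 \
      --deploy-mode cluster \
      http://repo-host/spark-examples.jar

Once a driver submitted this way finishes, look for it under the terminated
frameworks in the Mesos master UI, as mentioned above.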
I'll try to add more docs as we go.
I'm currently having the same issues. The documentation for the Mesos dispatcher
is sparse. I'll also add that I'm able to see the framework running in the
Mesos and Spark driver UIs, but when viewing the Spark job UI on a slave, no
job is seen.
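For anyone comparing notes, the setup being described is roughly the standard
one; a minimal sketch, with placeholder host names that are not from this
thread:

    # Start the dispatcher against the Mesos master; it registers as its own framework.
    ./sbin/start-mesos-dispatcher.sh --master mesos://mesos-master:5050

    # Dispatcher web UI (queued/launched drivers) is on port 8081 by default;
    # the Mesos master UI at mesos-master:5050 lists frameworks, incl. terminated ones.
    # The Spark job/stage UI is served by the driver itself (port 4040 by default),
    # on whichever node the driver was placed, not by every slave.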
>> ... cluster mode, is it still possible to create a Dispatcher using 1.4 and
>> run 1.3 using that dispatcher?
> --
> *From:* Jerry Lam [chiling...@gmail.com]
> *Sent:* Monday, July 20, 2015 8:27 AM
> *To:* Jahagirdar, Madhu
> *Cc:* user; d...@spark.apache.org
> *Subject:* Re: Spark Mesos Dispatcher
>
> Yes.
>
> Sent from my iPhone
>
> On 19 Jul, 2015, at 10:52 pm, "Jahagirdar, Madhu" <madhu.jahagir...@philips.com> wrote:
>
> All,
>
> Can we run different versions of Spark using the same Mesos Dispatcher? For
> example, can we run drivers with Spark 1.3 and Spark 1.4 at the same time?
>
> Regards,
> Madhu Jahagirdar
>
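As a footnote on the version question itself: the config that usually matters
is which Spark package a given submission points at, via spark.executor.uri.
A rough, untested sketch under that assumption (URLs, host names, and the
class/jar are placeholders):

    # Submit through a 1.4 dispatcher while pointing executors at a 1.3.1 package.
    # Whether the dispatcher also uses this URI when launching the driver on Mesos
    # is an assumption here, not something confirmed in this thread.
    ./bin/spark-submit \
      --master mesos://dispatcher-host:7077 \
      --deploy-mode cluster \
      --conf spark.executor.uri=http://repo-host/spark-1.3.1-bin-hadoop2.4.tgz \
      --class com.example.MyApp \
      http://repo-host/my-app-assembly.jar

Treat this as a starting point rather than a confirmed recipe for mixing 1.3
and 1.4 behind one dispatcher.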