A point to note, as per the docs as well:

Note that jars or python files that are passed to spark-submit should be
URIs reachable by Mesos slaves, as the Spark driver doesn't automatically
upload local jars.
http://spark.apache.org/docs/latest/running-on-mesos.html
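For example, a submission along these lines keeps that constraint in mind,
since each executor fetches the jar from HDFS instead of relying on the
driver to upload it. This is only a sketch; the host names, ports, paths,
and main class below are placeholders, not values from this thread:

  # Sketch only: the --jars URI must resolve from every Mesos slave,
  # e.g. an hdfs:// or http:// URI. A local file:// path that exists
  # only on the submitting machine will not be uploaded by the driver.
  spark-submit \
    --master mesos://mesos-master:5050 \
    --class com.example.Main \
    --jars hdfs://namenode:8020/user/path/to/third-party.jar \
    hdfs://namenode:8020/user/path/to/app.jar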

On Wed, May 11, 2016 at 10:05 PM, Giri P <gpatc...@gmail.com> wrote:

> I'm not using docker
>
> On Wed, May 11, 2016 at 8:47 AM, Raghavendra Pandey <
> raghavendra.pan...@gmail.com> wrote:
>
>> By any chance, are you using docker to execute?
>> On 11 May 2016 21:16, "Raghavendra Pandey" <raghavendra.pan...@gmail.com>
>> wrote:
>>
>>> On 11 May 2016 02:13, "gpatcham" <gpatc...@gmail.com> wrote:
>>>
>>> >
>>>
>>> > Hi All,
>>> >
>>> > I'm using the --jars option in spark-submit to send 3rd-party jars, but
>>> > I don't see them actually being passed to the Mesos slaves. I'm getting
>>> > class-not-found exceptions.
>>> >
>>> > This is how I'm using the --jars option:
>>> >
>>> > --jars hdfs://namenode:8082/user/path/to/jar
>>> >
>>> > Am I missing something here, or what's the correct way to do this?
>>> >
>>> > Thanks
>>> >
>>>
>>
>
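For completeness, a quick sanity check is to confirm from a slave host that
the URI passed to --jars actually resolves there. A sketch, assuming the
HDFS client is installed on the slave; the path is the one from the original
command:

  # Run on a Mesos slave: if this listing succeeds, the executors
  # should be able to fetch the jar from the same URI.
  hdfs dfs -ls hdfs://namenode:8082/user/path/to/jar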