It did not help, same error. Is this the issue I am running into?
https://issues.apache.org/jira/browse/SPARK-11638
Warning: Local jar /mnt/mesos/sandbox/spark-examples-1.6.0-hadoop2.6.0.jar
does not exist, skipping.
java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
On Thu,
Ah I see, I think it's because you've launched the Mesos slave in a docker
container, and when the executor is also launched in a container it's not
able to mount the sandbox into the other container, since the slave is in a
chroot.
Can you try mounting in a volume from the host when you launch the
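Tim's suggestion is presumably to mount the slave's work directory in from the host when starting the slave container, so that executor containers see the same sandbox path. A rough sketch, assuming the mesosphere slave image and made-up host paths and addresses (none of these are from the thread):

```shell
# Launch the Mesos slave container with its work dir bind-mounted from
# the host, so sandbox paths resolve identically inside and outside the
# container. Image tag, paths, and the ZK address are all illustrative.
docker run -d --net=host --privileged \
  -v /var/lib/mesos:/var/lib/mesos \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e MESOS_MASTER=zk://master-host:2181/mesos \
  -e MESOS_WORK_DIR=/var/lib/mesos \
  -e MESOS_CONTAINERIZERS=docker,mesos \
  mesosphere/mesos-slave:0.25.0
```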
Hi Tim,
I think I know the problem, but I do not have a solution. The Mesos slave is
supposed to download the jars from the specified URI and place them in the
$MESOS_SANDBOX location, but it is not downloading them and I am not sure
why. See the logs below.
My command looks like this:
docker run -it --rm -m 2g -e SPARK
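One thing worth checking, as a sketch (the hostnames and port below are made up): the Mesos fetcher can only place the application jar into $MESOS_SANDBOX if the URI is reachable from the slave, so serving the jar over HTTP instead of pointing at a local path avoids the "Local jar ... does not exist" warning:

```shell
# Submit with an HTTP URL so the Mesos fetcher can download the jar
# into the sandbox on the slave. Hostnames and ports are hypothetical;
# the jar name matches the one from the warning above.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  http://file-server:8000/spark-examples-1.6.0-hadoop2.6.0.jar
```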
You shouldn't need to specify --jars at all since you only have one jar.
The error is pretty odd as it suggests it's trying to load
/opt/spark/Example but that doesn't really seem to be anywhere in your
image or command.
Can you paste your stdout from the driver task launched by the cluster
dispa
See below; I have attached the Dockerfile used to build the Spark image (by
the way, I just upgraded to 1.6).
I am running the below setup -
Mesos Master - Docker Container
Mesos Slave 1 - Docker Container
Mesos Slave 2 - Docker Container
Marathon - Docker Container
Spark MESOS Dispatcher - Docker Cont
@Tim yes, this is asking about 1.5 though
On Wed, Mar 2, 2016 at 2:35 PM Tim Chen wrote:
> Hi Charles,
>
> I thought that's fixed with your patch in latest master now right?
>
> Ashish, yes please give me your docker image name (if it's in the public
> registry) and what you've tried and I can s
Hi Charles,
I thought that's fixed with your patch in latest master now right?
Ashish, yes please give me your docker image name (if it's in the public
registry) and what you've tried and I can see what's wrong. I think it's
most likely just the configuration of where the Spark home folder is in
Re: Spark on Mesos Warning regarding disk space:
https://issues.apache.org/jira/browse/SPARK-12330
That's a spark flaw I encountered on a very regular basis on mesos. That
and a few other annoyances are fixed in
https://github.com/metamx/spark/tree/v1.5.2-mmx
Here's another mild annoyance I'v
I have had no luck, and I would like to ask the Spark committers: will this
ever be designed to run on Mesos?
A Spark app as a docker container is not working at all on Mesos. If anyone
would like the code, I can send it over to have a look.
Ashish
On Wed, Mar 2, 2016 at 12:23 PM, Sathish Kumaran
Try passing the jar using the --jars option
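For reference, a minimal client-mode submit with --jars might look like the following sketch (the paths and master address are placeholders); note that the application jar itself is still the last positional argument, and --jars is only for additional dependencies:

```shell
# --jars distributes extra dependency jars to the executors; the main
# application jar is passed separately as the final argument.
# All paths and hosts below are illustrative.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://master-host:5050 \
  --jars /opt/libs/extra-dep.jar \
  /opt/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar 100
```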
On Wed, Mar 2, 2016 at 10:17 AM Ashish Soni wrote:
> I made some progress but now I am stuck at this point. Please help, as it
> looks like I am close to getting it working
>
> I have everything running in docker container including mesos slave and
> master
>
> When i
I made some progress but now I am stuck at this point. Please help, as it
looks like I am close to getting it working.
I have everything running in docker containers, including the Mesos slave
and master.
When I try to submit the Pi example I get the below error:
Error: Cannot load main class from JAR file:/opt/spa
Can you go through the Mesos UI and look at the driver/executor log from the
stderr file and see what the problem is?
Tim
> On Mar 1, 2016, at 8:05 AM, Ashish Soni wrote:
>
> Not sure what the issue is, but I am getting the below error when I try to
> run the Spark Pi example
>
> Blacklisting Mesos slave
Not sure what the issue is, but I am getting the below error when I try to
run the Spark Pi example:
Blacklisting Mesos slave value: "5345asdasdasdkas234234asdasdasdasd"
due to too many failures; is Spark installed on it?
WARN TaskSchedulerImpl: Initial job has not accepted any
resources; check your
Maybe the Mesos executor couldn't find the Spark image, or the constraints
are not satisfied. Check your Mesos UI to see if the Spark application
appears in the Frameworks tab.
On Mon, Feb 29, 2016 at 12:23 PM Ashish Soni wrote:
> What is the best practice? I have everything running as a docker container
> in sin
What is the best practice? I have everything running as a docker container
on a single host (Mesos and Marathon also as docker containers), and
everything comes up fine, but when I try to launch the spark shell I get the
below error:
SQL context available as sqlContext.
scala> val data = sc.parallelize(
No, you don't have to run Mesos in docker containers to run Spark in docker
containers.
Once you have a Mesos cluster running, you can then specify the Spark
configurations in your Spark job (i.e.:
spark.mesos.executor.docker.image=mesosphere/spark:1.6)
and Mesos will automatically launch docker contai
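Putting that together, a submit that asks Mesos to run executors inside a docker image could look like this sketch (the master address and file-server URL are just examples; the image tag follows Tim's):

```shell
# spark.mesos.executor.docker.image tells Mesos to launch each executor
# inside the given docker image; spark.executor.uri points at a Spark
# distribution tarball reachable by the slaves. Hosts are placeholders.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://master-host:5050 \
  --conf spark.mesos.executor.docker.image=mesosphere/spark:1.6 \
  --conf spark.executor.uri=http://file-server:8000/spark-1.6.0-bin-hadoop2.6.tgz \
  /opt/spark/lib/spark-examples-1.6.0-hadoop2.6.0.jar
```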
Yes, I read that, and there are not many details there.
Is it true that we need to have Spark installed on each Mesos docker
container (master and slave)?
Ashish
On Fri, Feb 26, 2016 at 2:14 PM, Tim Chen wrote:
> https://spark.apache.org/docs/latest/running-on-mesos.html should be the
> best source, w
https://spark.apache.org/docs/latest/running-on-mesos.html should be the
best source, what problems were you running into?
Tim
On Fri, Feb 26, 2016 at 11:06 AM, Yin Yang wrote:
> Have you read this ?
> https://spark.apache.org/docs/latest/running-on-mesos.html
>
> On Fri, Feb 26, 2016 at 11:03
Hi All,
Is there any proper documentation on how to run Spark on Mesos? I have been
trying for the last few days and am not able to make it work.
Please help
Ashish
Have you read this ?
https://spark.apache.org/docs/latest/running-on-mesos.html
On Fri, Feb 26, 2016 at 11:03 AM, Ashish Soni wrote:
> Hi All ,
>
> Is there any proper documentation on how to run Spark on Mesos? I have been
> trying for the last few days and am not able to make it work.
>
> Please help