I'd suggest asking about this on the Mesos list (CCed). As far as I know, there
was some ongoing work on this.
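
In the meantime, here's a rough sketch of what I'd expect the eventual
submission to look like: a config property that points the Mesos executor at a
Docker image which already contains a JVM (and whatever other dependencies you
need), so nothing has to be pushed to the slaves ahead of time. Note that the
property name (spark.mesos.executor.docker.image) and the image/master values
below are my assumptions about where the ongoing work is headed, not a released
API:

  # Sketch only: assumes Docker support for Mesos executors along these lines
  # ships in a future Spark release; property and image names are placeholders.
  spark-submit \
    --master mesos://zk://zk1:2181,zk2:2181,zk3:2181/mesos \
    --conf spark.mesos.executor.docker.image=example/spark-executor:latest \
    --class org.apache.spark.examples.SparkPi \
    spark-examples-1.1.1.jar 100

The image itself would bundle the JVM and a Spark distribution, which is also a
reasonable place to bake in the other dependencies you mentioned.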

Matei

> On Dec 3, 2014, at 9:46 AM, Dick Davies <d...@hellooperator.net> wrote:
> 
> Just wondered if anyone had managed to start Spark
> jobs on Mesos wrapped in a Docker container?
> 
> At present (i.e. very early testing) I'm able to submit executors
> to Mesos via spark-submit easily enough, but they fall over
> as we don't have a JVM on our slaves out of the box.
> 
> I can push one out via our CM system if push comes to shove,
> but it'd be nice to have that as part of the job (I'm thinking it might
> be a way to get some of the dependencies deployed too).
> 
> Bear in mind I'm a totally clueless newbie at this, so please be gentle
> if I'm doing this completely wrong.
> 
> Thanks!
> 


