This page, http://spark.apache.org/docs/latest/running-on-mesos.html,
covers many of these questions. If you submit a job with the
"--supervise" option, it will be restarted if it fails.

You can use Chronos for scheduling the batch job. For the every-10-minutes
requirement, you could instead create a single streaming job with a
10-minute batch interval, if that fits your needs.
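A minimal sketch of that streaming alternative (the app name and the
socket input source are just illustrative; swap in your real source,
e.g. Kafka):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Minutes, StreamingContext}

    object TenMinuteJob {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("TenMinuteJob")
        // Each batch covers 10 minutes of data, so the work runs every 10 minutes.
        val ssc = new StreamingContext(conf, Minutes(10))

        // Illustrative input; replace with your actual source.
        val lines = ssc.socketTextStream("localhost", 9999)
        lines.count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }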

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Wed, Jul 22, 2015 at 3:53 AM, boci <boci.b...@gmail.com> wrote:

> Hi guys!
>
> I'm new to Mesos. I have two Spark applications (one streaming and one
> batch), and I want to run both in a Mesos cluster. For testing I want to
> run it in Docker containers, so I started a simple redjack/mesos-master,
> but a lot of things are still unclear to me (about both Mesos and
> Spark-on-Mesos).
>
> If I have a Mesos cluster (for testing it will be some Docker containers),
> do I need a separate machine (container) to run my Spark jobs? Or can I
> submit to the cluster and schedule them (with Chronos or something else)?
> How can I run the streaming job? What happens if the "controller" dies?
> Or, if I call spark-submit with master=mesos, does my application start
> and I can then forget about it? How can I run a job every 10 minutes
> without submitting it every 10 minutes? How can I run my streaming app in
> HA mode?
>
> Thanks
>
> b0c1
>
>
> ----------------------------------------------------------------------------------------------------------------------------------
> Skype: boci13, Hangout: boci.b...@gmail.com
>
