Whoa, wait, the docker scripts are only used for testing purposes right
now. They have not been designed with the intention of replacing the
spark-ec2 scripts. For instance, there isn't an ssh server running, so you
can't stop and restart the cluster (the way sbin/stop-all.sh does). Also,
we currently mount SPARK_HOME into the containers, so that recompiling
Spark in one place lets all containers on that machine run against the
updated jars (which is great for testing but maybe not so much for
production).
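
To make the SPARK_HOME-mounting idea concrete, here's a rough sketch of
what that looks like (the image name, mount point, and master URL below
are made up for illustration, not what the actual scripts use):

    # Hypothetical example: mount the host's Spark build into the
    # container, so recompiling on the host updates every container.
    docker run -d --name spark-worker \
      -v "$SPARK_HOME":/opt/spark \
      spark-test-image \
      /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker \
      spark://spark-master:7077

Since the jars live on the host, you only rebuild once and every worker
container on that machine picks up the change the next time it runs.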

It is possible to get these scripts to a state where they are generally
usable for cluster deployment, but that would probably take nontrivial
effort, and as far as I know no one is actively working toward that goal.
Also, I'd take a look at the amplab docker-scripts, which are more
fleshed out: https://github.com/amplab/docker-scripts


On Sun, Mar 9, 2014 at 9:33 AM, Aureliano Buendia <buendia...@gmail.com> wrote:

> Hi,
>
>
> Is the spark docker script now mature enough to substitute for the spark-ec2
> script? Is anyone here using the docker script in production?
>
