It isn't mentioned anywhere in the doc
<https://spark.apache.org/docs/latest/spark-standalone.html#high-availability>,
but you will probably need a separate ZK ensemble for each of your HA clusters.
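
For reference, this is roughly what the HA setup from that page looks like in
each cluster's spark-env.sh (the ZK hostnames below are placeholders):

    # Enable ZooKeeper-based master recovery for this Spark cluster
    export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
      -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
      -Dspark.deploy.zookeeper.dir=/spark"

Note that the masters keep their recovery state under
spark.deploy.zookeeper.dir (default /spark), so two clusters pointed at the
same ensemble with the same dir would presumably clobber each other's state.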

Thanks
Best Regards

On Wed, Apr 22, 2015 at 12:02 AM, Michal Klos <michal.klo...@gmail.com>
wrote:

> Hi,
>
> I'm trying to set up multiple Spark clusters with high availability, and I
> was wondering if I can re-use a single ZK cluster to manage them. It's not
> very clear in the docs, and it seems like the answer may be that I need a
> separate ZK cluster for each Spark cluster?
>
> thanks,
> M
>
