Change your Spark settings so that the REPL does not get the whole cluster,
e.g. by reducing the executor memory and CPU allocation.
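For example, something along these lines in the Spark interpreter properties
(or in spark-defaults.conf) should work -- the numbers below are only
placeholders, size them for your own YARN cluster:

    # cap each notebook's Spark application so it does not take the whole cluster
    spark.executor.instances   2
    spark.executor.memory      2g
    spark.executor.cores       2

With a smaller per-notebook footprint, YARN has room left to schedule a second
notebook's application alongside the first instead of queueing it.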

Mohit Jaggi
Founder,
Data Orchard LLC
www.dataorchardllc.com




> On Oct 5, 2016, at 11:02 AM, Mark Libucha <mlibu...@gmail.com> wrote:
> 
> Hi everyone,
> 
> I've got Zeppelin running against a Cloudera/Yarn/Spark cluster and 
> everything seems to be working fine. Very cool.
> 
> One minor issue, though. When one notebook is running, others queue up behind 
> it. Is there a way to run multiple notebooks concurrently? Both notebooks are 
> running the pyspark interpreter.
> 
> Thanks,
> 
> Mark
> 
