Hi,

Recently, the dynamic allocation feature for Spark on YARN was enabled on our 
cluster due to an increase in workload. Around the same time, I upgraded 
Zeppelin to work with Spark 1.3.1.

Now the SparkContext that is created in the notebook is short-lived. Every 
time I run a command, it throws an error saying the SparkContext has been 
stopped.

Do I have to provide some configuration in zeppelin-env.sh or the interpreter 
settings to make Zeppelin work with YARN dynamic allocation?
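For what it's worth, this is roughly what I was planning to try in 
zeppelin-env.sh. It is only a sketch: I am assuming SPARK_SUBMIT_OPTIONS is 
picked up by this Zeppelin build and passed through to spark-submit, and the 
min/max executor values are just example numbers. The spark.dynamicAllocation.* 
and spark.shuffle.service.enabled properties are the standard Spark 1.3 ones.

    # Assumption: this Zeppelin build passes SPARK_SUBMIT_OPTIONS to spark-submit.
    # Standard Spark 1.3 dynamic allocation settings; the external shuffle
    # service must also be running on the YARN NodeManagers.
    # minExecutors/maxExecutors below are example values only.
    export SPARK_SUBMIT_OPTIONS="--conf spark.dynamicAllocation.enabled=true \
      --conf spark.shuffle.service.enabled=true \
      --conf spark.dynamicAllocation.minExecutors=2 \
      --conf spark.dynamicAllocation.maxExecutors=10"

Or should these go in as properties on the Spark interpreter setting instead 
of zeppelin-env.sh?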
Regards,
Sambit.
