Hi, right now I'm not aware of such a configuration in Zeppelin (please feel free to open an issue or submit a patch).
AFAIK dynamic YARN resource allocation is not configured by default and is left up to the user; enabling it looks like one possible solution to the problem you describe (at least CPU-wise).

As a workaround for your use case, you can manually restart the Spark interpreter (Interpreter menu -> restart). Because of lazy loading, it will not occupy any resources until somebody actually runs it again.

Hope this helps!

On Tue, Jun 30, 2015 at 8:23 AM, Litt, Shaun <sl...@conversantmedia.com> wrote:
> Hi, I am new to Zeppelin and just got it configured to run in my YARN
> cluster, but I was wondering if there is a configuration or even a hard
> setting that shuts down interpreters after inactivity. It seems like the
> interpreter (and its YARN consumption) hangs around indefinitely. Ideally
> there would be a clean way (like a logout or a shutdown button within the
> notebook) to shut down these interpreters, but additionally there should
> be a way for an admin of Zeppelin to impose an idle timeout. As a note on
> the scope of this, is dynamic YARN resource allocation configured (such
> that once a paragraph finishes it can release vcores)?
>
> Thanks,
>
> Shaun

--
Kind regards,
Alexander.
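P.S. For reference, Spark's dynamic allocation on YARN (the approach mentioned above) is typically enabled with settings along these lines in spark-defaults.conf. The values below are illustrative examples, not recommendations, and the YARN external shuffle service must also be set up on the NodeManagers for this to work:

    spark.dynamicAllocation.enabled          true
    spark.shuffle.service.enabled            true
    # release executors after they have been idle this long (example value)
    spark.dynamicAllocation.executorIdleTimeout  60s
    # allow the application to scale all the way down between paragraphs
    spark.dynamicAllocation.minExecutors     0

With settings like these, executors that sit idle after a paragraph finishes are released back to YARN. Note that this frees executors only; the interpreter's driver / YARN application master container stays allocated until the interpreter itself is restarted.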