Hi everyone,
I am using Zeppelin on AWS EMR (Zeppelin 0.6.1, Spark 2.0 on YARN).
The Zeppelin Spark interpreter's Spark job does not finish after a notebook has run,
and it keeps holding a large amount of memory in my YARN cluster.
Is there a way to restart the Spark interpreter automatically?
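One possible workaround is to restart the interpreter setting through Zeppelin's REST API
(PUT /api/interpreter/setting/restart/<settingId>) from a scheduled job. Below is a minimal
Scala sketch; the host, port and setting ID are placeholders -- look up the real setting ID
with GET /api/interpreter/setting on your cluster.

import java.net.{HttpURLConnection, URL}

object RestartSparkInterpreter {
  def main(args: Array[String]): Unit = {
    // Placeholder endpoint and interpreter-setting ID -- adjust for your cluster.
    val settingId = "2CKEKWY8Z"
    val url = new URL(s"http://localhost:8890/api/interpreter/setting/restart/$settingId")

    val conn = url.openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("PUT")   // Zeppelin's restart-interpreter-setting endpoint uses PUT
    println(s"Restart request returned HTTP ${conn.getResponseCode}")
    conn.disconnect()
  }
}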
> [...] executors very coarsely grained since there's only one per node. It would
> still allow multiple applications to run at once though, because executors
> from one application could spin down when idle, allowing another
> application to spin up executors.
>
> Hope this helps,
> Jonathan
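The "spin down when idle" behaviour described above is Spark dynamic allocation. A minimal
sketch of the relevant settings is below, assuming the external shuffle service is available
on the YARN NodeManagers; in Zeppelin these belong in the spark interpreter setting or in
spark-defaults.conf, because the interpreter creates the SparkContext before notebook code runs.

import org.apache.spark.SparkConf

// Example values only -- tune executor counts and idle timeout for your cluster.
val conf = new SparkConf()
  .setAppName("zeppelin-notebook")
  .set("spark.dynamicAllocation.enabled", "true")            // release executors when idle
  .set("spark.shuffle.service.enabled", "true")              // required for dynamic allocation on YARN
  .set("spark.dynamicAllocation.minExecutors", "1")
  .set("spark.dynamicAllocation.maxExecutors", "10")
  .set("spark.dynamicAllocation.executorIdleTimeout", "60s")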
Hi zeppelin users,
I have an issue after upgrading Zeppelin from 0.6.2 to 0.7.2.
I am using the spark-redis 0.3.2 library to load Redis values.
To use that library, I have to set the "redis.host" property on the Spark config instance.
This used to work on Zeppelin 0.6.2 but does not in 0.7.2.
How can I set Spark config properties in Zeppelin 0.7.2?
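For reference, this is roughly how spark-redis 0.3.x takes "redis.host" when you create the
context yourself (e.g. in spark-shell or spark-submit). It is only a sketch: the host is a
placeholder and fromRedisKV is the key/value read as I recall it from that version's API. In
Zeppelin the interpreter creates the SparkContext, so the property has to come from the
interpreter setting rather than from notebook code.

import org.apache.spark.{SparkConf, SparkContext}
import com.redislabs.provider.redis._   // spark-redis implicits

// Placeholder connection details -- replace with your Redis endpoint.
val conf = new SparkConf()
  .setAppName("spark-redis-example")
  .set("redis.host", "my-redis-host")
  .set("redis.port", "6379")

val sc = new SparkContext(conf)

// Read string key/value pairs whose keys match a pattern.
val kv = sc.fromRedisKV("user:*")
kv.take(10).foreach(println)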
> [...] https://issues.apache.org/jira/browse/ZEPPELIN-2893 for that.
> And will fix it in 0.7.3
>
> Jung, Soonoh wrote on Friday, September 1, 2017 at 5:33 PM:
>
>> Hi zeppelin users,
>>
>> I have an issue after upgrading Zeppelin from 0.6.2 to 0.7.2.
>>
>> I am using the spark-redis 0.3.2 library to load Redis values.