Hello.
The following thread may help you.
http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Can-not-configure-driver-memory-size-td1513.html
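In short, the usual gotcha is that `driver-memory` is not a valid `--conf` key; the Spark property is `spark.driver.memory`. A minimal sketch of what typically works in `zeppelin-env.sh` (the exact variable names and supported options depend on your Zeppelin release, so treat this as an assumption to verify against your version's docs):

```shell
# zeppelin-env.sh -- sketch, not verified against every Zeppelin release.
# "--conf driver-memory=6g" silently does nothing because "driver-memory"
# is not a Spark property; use spark.driver.memory or the spark-submit flag.

# Pass the memory sizes through to spark-submit:
export SPARK_SUBMIT_OPTIONS="--driver-memory 6g --executor-memory 6g"

# Equivalent form using Spark properties instead of flags:
# export SPARK_SUBMIT_OPTIONS="--conf spark.driver.memory=6g --conf spark.executor.memory=6g"
```

Restart the Zeppelin daemon after editing the file so the interpreter picks up the new settings.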


2016-01-30 22:45 GMT+09:00 shahab <shahab.mok...@gmail.com>:

> Hi,
>
> I am running Zeppelin on Amazon EMR Spark and I keep facing an "out of
> memory" problem while loading a large CSV file.
> By default, Zeppelin sets 512 MB for the driver and 142 MB for each of the
> two executors.
> I tried to increase them by placing the following configuration params in
> "zeppelin-env.sh", but it had no effect.
>
> --conf driver-memory=6g --conf spark.executor.memory=6g
>
> I would appreciate it if you could share your comments and experience on
> how to fix this.
>
> best,
> /Shahab
>
>
