Hi,

You should not set these JVM options via `extraJavaOptions` directly; use
`spark.executor.memory` and `spark.driver.memory` to tune the memory
instead.
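
For example, something like this (just a sketch; the sizes are illustrative,
not recommendations, so adjust them to your cluster, and replace the jar and
arguments with your own):

  spark-submit \
    --conf spark.executor.memory=4g \
    --conf spark.driver.memory=2g \
    --conf spark.yarn.executor.memoryOverhead=1024 \
    <your app jar and arguments>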

// maropu

On Thu, Apr 14, 2016 at 11:32 AM, Divya Gehlot <divya.htco...@gmail.com>
wrote:

> Hi,
> I am using Spark 1.5.2 with Scala 2.10, and my Spark jobs keep failing with
> exit code 143, except for one job where I am using unionAll and a groupBy
> operation on multiple columns.
>
> Please advise me on options to optimize it.
> The one option which I am using now is:
> --conf spark.executor.extraJavaOptions  -XX:MaxPermSize=1024m
> -XX:PermSize=256m --conf spark.driver.extraJavaOptions
>  -XX:MaxPermSize=1024m -XX:PermSize=256m --conf
> spark.yarn.executor.memoryOverhead=1024
>
> I need to know the best practices / better ways to optimize the code.
>
> Thanks,
> Divya
>
>


-- 
---
Takeshi Yamamuro
