Divya,

Based on my recent Spark tuning experience, the optimal executor-memory
size depends not only on your workload characteristics (e.g. the working
set size at each job stage) and input data size, but also on your total
available memory and on the memory requirements of other components such
as the driver (which in turn depend on how your workload interacts with
the driver) and the underlying storage. In my opinion, it would be
difficult to derive one generic, easy formula that captures all of these
dynamic relationships.
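
That said, a commonly cited community rule of thumb for YARN clusters can
at least give you a starting point to tune from. The cluster sizes below
are purely hypothetical, and the "about 5 cores per executor" figure is a
heuristic, not something Spark computes for you:

    # Hypothetical cluster: 6 nodes, each with 16 cores and 64 GB RAM.
    # Reserve 1 core and ~1 GB per node for the OS and Hadoop daemons
    #   -> 15 usable cores and ~63 GB per node.
    # Cap executors at ~5 cores each (a heuristic for good HDFS
    # throughput) -> 15 / 5 = 3 executors per node, 18 in total;
    # leave 1 slot for the YARN ApplicationMaster -> 17 executors.
    # Memory per executor: 63 GB / 3 = 21 GB, minus ~7% for
    # spark.yarn.executor.memoryOverhead -> roughly 19 GB.
    spark-submit \
      --master yarn \
      --num-executors 17 \
      --executor-cores 5 \
      --executor-memory 19G \
      ...

Even then, treat these numbers only as a first guess and adjust them
based on what you observe in the Spark UI for your actual stages.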


Best Regards,
Jia

On Wed, Feb 3, 2016 at 12:13 AM, Divya Gehlot <divya.htco...@gmail.com>
wrote:

> Hi,
>
> I would like to know how to calculate how much --executor-memory we
> should allocate, and how many --num-executors and --total-executor-cores
> we should give when submitting Spark jobs.
> Is there a formula for it?
>
>
> Thanks,
> Divya
>
