Hi,
Have you seen the slides from Spark Summit 2016?
https://spark-summit.org/2016/events/top-5-mistakes-when-writing-spark-applications/
These slides are a good reference for your capacity planning.
// maropu
On Tue, Jul 12, 2016 at 2:31 PM, Yash Sharma wrote:
I would say use dynamic allocation rather than a fixed number of executors.
Specify the executor memory you would like. Deciding on the values requires
a couple of test runs and checking what works best for you.
You could try something like -
--driver-memory 1G \
--executor-memory 2G \
--executor-cores 2
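
If you go the dynamic allocation route instead, a minimal sketch would be
(the min/max bounds below are placeholder values you would tune for your
cluster; dynamic allocation also needs the external shuffle service enabled):

--conf spark.dynamicAllocation.enabled=true \
--conf spark.shuffle.service.enabled=true \
--conf spark.dynamicAllocation.minExecutors=2 \
--conf spark.dynamicAllocation.maxExecutors=20 \
--executor-memory 2G \
--executor-cores 2

Spark will then grow and shrink the executor count between those bounds
based on the number of pending tasks.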
That configuration looks bad: only two cores in use and just 1G for the
app. A few points -
1. Oversubscribe those CPUs to at least twice the number of physical cores
you have to start with, then tune if it freezes (see the sketch after these
points).
2. Allocate all of the CPU cores and memory to your running app (I assume
it is the only application running on the cluster).
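
As a rough sketch, assuming a hypothetical node with 8 physical cores and
16G of RAM (your numbers will differ), oversubscribing cores twice and
giving the app most of the node might look like:

--num-executors 4 \
--executor-cores 4 \
--executor-memory 3G \
--driver-memory 2G

That is 16 virtual cores on 8 physical ones, with roughly 2G left over for
the OS and cluster-manager overhead.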