I'm actually surprised your memory is that high. Spark only allocates
spark.storage.memoryFraction of each executor's heap for storing RDDs. This
defaults to 0.6, so 32 GB * 0.6 * 10 executors should be a total of 192 GB.
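For example (a minimal sketch against the Spark 1.x conf API; the app name
and the 32g value are placeholders for your setup, not anything you posted):

    import org.apache.spark.{SparkConf, SparkContext}

    // Per-executor heap, and the fraction of it reserved for cached RDDs.
    // spark.storage.memoryFraction defaults to 0.6 in Spark 1.x.
    val conf = new SparkConf()
      .setAppName("storage-fraction-sketch")
      .set("spark.executor.memory", "32g")
      .set("spark.storage.memoryFraction", "0.6")

    val sc = new SparkContext(conf)
    // Aggregate RDD storage across 10 such executors:
    // 32 GB * 0.6 * 10 = 192 GB.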
-Sandy
On Sat, Sep 20, 2014 at 8:21 AM, Soumya Simanta wrote:
There are 128 cores on each box. Yes, there are other applications running on
the cluster. YARN is assigning two containers to my application. I'll
investigate this a little more. PS: I'm new to YARN.
On Fri, Sep 19, 2014 at 4:49 PM, Vipul Pandey wrote:
How many cores do you have in your boxes?
It looks like you are assigning 32 cores "per" executor - is that what you want?
Are there other applications running on the cluster? You might want to check
the YARN UI to see how many containers are getting allocated to your
application. If you want to pin this down explicitly, see the sketch below.
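A hedged sketch of requesting fewer cores per executor (property names are
from the Spark-on-YARN 1.x configuration; the values 4 and 10 are only
illustrative, not taken from your job):

    import org.apache.spark.SparkConf

    // Ask YARN for 10 executors with 4 cores each,
    // instead of 32 cores on a single executor.
    val conf = new SparkConf()
      .set("spark.executor.cores", "4")
      .set("spark.executor.instances", "10")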
On Sep 19, 2014, a