(I'm not 100% sure, but...) I think the SPARK_EXECUTOR_* environment
variables are intended to be used with Spark Standalone. Even if not, I'd
recommend setting the corresponding properties in spark-defaults.conf
rather than in spark-env.sh.
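For reference, the spark-defaults.conf properties that correspond to those
environment variables would look roughly like this (the values below are only
placeholders, not a recommendation for your cluster -- please check the Spark
configuration docs for your version):

    # instead of SPARK_EXECUTOR_MEMORY / SPARK_EXECUTOR_CORES / SPARK_EXECUTOR_INSTANCES
    spark.executor.memory     15g
    spark.executor.cores      4
    spark.executor.instances  6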
For example, you may use the following configuration object:
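(Something along these lines -- this follows the classification format shown
on the EMR page linked below, so please verify the exact syntax there:)

    [
      {
        "Classification": "spark",
        "Properties": {
          "maximizeResourceAllocation": "true"
        }
      }
    ]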
May set "maximizeResourceAllocation", then EMR will do the best config for
you.
http://docs.aws.amazon.com/ElasticMapReduce/latest/ReleaseGuide/emr-spark-configure.html
Jingyu
On 18 February 2016 at 12:02, wrote:
> Hi All,
>
> I have been facing memory issues in Spark. I'm using spark-sql on AWS
Given your config, the cluster manager can only give you 2 executors.
Looking at m3.2xlarge --> it comes with 30 GB of memory. You have 3 *
m3.2xlarge, which means you have a total of 3 * 30 GB of memory for executors.
15 GB for 16 executors would require 15 * 16 GB. Also check the number of
cores per executor you are setting.
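Rough numbers (ignoring YARN and OS overhead, so treat these as ballpark only):

    Available:   3 nodes * 30 GB per node      =  90 GB
    Requested:  16 executors * 15 GB each      = 240 GB
    Per node:   30 GB / 15 GB per executor     =   2 executors

So the request is far larger than what the cluster actually has.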