From: nancy henry
Date: Tuesday, February 14, 2017 at 1:04 AM
To: Conversant
Cc: Jon Gregg, "user @spark"
Subject: Re: Lost executor 4 Container killed by YARN for exceeding memory limits.
Hi,
How do I set these parameters while launching spark-shell?
spark.shuffle.memoryFraction
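One common way to pass such settings on the command line is via `--conf` flags to spark-shell. The values below are placeholders for illustration, not recommendations; as noted later in this thread, it is worth consulting your Spark admin team before changing them:

```shell
# Placeholder values -- tune for your workload and cluster.
# spark.shuffle.memoryFraction applies to the legacy memory manager
# (pre-1.6 semantics, or when spark.memory.useLegacyMode=true).
# spark.yarn.executor.memoryOverhead is specified in MB.
spark-shell \
  --master yarn \
  --conf spark.shuffle.memoryFraction=0.4 \
  --conf spark.yarn.executor.memoryOverhead=2048
```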
>
> You may want to consult with your DevOps/Operations/Spark Admin team first.
>
>
>
> *From: *Jon Gregg
> *Date: *Monday, February 13, 2017 at 8:58 AM
> *To: *nancy henry
> *Cc: *"user @spark"
> *Subject: *Re: Lost executor 4 Container killed by YARN for exceeding memory limits.
Date: Monday, February 13, 2017 at 8:58 AM
To: nancy henry
Cc: "user @spark"
Subject: Re: Lost executor 4 Container killed by YARN for exceeding memory limits.
Setting Spark's memoryOverhead configuration variable is recommended in
your logs, and has helped me with these issues in the past. Search for
"memoryOverhead" here:
http://spark.apache.org/docs/latest/running-on-yarn.html
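For a batch job, the same overhead setting can be raised at submit time. This is a sketch only; the application jar and memory sizes below are hypothetical examples, not values from this thread:

```shell
# Raise the YARN per-executor overhead allocation at submit time.
# spark.yarn.executor.memoryOverhead is in MB; in Spark 2.x its default
# is max(0.10 * executorMemory, 384), so 8g of executor memory gets
# roughly 819 MB of overhead unless overridden as below.
spark-submit \
  --master yarn \
  --executor-memory 8g \
  --conf spark.yarn.executor.memoryOverhead=1536 \
  my-app.jar
```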
That said, you're running on a huge cluster as it is. If it's possible to