thanks.
On Thu, Sep 25, 2014 at 10:25 PM, Marcelo Vanzin [via Apache Spark
User List] wrote:
> From spark-submit --help:
>
>  YARN-only:
>   --executor-cores NUM     Number of cores per executor (Default: 1).
>   --queue QUEUE_NAME       The YARN queue to submit to (Default: "default").
>   --num-executors NUM      Number of executors to launch (Default: 2).
>   --archives ARCHIVES      Comma separated list of archives to be extracted
>                            into the working directory of each executor.
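(For illustration, not part of the reply above: a yarn-client submission that
sets these YARN-only options might look roughly like this, where
com.example.MyApp and myapp.jar are placeholder names.)

    spark-submit \
      --master yarn-client \
      --queue default \
      --executor-cores 2 \
      --class com.example.MyApp \
      myapp.jar
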
Thank you.
Where is the number of containers set?
On Thu, Sep 25, 2014 at 7:17 PM, Marcelo Vanzin wrote:
> On Thu, Sep 25, 2014 at 8:55 AM, jamborta wrote:
>> I am running spark with the default settings in yarn client mode. For some
>> reason yarn always allocates three containers to the application (wondering
>> where it is set?), and only uses two of them.
>
> The default number of executors in Yarn mode is 2, so you get two executor
> containers plus one container for the application master; only the two
> executor containers run your tasks.
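(Likewise a sketch, not part of the original reply: the executor count, and
with it the container count, can be overridden with --num-executors. Asking
for 4 executors should give 5 containers in total, 4 executors plus 1
application master. The class and jar names are placeholders.)

    spark-submit --master yarn-client --num-executors 4 \
      --class com.example.MyApp myapp.jar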