From spark-submit --help:

YARN-only:
  --executor-cores NUM   Number of cores per executor (Default: 1).
  --queue QUEUE_NAME     The YARN queue to submit to (Default: "default").
  --num-executors NUM    Number of executors to launch (Default: 2).
  --archives ARCHIVES    Comma separated list of archives to be extracted into
                         the working directory of each executor.
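Putting those flags together, a minimal invocation sketch (the jar name and queue are placeholders, not from the thread):

```shell
# Hypothetical example: launch on YARN in client mode with an explicit
# executor count and core count instead of the defaults shown above.
spark-submit \
  --master yarn-client \
  --num-executors 4 \
  --executor-cores 2 \
  --queue default \
  my-app.jar
```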
Thank you.
Where is the number of containers set?
On Thu, Sep 25, 2014 at 7:17 PM, Marcelo Vanzin wrote:
> On Thu, Sep 25, 2014 at 8:55 AM, jamborta wrote:
>> I am running spark with the default settings in yarn client mode. For some
>> reason yarn always allocates three containers to the application (wondering
>> where it is set?), and only uses two of them.
On Thu, Sep 25, 2014 at 8:55 AM, jamborta wrote:
> I am running spark with the default settings in yarn client mode. For some
> reason yarn always allocates three containers to the application (wondering
> where it is set?), and only uses two of them.
The default number of executors in Yarn mode is 2 (see --num-executors above);
the third container is the YARN Application Master.
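The three allocated containers follow from simple arithmetic, sketched here (assuming the default of 2 executors plus one Application Master container):

```shell
# Containers allocated by YARN in client mode = executors + 1 Application
# Master. With the default of 2 executors:
num_executors=2
containers=$((num_executors + 1))   # 2 executors + 1 AM
echo "$containers"
```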
> I also tried setting a high spark.cores.max. Are there additional settings I
> am missing?
thanks,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Yarn-number-of-containers-tp15148.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.