.html#spark-dynamic-allocation)
>> you can request Spark Dynamic Resource Allocation as the default
>> configuration at cluster creation.
>>
>> Best regards,
>>
>> Christopher
>>
> *From:* Dinesh Ranganathan [mailto:dineshranganat...@gmail.com]
> *Sent:* Monday, November 16, 2015 4:57 AM
> *To:* Sabarish Sasidharan
> *Cc:* user
> *Subject:* Re: Spark Expand Cluster
>
Hi Sab,

I did not specify the number of executors when I submitted the Spark
application. I was under the impression that Spark looks at the cluster and
automatically figures out the number of executors it can use based on the
cluster size. Is this what you call dynamic allocation? I am a Spark
newbie, so apologies.
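(For anyone following the thread: the behaviour Dinesh describes is what dynamic allocation provides, but it has to be switched on. A minimal sketch of enabling it in spark-defaults.conf — the property names are from the Spark configuration documentation, and the executor counts are illustrative placeholders, not recommendations:)

```
# Sketch: enabling dynamic allocation in spark-defaults.conf
# (requires the external shuffle service on YARN; min/max values
# below are illustrative only)
spark.dynamicAllocation.enabled        true
spark.shuffle.service.enabled          true
spark.dynamicAllocation.minExecutors   2
spark.dynamicAllocation.maxExecutors   20
```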
Spark will use the number of executors you specify in spark-submit. Are you
saying that Spark is not able to use more executors after you modify it in
spark-submit? Are you using dynamic allocation?
Regards
Sab
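(To make Sab's point concrete: with a fixed allocation, the executor count comes from the spark-submit invocation and does not grow when the cluster does. A sketch using flags from the spark-submit documentation — the application jar name and the numbers are placeholders:)

```
# Fixed allocation: exactly 4 executors, regardless of cluster size
spark-submit --master yarn --num-executors 4 \
  --executor-cores 2 --executor-memory 4g myapp.jar

# Dynamic allocation: executor count scales with the workload,
# up to the configured maximum
spark-submit --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=20 \
  myapp.jar
```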
On Mon, Nov 16, 2015 at 5:54 PM, dineshranganathan <
dineshranganat...@gmail.com> wrote: