Re: Spark Expand Cluster

2015-12-01 Thread Alexander Pivovarov
.html#spark-dynamic-allocation) you can request Spark Dynamic Resource Allocation as the default configuration at cluster creation. Best regards, Christopher
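
For reference, dynamic allocation is controlled by a handful of standard Spark properties. A minimal sketch of what such a cluster-wide default might look like in spark-defaults.conf (property names are from the Spark documentation; the numeric values here are illustrative):

    # spark-defaults.conf (illustrative values)
    spark.dynamicAllocation.enabled          true
    # The external shuffle service is required for dynamic allocation on YARN
    spark.shuffle.service.enabled            true
    spark.dynamicAllocation.minExecutors     1
    spark.dynamicAllocation.maxExecutors     20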

Re: Spark Expand Cluster

2015-11-24 Thread Dinesh Ranganathan
From: Dinesh Ranganathan [mailto:dineshranganat...@gmail.com] Sent: Monday, November 16, 2015 4:57 AM To: Sabarish Sasidharan Cc: user Subject: Re: Spark Expand Cluster Hi Sab, I did not specify number of executors when I

RE: Spark Expand Cluster

2015-11-20 Thread Bozeman, Christopher
Allocation as the default configuration at cluster creation. Best regards, Christopher From: Dinesh Ranganathan [mailto:dineshranganat...@gmail.com] Sent: Monday, November 16, 2015 4:57 AM To: Sabarish Sasidharan Cc: user Subject: Re: Spark Expand Cluster Hi Sab, I did not specify number of executors
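
On EMR, a default like this can be requested at cluster creation through a configuration classification. A hedged sketch using the AWS CLI (the cluster name, release label, instance type, and instance count are placeholders; check the EMR docs for the release you target):

    aws emr create-cluster \
      --name "spark-cluster" \
      --release-label emr-4.1.0 \
      --applications Name=Spark \
      --use-default-roles \
      --instance-type m3.xlarge --instance-count 3 \
      --configurations '[{
        "Classification": "spark-defaults",
        "Properties": {
          "spark.dynamicAllocation.enabled": "true",
          "spark.shuffle.service.enabled": "true"
        }
      }]'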

Re: Spark Expand Cluster

2015-11-16 Thread Dinesh Ranganathan
Hi Sab, I did not specify the number of executors when I submitted the Spark application. I was under the impression that Spark looks at the cluster and figures out the number of executors it can use based on the cluster size automatically; is this what you call dynamic allocation? I am a Spark newbie, so apol
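
For context: Spark of this era does not size executors to the cluster by default; without dynamic allocation, the executor count is fixed at submit time. A sketch of enabling dynamic allocation per application via spark-submit flags (the application class and jar name are placeholders):

    spark-submit \
      --master yarn \
      --conf spark.dynamicAllocation.enabled=true \
      --conf spark.shuffle.service.enabled=true \
      --conf spark.dynamicAllocation.maxExecutors=20 \
      --class com.example.MyApp myapp.jar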

Re: Spark Expand Cluster

2015-11-16 Thread Sabarish Sasidharan
Spark will use the number of executors you specify in spark-submit. Are you saying that Spark is not able to use more executors after you modify it in spark-submit? Are you using dynamic allocation? Regards, Sab On Mon, Nov 16, 2015 at 5:54 PM, dineshranganathan <dineshranganat...@gmail.com> wrote
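
To illustrate the static behavior Sab describes, a sketch of an explicit executor request (all values illustrative; the class and jar are placeholders):

    # Static allocation: exactly 10 executors, regardless of cluster size
    spark-submit \
      --master yarn \
      --num-executors 10 \
      --executor-cores 2 \
      --executor-memory 4G \
      --class com.example.MyApp myapp.jar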