Re: Spark 1.5.1 Dynamic Resource Allocation

2015-11-09 Thread Andrew Or
Hi Tom, I believe a workaround is to set `spark.dynamicAllocation.initialExecutors` to 0. As others have mentioned, from Spark 1.5.2 onwards this should no longer be necessary. -Andrew
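Andrew's workaround can be sketched as a spark-shell invocation; the two `enabled` flags are taken from Tom's original command, and only the `initialExecutors` setting is the workaround itself:

```shell
# Workaround for Spark 1.5.0/1.5.1: explicitly set initialExecutors to 0
# so dynamic allocation starts without hitting SPARK-10790.
spark-shell \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=0
```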

Re: Spark 1.5.1 Dynamic Resource Allocation

2015-11-09 Thread Jonathan Kelly
Tom, You might be hitting https://issues.apache.org/jira/browse/SPARK-10790, which was introduced in Spark 1.5.0 and fixed in 1.5.2. Spark 1.5.2 just passed release-candidate voting, so it should be tagged, released, and announced soon. If you are able to build from source yourself and run with that …

Re: Spark 1.5.1 Dynamic Resource Allocation

2015-11-09 Thread Akhil Das
Did you go through http://spark.apache.org/docs/latest/job-scheduling.html#configuration-and-setup for YARN? I guess you will have to copy the spark-1.5.1-yarn-shuffle.jar to the classpath of all NodeManagers in your cluster. Thanks Best Regards

Re: Spark 1.5.1 Dynamic Resource Allocation

2015-11-04 Thread tstewart
https://issues.apache.org/jira/browse/SPARK-10790 Changed the configuration so that minExecutors < initialExecutors < maxExecutors, and that works. spark-shell --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --conf spark.dynamicAllocation.minExecutors=2 --conf spark.dynamicAlloc…
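For reference, a full invocation satisfying minExecutors < initialExecutors < maxExecutors might look like the following; the exact executor counts beyond minExecutors=2 are illustrative, not taken from the truncated message:

```shell
# Ordering that avoids SPARK-10790 on 1.5.0/1.5.1:
# minExecutors (2) < initialExecutors (4) < maxExecutors (12).
spark-shell \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=2 \
  --conf spark.dynamicAllocation.initialExecutors=4 \
  --conf spark.dynamicAllocation.maxExecutors=12
```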

Spark 1.5.1 Dynamic Resource Allocation

2015-11-04 Thread tstewart
(apologies if this re-posts, having challenges with the various web front ends to this mailing list) I am running the following command on a Hadoop cluster to launch Spark shell with DRA: spark-shell --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --conf spa…

Spark 1.5.1 Dynamic Resource Allocation

2015-10-30 Thread Tom Stewart
I am running the following command on a Hadoop cluster to launch Spark shell with DRA: spark-shell --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --conf spark.dynamicAllocation.minExecutors=4 --conf spark.dynamicAllocation.maxExecutors=12 --conf spark.dy…

Spark 1.5.1 Dynamic Resource Allocation

2015-10-29 Thread tstewart
I am running the following command on a Hadoop cluster to launch Spark shell with DRA: spark-shell --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --conf spark.dynamicAllocation.minExecutors=4 --conf spark.dynamicAllocation.maxExecutors=12 --conf spark.dynamic…