Hi Anders,

I faced the same issue you mentioned. Yes, you need to install the
Spark shuffle plugin (the external shuffle service) for YARN. Please
check the following PRs, which add documentation on enabling
dynamicAllocation:

https://github.com/apache/spark/pull/3731
https://github.com/apache/spark/pull/3757

I was able to run Spark on YARN with dynamicAllocation by following
the instructions in those docs.
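
In case it helps, the NodeManager-side setup looks roughly like this
(a sketch based on those docs; the exact jar name and paths depend on
your build, so treat them as placeholders, and keep mapreduce_shuffle
in aux-services if you already have it configured). In yarn-site.xml
on each NodeManager:

      <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle,spark_shuffle</value>
      </property>
      <property>
        <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
        <value>org.apache.spark.network.yarn.YarnShuffleService</value>
      </property>

Then add the spark-<version>-yarn-shuffle.jar to the NodeManager's
classpath and restart the NodeManagers, so that executors started by
dynamicAllocation can register with the external shuffle service.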

Thanks,
- Tsuyoshi

On Sat, Dec 27, 2014 at 11:06 PM, Anders Arpteg <arp...@spotify.com> wrote:
> Hey,
>
> Tried to get the new spark.dynamicAllocation.enabled feature working on YARN
> (Hadoop 2.2), but have been unsuccessful so far. I've tested with the
> following settings:
>
>       conf
>         .set("spark.dynamicAllocation.enabled", "true")
>         .set("spark.shuffle.service.enabled", "true")
>         .set("spark.dynamicAllocation.minExecutors", "10")
>         .set("spark.dynamicAllocation.maxExecutors", "700")
>
> The app works fine on Spark 1.2 if dynamicAllocation is not enabled, but
> with the settings above, the app starts and the first job is listed in the
> web UI. However, no tasks are started and it seems to be stuck waiting for
> a container to be allocated forever.
>
> Any help would be appreciated. Do I need to do something specific to get
> the external YARN shuffle service running in the NodeManager?
>
> TIA,
> Anders



-- 
- Tsuyoshi
