I assume you're running YARN given the exception.

I don't know if this is covered in the documentation (I took a quick
look at the config document and didn't see references to it), but you
need to configure Spark's external shuffle service as an auxiliary
NodeManager service in your YARN cluster. That involves deploying the
Spark shuffle service jar to all the NMs, and changing YARN's
configuration to start the service (which should be called
"spark_shuffle"). Please look at YARN's docs for details about how to
set it up.
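
For reference, the yarn-site.xml change looks roughly like this (a
sketch, assuming the spark-<version>-yarn-shuffle jar from your Spark
build has already been copied onto the NodeManager classpath; the exact
jar name depends on how Spark was built):

<!-- yarn-site.xml on every NodeManager; restart the NMs afterwards. -->
<property>
  <!-- Append spark_shuffle to whatever aux services are already
       listed (e.g. mapreduce_shuffle). -->
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <!-- The shuffle service implementation shipped in the Spark
       yarn-shuffle jar. -->
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>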

On Tue, Mar 17, 2015 at 7:07 PM, Sea <261810...@qq.com> wrote:
> Hi, all:
>
>
> Spark 1.3.0, Hadoop 2.2.0
>
>
> I put the following params in the spark-defaults.conf
>
>
> spark.dynamicAllocation.enabled true
> spark.dynamicAllocation.minExecutors 20
> spark.dynamicAllocation.maxExecutors 300
> spark.dynamicAllocation.executorIdleTimeout 300
> spark.shuffle.service.enabled true
>
>
>
> I started the thriftserver and ran a query, and an exception happened!
> I found it in JIRA: https://issues.apache.org/jira/browse/SPARK-5759
> It says the fix version is 1.3.0.
>
>
> Caused by: org.apache.hadoop.yarn.exceptions.InvalidAuxServiceException: The auxService:spark_shuffle does not exist
>     at sun.reflect.GeneratedConstructorAccessor28.newInstance(Unknown Source)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>     at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.instantiateException(SerializedExceptionPBImpl.java:152)
>     at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.deSerialize(SerializedExceptionPBImpl.java:106)
>     at org.apache.hadoop.yarn.client.api.impl.NMClientImpl.startContainer(NMClientImpl.java:203)
>     at org.apache.spark.deploy.yarn.ExecutorRunnable.startContainer(ExecutorRunnable.scala:113)
>     ... 4 more



-- 
Marcelo
