Hi Spark devs,

I am trying out Spark's new internal authentication mechanism based on
AES encryption (https://issues.apache.org/jira/browse/SPARK-19139), which
was introduced in Spark 2.2.0.

I set the following properties in my spark-defaults:
spark.network.crypto.enabled      true
spark.network.crypto.saslFallback false
spark.authenticate                true

This seems to work fine with Spark's internal shuffle service. However,
when I try it with YARN's external shuffle service, the executors are
unable to register with the shuffle service, as it still expects SASL
authentication. Here is the error I get:

ExecutorLostFailure (executor 42 exited caused by one of the running tasks)
Reason: Unable to create executor due to Unable to register with external
shuffle server due to : java.lang.IllegalStateException: Expected
SaslMessage, received something else (maybe your client does not have SASL
enabled?)
    at org.apache.spark.network.sasl.SaslMessage.decode(SaslMessage.java:69)
    at org.apache.spark.network.sasl.SaslRpcHandler.receive(SaslRpcHandler.java:89)
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)

Can someone confirm whether this is expected behavior, or provide some
guidance on how I can make it work with the external shuffle service?

Note: if I set spark.network.crypto.saslFallback to true, the job runs
fine with the external shuffle service as well, since it falls back to
SASL authentication.

Thank you for your help.
