[ https://issues.apache.org/jira/browse/SPARK-51464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

prikshit updated SPARK-51464:
-----------------------------
    Description: 
I have enabled authentication in my Spark standalone cluster by setting 
{{spark.authenticate=true}} in {{spark-defaults.conf}}. I am submitting my 
job using {{spark-submit}} and passing the secret as 
{{--conf spark.authenticate.secret=somesecretvalue}}. Below is the command I 
use to submit the job:
 
{{spark-submit --class MySampleClass --conf spark.authenticate.secret=somesecretvalue --supervise --conf spark.submit.deployMode=cluster}}
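
For reference, the only authentication-related entry in {{spark-defaults.conf}} is the flag below; the secret is passed only on the command line. A minimal sketch of the relevant part of the file:

{{# spark-defaults.conf (sketch; only this flag is set for authentication)
spark.authenticate  true}}

(The Spark security docs note that for non-YARN deployments {{spark.authenticate.secret}} should be configured on each node, which may be relevant here.)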

We can submit the job successfully. In the application we also set the secret 
on the Spark context, and the application comes up and runs:
 
{{SparkConf sparkConf = new SparkConf();
sparkConf.set("spark.authenticate.secret", "somesecretvalue");}}
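
For completeness, a minimal self-contained sketch of how this sits in the driver (the class name and scaffolding are placeholders; only the secret line reflects the real code):

{{import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MySampleClass {
    public static void main(String[] args) {
        // new SparkConf() picks up spark.authenticate=true from
        // spark-defaults.conf (loaded by spark-submit); only the
        // secret itself is set programmatically here.
        SparkConf sparkConf = new SparkConf();
        sparkConf.set("spark.authenticate.secret", "somesecretvalue");
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        // ... application logic ...
        sc.stop();
    }
}}}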

To kill the application I have tried two different options; each fails with 
its own exception.

Option - 1. {{spark-class org.apache.spark.deploy.Client kill 
<ACTIVE_MASTER_NODE>:7077 <DRIVER_SUBMISSION_ID>}} has stopped working, 
failing with the error below:
 
{{Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56)
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
        at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
        at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
        at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
        at scala.collection.TraversableLike.map(TraversableLike.scala:286)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
        at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
        at org.apache.spark.deploy.Client$.main(Client.scala:274)
        at org.apache.spark.deploy.Client.main(Client.scala)
Caused by: java.lang.RuntimeException: java.lang.IllegalStateException: Expected SaslMessage, received something else (maybe your client does not have SASL enabled?)}}
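
As a possible workaround (untested, so treat this as a sketch): {{spark-submit}} also accepts {{--kill <submission id>}} for standalone cluster mode, and unlike bare {{spark-class}} it reads {{spark-defaults.conf}} and {{--conf}} overrides, so the SASL secret should reach the RPC client:

{{spark-submit --master spark://<ACTIVE_MASTER_NODE>:7077 \
  --conf spark.authenticate=true \
  --conf spark.authenticate.secret=somesecretvalue \
  --kill <DRIVER_SUBMISSION_ID>}}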



> Unable to stop Spark application after enabling authentication
> --------------------------------------------------------------
>
>                 Key: SPARK-51464
>                 URL: https://issues.apache.org/jira/browse/SPARK-51464
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.5.4
>            Environment: 3-node Spark standalone cluster
> Masters on all 3 nodes (1 alive, 2 standby)
> Workers on all 3 nodes
>            Reporter: prikshit
>            Priority: Major
>             Fix For: 3.5.4
>


