[ https://issues.apache.org/jira/browse/SPARK-48334?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

IsisPolei resolved SPARK-48334.
-------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

This issue doesn't exist in the master branch: SparkEnv's initialization logic 
changed in 4.x, and the memoryManager is now instantiated differently. If 
anyone is still on Spark 3.x and hits the same issue, the change below may 
give some ideas.

https://github.com/apache/spark/pull/50661
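
For context, the general idea for 3.x is to shut the RpcEnv down when a later 
step of SparkEnv construction throws, so the Netty port is released. Below is 
a rough sketch of that pattern only, not the actual patch from the PR above; 
it is placed in package org.apache.spark solely so the private[spark] RpcEnv 
type is visible, and createRest is a placeholder for the construction steps 
that can fail (memory manager, block manager, ...).

package org.apache.spark

import scala.util.control.NonFatal

import org.apache.spark.rpc.RpcEnv

object RpcEnvCleanupSketch {
  // Runs the remaining construction steps and shuts the RpcEnv down if any
  // of them throw, instead of leaving the Netty server bound to its port.
  def withRpcEnvCleanup[T](rpcEnv: RpcEnv)(createRest: RpcEnv => T): T = {
    try {
      createRest(rpcEnv)
    } catch {
      case NonFatal(t) =>
        rpcEnv.shutdown()          // stop the Netty server
        rpcEnv.awaitTermination()  // wait until shutdown has completed
        throw t
    }
  }
}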

> NettyServer doesn't shutdown if SparkContext initialize failed
> --------------------------------------------------------------
>
>                 Key: SPARK-48334
>                 URL: https://issues.apache.org/jira/browse/SPARK-48334
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.3
>            Reporter: IsisPolei
>            Priority: Critical
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> When obtaining a SparkContext instance via SparkContext.getOrCreate(), if an 
> exception occurs during initialization (such as an invalid Spark parameter, 
> e.g., spark.executor.memory=1 without units), the RpcServer started during 
> that process is never shut down, leaving its port occupied indefinitely.
> The RpcServer is closed in _env.stop(), which calls rpcEnv.shutdown(), but 
> this only happens when _env != null (SparkContext.scala:2106, version 3.1.3). 
> Because the error occurs during initialization, _env is never assigned, so 
> _env.stop() is never executed and the RpcServer is never closed.
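
For reference, a minimal reproduction sketch of the scenario described above 
against a 3.x build. The object name, the fixed port 35000, and the 
ServerSocket-based check are arbitrary illustration choices, not part of the 
report; the failing spark.executor.memory=1 setting is the one from the 
report.

import java.net.{BindException, ServerSocket}

import scala.util.control.NonFatal

import org.apache.spark.{SparkConf, SparkContext}

object PortLeakRepro {
  def main(args: Array[String]): Unit = {
    val driverPort = 35000 // fixed port so the leak is observable afterwards

    val conf = new SparkConf()
      .setMaster("local[1]")
      .setAppName("port-leak-repro")
      .set("spark.driver.port", driverPort.toString)
      .set("spark.executor.memory", "1") // no unit: per the report, rejected during SparkEnv setup

    try {
      SparkContext.getOrCreate(conf)
    } catch {
      case NonFatal(e) =>
        println(s"SparkContext initialization failed as expected: ${e.getMessage}")
    }

    // On an affected 3.x version the driver RpcEnv (Netty server) is still
    // listening, so binding the same port again fails.
    try {
      new ServerSocket(driverPort).close()
      println(s"Port $driverPort was released")
    } catch {
      case _: BindException =>
        println(s"Port $driverPort is still held by the leaked RpcEnv")
    }
  }
}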



