soumasish opened a new pull request, #49292:
URL: https://github.com/apache/spark/pull/49292

   ### What changes were proposed in this pull request?
   If SparkContext initialization fails (for example, because of invalid Spark configs), the driver's RpcEnv may already have been started but is never stopped: SparkContext's `_env` field is still null at that point, so `_env.stop()` is not invoked in the catch block. As a result, the RPC server port stays bound for the lifetime of the process. This PR ensures that when a SparkContext constructor error occurs after the RPC server has been partially created, the RpcEnv is shut down so the port is released.
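   As a rough illustration (not the actual diff), the cleanup would live in the catch block of SparkContext's constructor: if a SparkEnv/RpcEnv was created before the failure but never assigned to `_env`, shut it down explicitly. The structure below is an assumption for illustration only.

```scala
// Illustrative sketch only -- not the actual patch. It assumes the existing
// try/catch around SparkContext's constructor body and the existing stop() method.
try {
  _env = createSparkEnv(_conf, isLocal, listenerBus) // starts the driver RpcEnv
  // ... remaining initialization, which can throw on invalid configs ...
} catch {
  case NonFatal(e) =>
    logError("Error initializing SparkContext.", e)
    try {
      stop() // only cleans up _env if the assignment above completed
    } finally {
      // Assumption: if a SparkEnv was registered globally but never assigned to
      // _env, shut down its RpcEnv explicitly so the RPC port is released.
      Option(SparkEnv.get).filter(env => env ne _env).foreach { env =>
        env.rpcEnv.shutdown()
        env.rpcEnv.awaitTermination()
      }
    }
    throw e
}
```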
   
   ### Why are the changes needed?
   - Fixes a resource leak where the driver RPC port remains bound if SparkContext initialization fails (a reproduction sketch follows this list).
   - Ensures consistent cleanup on error paths, so a failed SparkContext does not leave its RPC port bound indefinitely.
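
   For context, a hypothetical way an application could hit the leak, assuming an invalid master URL makes initialization fail after the driver RpcEnv has started; the object name and config values are illustrative, not taken from the patch or its tests.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reproduction sketch (illustrative only).
object RpcPortLeakRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("repro")
      .setMaster("not-a-valid-master-url") // assumed to fail after the RpcEnv starts

    try {
      new SparkContext(conf)
    } catch {
      case e: Exception =>
        println(s"SparkContext initialization failed: ${e.getMessage}")
        // Before this fix, the driver RpcEnv created for the failed attempt could
        // stay alive, keeping its port (spark.driver.port) bound until JVM exit.
    }
  }
}
```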
   
   ### Does this PR introduce _any_ user-facing change?
   No. Internal fix only; no visible API changes.
   
   ### How was this patch tested?
   - WIP
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No
   