mridulm commented on code in PR #49292:
URL: https://github.com/apache/spark/pull/49292#discussion_r2051572186


##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -721,6 +723,15 @@ class SparkContext(config: SparkConf) extends Logging {
   } catch {
     case NonFatal(e) =>
       logError("Error initializing SparkContext.", e)
+      if(_env == null && envCreated != null) {

Review Comment:
   Agree with @HyukjinKwon.
   The code, as formulated, does not do what you are describing @soumasish.
   `envCreated` is assigned *after* `createSparkEnv` returns, and is immediately 
assigned to `_env`.
   So `envCreated` is always the same as `_env`, and the `_env == null && envCreated != null` 
condition can never be true here.
   
   If the concern is that initialization of an individual component of SparkEnv can 
fail (which can happen), we will need to handle it in `SparkEnv.create`:
   surround the component initialization in try/finally, and stop each component that was 
successfully initialized when the method is about to throw an exception.
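   A minimal sketch of that cleanup pattern (not the actual `SparkEnv.create` code; 
the `Stoppable` trait, `createEnv` helper, and component names are invented for 
illustration). Components are tracked as they come up, and the ones already 
started are stopped in reverse order if a later initialization throws:
   
   ```scala
   import scala.collection.mutable.ArrayBuffer
   import scala.util.control.NonFatal
   
   // Hypothetical stand-in for SparkEnv components that need cleanup.
   trait Stoppable { def stop(): Unit }
   
   def createEnv(inits: Seq[() => Stoppable]): Seq[Stoppable] = {
     val started = ArrayBuffer.empty[Stoppable]
     try {
       inits.foreach { init => started += init() }
       started.toSeq
     } catch {
       case NonFatal(e) =>
         // Stop already-initialized components in reverse order; swallow
         // secondary failures so the original exception propagates.
         started.reverse.foreach { c =>
           try c.stop() catch { case NonFatal(_) => }
         }
         throw e
     }
   }
   ```
   
   The same effect can be had with try/finally plus a success flag; the point is 
that partially constructed state is torn down before the exception escapes.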
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

