mridulm commented on code in PR #49292:
URL: https://github.com/apache/spark/pull/49292#discussion_r2051572186


##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -721,6 +723,15 @@ class SparkContext(config: SparkConf) extends Logging {
   } catch {
     case NonFatal(e) =>
       logError("Error initializing SparkContext.", e)
+      if(_env == null && envCreated != null) {

Review Comment:
   Agree with @HyukjinKwon.
   The code, as formulated, does not do what you are describing @soumasish.
   `envCreated` is assigned *after* `createSparkEnv` returns - and is 
immediately assigned to `_env`.
   So `envCreated` is always the same as `_env`, and the null check above can 
never distinguish the two.
   
   If the concern is that initialization of an individual component of SparkEnv 
can fail and throw an exception (which can happen) - so that SparkEnv/SparkContext 
is never fully initialized, but a few of the subsystems are left running - then we 
will need to handle it in `SparkEnv.create` itself.
   Surround the component initialization in try/finally - and, when the method 
is going to throw an exception, stop each component which was successfully 
initialized before propagating it.
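   The shape of what I mean is roughly the following. This is a hypothetical 
sketch, not the actual `SparkEnv.create` code - `Stoppable`, `CreateEnvSketch`, 
and the `initializers` parameter are illustrative stand-ins for the real 
per-component setup (serializer, block manager, shuffle manager, etc.):

```scala
import scala.collection.mutable.ArrayBuffer

// Stand-in for any SparkEnv subsystem that can be shut down.
trait Stoppable { def stop(): Unit }

object CreateEnvSketch {
  // Run each setup step in order; if one throws, stop the components that
  // already came up (in reverse order) before rethrowing.
  def create(initializers: Seq[() => Stoppable]): Seq[Stoppable] = {
    val started = ArrayBuffer.empty[Stoppable]
    try {
      initializers.foreach { init => started += init() }
      started.toSeq
    } catch {
      case e: Throwable =>
        started.reverseIterator.foreach { c =>
          // Best-effort cleanup: a secondary failure while stopping a
          // component should not mask the original exception.
          try c.stop() catch { case _: Throwable => }
        }
        throw e
    }
  }
}
```

   With that in place, a partially-failed `create` leaves nothing running, so 
the caller (`SparkContext`) does not need the `envCreated` bookkeeping at all.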
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

