wbo4958 commented on PR #50334:
URL: https://github.com/apache/spark/pull/50334#issuecomment-2765018988

   > Thanks for the explanation and changes @wbo4958! LGTM for the most part.
   > 
   > As a follow-up idea for later, I'd like to explore whether adding a 
second "user default" session state would make sense. The idea is broadly that 
a "cluster default" session would contain the jars added at cluster bootup (i.e. 
through `--jars`), and its child sessions would be the Spark Connect isolated 
sessions as well as a "user default" session (i.e. any jars added by a 
user manipulating SparkContext directly without going through Spark Connect). 
These child sessions would inherit the "cluster default" classloader, ensuring 
isolation between classic Spark sessions and Spark Connect sessions. cc 
@HyukjinKwon @hvanhovell
   > 
   > ```
   >          +-------------------------------------+
   >          |         Cluster Default             |
   >          |       ClassLoader (bootup jars)     |
   >          +----------------+--------------------+
   >                           |
   >          +----------------+----------------+
   >          |                                 |
   > +-------------------------+    +-------------------------+
   > |  Spark Connect          |    |   User Default          |
   > |  Isolated Session       |    |   Session (direct jars) |
   > |  (inherits cluster      |    |   (inherits cluster     |
   > |   default classloader)  |    |    default classloader) |
   > +-------------------------+    +-------------------------+
   > ```
   
   Hmm, this looks like it would complicate things. It seems we couldn't handle 
SparkContext in the Spark Connect environment, regardless of whether it's on the 
client or the server side?
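
   For reference, the parent/child inheritance sketched in the quoted diagram 
maps onto plain JVM classloader delegation. Below is a minimal, hypothetical 
Java illustration of that hierarchy (not Spark's actual implementation; the 
loader names mirror the diagram and are illustrative only):

   ```java
   import java.net.URL;
   import java.net.URLClassLoader;

   public class ClassLoaderIsolationSketch {
       public static void main(String[] args) throws Exception {
           // "Cluster default" loader: would hold the jars passed via --jars
           // at cluster bootup.
           URLClassLoader clusterDefault =
               new URLClassLoader(new URL[0], ClassLoader.getSystemClassLoader());

           // Each Spark Connect isolated session would get its own child
           // loader; jars added later by that session would live here,
           // invisible to sibling sessions.
           URLClassLoader connectSession =
               new URLClassLoader(new URL[0], clusterDefault);
           // The "user default" session, for jars added directly through
           // SparkContext without going through Spark Connect.
           URLClassLoader userDefault =
               new URLClassLoader(new URL[0], clusterDefault);

           // Both children delegate to the shared cluster-default loader...
           System.out.println(connectSession.getParent() == clusterDefault); // true
           System.out.println(userDefault.getParent() == clusterDefault);    // true
           // ...but are distinct loaders, so their own jars stay isolated.
           System.out.println(connectSession == userDefault);                // false

           // Classes resolved through the shared parent chain are the same
           // Class object in both sessions.
           Class<?> a = connectSession.loadClass("java.util.ArrayList");
           Class<?> b = userDefault.loadClass("java.util.ArrayList");
           System.out.println(a == b); // true
       }
   }
   ```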


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

