Hello,

We are using Spark 3.5.0 and were wondering whether the following is
achievable using spark-core.

Our use case involves spinning up a Spark cluster where the driver
application loads user jars containing Spark transformations at runtime. A
single Spark application (on the same cluster) can load multiple user jars,
which can have classpath conflicts if care is not taken.

AFAIK, getting this right requires the executor to be designed in a way
that allows for classpath isolation (for UDFs and lambda expressions).
Ideally, we want isolation per Spark session.

I know Spark Connect has been designed this way, but Spark Connect is not
an option for us at the moment. I had some luck using a private method
inside Spark called JobArtifactSet.withActiveJobArtifactState.
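
Roughly, the pattern I experimented with looks like the sketch below. This
is only a sketch of my understanding: JobArtifactSet and JobArtifactState
are private[spark], so the code has to live in a package under
org.apache.spark (or be invoked via reflection), and the jar path and
runUserCode are placeholders for our own loading logic.

    import java.util.UUID
    import org.apache.spark.{JobArtifactSet, JobArtifactState}
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().getOrCreate()
    val sc = spark.sparkContext

    // One artifact state per logical session; the UUID keys the
    // executor-side classloader and artifact directory.
    val state =
      JobArtifactState(UUID.randomUUID().toString, replClassDirUri = None)

    JobArtifactSet.withActiveJobArtifactState(state) {
      // Jars added while this state is active appear to be tagged with
      // the state's UUID, so tasks submitted under the same state see
      // them through an isolated classloader on the executors.
      sc.addJar("/path/to/user-transformations.jar") // placeholder path
      runUserCode(spark) // placeholder: invokes the user's transformations
    }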

Is it sufficient for me to run the user code enclosed
within JobArtifactSet.withActiveJobArtifactState to meet this requirement?

Thank you

Faiz
