[ https://issues.apache.org/jira/browse/SPARK-51416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-51416.
----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 50180
[https://github.com/apache/spark/pull/50180]

> Remove SPARK_CONNECT_MODE when starting Spark Connect server
> ------------------------------------------------------------
>
>                 Key: SPARK-51416
>                 URL: https://issues.apache.org/jira/browse/SPARK-51416
>             Project: Spark
>          Issue Type: Bug
>          Components: Connect
>    Affects Versions: 4.0.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Blocker
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> {code}
> SPARK_CONNECT_MODE=1 ./bin/pyspark
> {code}
> fails as below:
> {code}
> py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
> : java.lang.ClassNotFoundException: org.apache.spark.sql.connect.SparkConnectPlugin
>       at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:445)
>       at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:592)
>       at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
>       at java.base/java.lang.Class.forName0(Native Method)
>       at java.base/java.lang.Class.forName(Class.java:467)
>       at org.apache.spark.util.SparkClassUtils.classForName(SparkClassUtils.scala:41)
>       at org.apache.spark.util.SparkClassUtils.classForName$(SparkClassUtils.scala:36)
>       at org.apache.spark.util.Utils$.classForName(Utils.scala:99)
>       at org.apache.spark.util.Utils$.$anonfun$loadExtensions$1(Utils.scala:2828)
>       at scala.collection.StrictOptimizedIterableOps.flatMap(StrictOptimizedIterableOps.scala:118)
>       at scala.collection.StrictOptimizedIterableOps.flatMap$(StrictOptimizedIterableOps.scala:105)
>       at scala.collection.immutable.ArraySeq.flatMap(ArraySeq.scala:35)
>       at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2826)
>       at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:210)
>       at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:196)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:588)
>       at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
>       at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
>       at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
>       at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:481)
>       at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
>       at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
>       at py4j.Gateway.invoke(Gateway.java:238)
>       at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
>       at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
>       at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:184)
>       at py4j.ClientServerConnection.run(ClientServerConnection.java:108)
>       at java.base/java.lang.Thread.run(Thread.java:840)
> {code}
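For readers hitting the same trace: the exception comes from Spark's plugin loader. PluginContainer resolves every class listed in spark.plugins via Utils.loadExtensions / Utils.classForName while the SparkContext is being constructed, so when SPARK_CONNECT_MODE=1 leads bin/pyspark to request org.apache.spark.sql.connect.SparkConnectPlugin on a classpath that does not carry the Connect jars, startup fails with the ClassNotFoundException above. Below is a minimal sketch of the same failure mode from plain PySpark; the local[1] master and the explicit spark.plugins setting are illustrative assumptions, not the reported reproduction.

{code}
# Illustrative sketch only (not the reported repro): request the Connect
# plugin explicitly via spark.plugins on a classpath that lacks the
# spark-connect jars, which surfaces the same ClassNotFoundException at
# SparkContext startup.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[1]")  # assumption: any local master shows the same behavior
    # PluginContainer loads this class with Utils.classForName when the
    # SparkContext is created; a missing jar raises ClassNotFoundException.
    .config("spark.plugins", "org.apache.spark.sql.connect.SparkConnectPlugin")
    .getOrCreate()
)
{code}

With the change merged for 4.0.0, the original command (SPARK_CONNECT_MODE=1 ./bin/pyspark) is expected to start without this error.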


