[ https://issues.apache.org/jira/browse/SPARK-51603?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yang Jie resolved SPARK-51603.
------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 50396
[https://github.com/apache/spark/pull/50396]

> logical circular dependency between modules `connect-client-jvm` and `assembly`
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-51603
>                 URL: https://issues.apache.org/jira/browse/SPARK-51603
>             Project: Spark
>          Issue Type: Bug
>          Components: Connect, Tests
>    Affects Versions: 4.0.0
>            Reporter: Yang Jie
>            Assignee: Yang Jie
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> After SPARK-48936, the `assembly` module is compiled prior to `connect-client-jvm` to facilitate the packaging output. However, the RemoteSparkSession-related tests in the `connect-client-jvm` module depend on the packaged output of the `assembly` module, specifically the JARs located in `assembly/target/scala-2.13/jars/`. Because this dependency only takes effect when the tests actually run, executing
>
> {code:java}
> build/mvn clean install -DskipTests {code}
>
> will not produce any errors. But if we directly execute
>
> {code:java}
> build/mvn clean install {code}
>
> we will encounter test errors similar to the following when testing the `connect-client-jvm` module:
>
> {code:java}
> [ERROR] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.044 s <<< FAILURE! -- in org.apache.spark.sql.JavaEncoderSuite
> [ERROR] org.apache.spark.sql.JavaEncoderSuite -- Time elapsed: 0.044 s <<< FAILURE!
> java.lang.AssertionError: assertion failed: Fail to locate the target folder: '/Users/yangjie01/SourceCode/git/spark-maven/sql/connect/server/target'. SPARK_HOME='/Users/yangjie01/SourceCode/git/spark-maven'. Make sure the spark project jars has been built (e.g. using build/sbt package)and the env variable `SPARK_HOME` is set correctly.
>     at scala.Predef$.assert(Predef.scala:279)
>     at org.apache.spark.sql.connect.test.IntegrationTestUtils$.tryFindJar(IntegrationTestUtils.scala:138)
>     at org.apache.spark.sql.connect.test.IntegrationTestUtils$.findJar(IntegrationTestUtils.scala:116)
>     at org.apache.spark.sql.connect.test.SparkConnectServerUtils$.sparkConnect$lzycompute(RemoteSparkSession.scala:65)
>     at org.apache.spark.sql.connect.test.SparkConnectServerUtils$.sparkConnect(RemoteSparkSession.scala:62)
>     at org.apache.spark.sql.connect.test.SparkConnectServerUtils$.start(RemoteSparkSession.scala:135)
>     at org.apache.spark.sql.connect.test.SparkConnectServerUtils$.createSparkSession(RemoteSparkSession.scala:181)
>     at org.apache.spark.sql.connect.test.SparkConnectServerUtils.createSparkSession(RemoteSparkSession.scala)
>     at org.apache.spark.sql.JavaEncoderSuite.setup(JavaEncoderSuite.java:42)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:569)
>     at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>     Suppressed: java.lang.NullPointerException: Cannot invoke "org.apache.spark.sql.SparkSession.stop()" because "org.apache.spark.sql.JavaEncoderSuite.spark" is null
>         at org.apache.spark.sql.JavaEncoderSuite.tearDown(JavaEncoderSuite.java:47)
>         at java.base/java.lang.reflect.Method.invoke(Method.java:569)
>         at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>         at java.base/java.util.Collections$UnmodifiableCollection.forEach(Collections.java:1092)
>         ... 1 more
> [INFO]
> [INFO] Results:
> [INFO]
> [ERROR] Failures:
> [ERROR]   JavaEncoderSuite.setup:42 assertion failed: Fail to locate the target folder: '/Users/yangjie01/SourceCode/git/spark-maven/sql/connect/server/target'. SPARK_HOME='/Users/yangjie01/SourceCode/git/spark-maven'. Make sure the spark project jars has been built (e.g. using build/sbt package)and the env variable `SPARK_HOME` is set correctly.
> [INFO]
> [ERROR] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0 {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
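[Editor's note] The `java.lang.AssertionError` in the quoted log comes from a directory-existence check in the test utilities. The following is a minimal, hypothetical sketch of that kind of check, to illustrate why the test fails when the `assembly` packaging step has not run yet; `TargetDirCheck` and its signature are illustrative and are not the actual `IntegrationTestUtils` code:

{code:java}
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical sketch of the failing check: before searching for the server
// jar, the test helper verifies that the module's build output directory
// exists under SPARK_HOME. If `build/mvn clean install` runs the tests before
// the packaging output exists, this check trips with the assertion error shown
// in the log above.
public class TargetDirCheck {
    public static Path findTargetDir(String sparkHome, String modulePath) {
        Path target = Paths.get(sparkHome, modulePath, "target");
        if (!Files.isDirectory(target)) {
            throw new AssertionError("Fail to locate the target folder: '"
                + target + "'. SPARK_HOME='" + sparkHome + "'.");
        }
        return target;
    }
}
{code}

Under this reading, `build/mvn clean install -DskipTests` succeeds only because the check is never evaluated, which is why the ordering change from SPARK-48936 surfaced as a test-time failure rather than a compile-time one.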