+1

Ruifeng Zheng
ruife...@foxmail.com

------------------ Original ------------------
From: "Wenchen Fan" <cloud0...@gmail.com>
Date: Thu, Nov 17, 2022 10:26 AM
To: "Yang,Jie(INF)" <yangji...@baidu.com>
Cc: "Chris Nauroth" <cnaur...@apache.org>; "Yuming Wang" <wgy...@gmail.com>; "Dongjoon Hyun" <dongjoon.h...@gmail.com>; "huaxin gao" <huaxin.ga...@gmail.com>; "L. C. Hsieh" <vii...@gmail.com>; "Chao Sun" <sunc...@apache.org>; "dev" <dev@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)



+1


On Thu, Nov 17, 2022 at 10:20 AM Yang,Jie(INF) <yangji...@baidu.com> wrote:

   
+1, non-binding

The test combination of Java 11 + Scala 2.12 and Java 11 + Scala 2.13 has passed.

Yang Jie
  
From: Chris Nauroth <cnaur...@apache.org>
Date: Thursday, November 17, 2022, 04:27
To: Yuming Wang <wgy...@gmail.com>
Cc: "Yang,Jie(INF)" <yangji...@baidu.com>, Dongjoon Hyun <dongjoon.h...@gmail.com>, huaxin gao <huaxin.ga...@gmail.com>, "L. C. Hsieh" <vii...@gmail.com>, Chao Sun <sunc...@apache.org>, dev <dev@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
 
  
+1 (non-binding)

* Verified all checksums.
* Verified all signatures.
* Built from source, with multiple profiles, to full success, for Java 11 and Scala 2.12:
    * build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3 -Phive-thriftserver -Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests clean package
* Tests passed.
* Ran several examples successfully:
    * bin/spark-submit --class org.apache.spark.examples.SparkPi examples/jars/spark-examples_2.12-3.2.3.jar
    * bin/spark-submit --class org.apache.spark.examples.sql.hive.SparkHiveExample examples/jars/spark-examples_2.12-3.2.3.jar
    * bin/spark-submit examples/src/main/python/streaming/network_wordcount.py localhost 9999
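For anyone repeating the checksum and signature steps above, a minimal sketch follows. The artifact name and the commented download/verify lines are illustrative (the real files live under the dist.apache.org release-candidate directory linked in the vote email); the runnable part uses a locally created stand-in file.

```shell
# Sketch of the verification flow. The commented lines are illustrative --
# real artifacts live under
# https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/ :
#
#   sha512sum -c spark-3.2.3-bin-hadoop3.2.tgz.sha512        # checksum
#   curl -s https://dist.apache.org/repos/dist/dev/spark/KEYS | gpg --import
#   gpg --verify spark-3.2.3-bin-hadoop3.2.tgz.asc           # signature
#
# Demonstrated here on a locally created stand-in file:
printf 'stand-in artifact' > artifact.tgz
sha512sum artifact.tgz > artifact.tgz.sha512
sha512sum -c artifact.tgz.sha512   # prints "artifact.tgz: OK"
```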
      
Chao, thank you for preparing the release.

Chris Nauroth
On Wed, Nov 16, 2022 at 5:22 AM Yuming Wang <wgy...@gmail.com> wrote:

+1
On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) <yangji...@baidu.com> wrote:
 
     
I switched Scala 2.13 to Scala 2.12 today. The test is still in progress and it has not hung.

Yang Jie
  
From: Dongjoon Hyun <dongjoon.h...@gmail.com>
Date: Wednesday, November 16, 2022, 01:17
To: "Yang,Jie(INF)" <yangji...@baidu.com>
Cc: huaxin gao <huaxin.ga...@gmail.com>, "L. C. Hsieh" <vii...@gmail.com>, Chao Sun <sunc...@apache.org>, dev <dev@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
 
  
Did you hit that in Scala 2.12, too?

Dongjoon.
   
On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) <yangji...@baidu.com> wrote:
 
     
Hi, all

I tested v3.2.3 with the following command:

```
dev/change-scala-version.sh 2.13
build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pscala-2.13 -fn
```
 
The testing environment is:

OS: CentOS 6u3 Final
Java: Zulu 11.0.17
Python: 3.9.7
Scala: 2.13

The above test command was executed twice, and both times it hung with the following stack:
 
```
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition [0x00007f2de3929000]
   java.lang.Thread.State: WAITING (parking)
        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
        - parking to wait for <0x0000000790d00050> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17/LockSupport.java:194)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17/AbstractQueuedSynchronizer.java:2081)
        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17/LinkedBlockingQueue.java:433)
        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown Source)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
        - locked <0x0000000790d00208> (a java.lang.Object)
        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
        at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
        at org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown Source)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
        at org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown Source)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
        - locked <0x0000000790d00218> (a org.apache.spark.sql.execution.QueryExecution)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
        at org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
        at org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown Source)
        at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
        at org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
        at org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
        at org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown Source)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
        at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
        at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
        at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown Source)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
        at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
        at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
        at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown Source)
        at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
        at org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown Source)
        at scala.collection.immutable.List.foreach(List.scala:333)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
        at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
        at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
        at org.scalatest.Suite.run(Suite.scala:1112)
        at org.scalatest.Suite.run$(Suite.scala:1094)
        at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
        at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
        at org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown Source)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
        at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
        at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
        at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
        at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
        at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
        at org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
        at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
        at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
        at org.scalatest.Suite.run(Suite.scala:1109)
        at org.scalatest.Suite.run$(Suite.scala:1094)
        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
        at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
        at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
        at org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown Source)
        at scala.collection.immutable.List.foreach(List.scala:333)
        at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
        at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
        at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
        at org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown Source)
        at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
        at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
        at org.scalatest.tools.Runner$.main(Runner.scala:775)
        at org.scalatest.tools.Runner.main(Runner.scala)
```
 
I think the test case being executed is `SPARK-28323: PythonUDF should be able to use in join condition`. Does anyone have the same problem?

Yang Jie
  
From: huaxin gao <huaxin.ga...@gmail.com>
Date: Tuesday, November 15, 2022, 13:59
To: "L. C. Hsieh" <vii...@gmail.com>
Cc: Dongjoon Hyun <dongjoon.h...@gmail.com>, Chao Sun <sunc...@apache.org>, dev <dev@spark.apache.org>
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
 
  
+1

Thanks Chao!
   
On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vii...@gmail.com> wrote:
 
  
+1

Thanks Chao.
 
On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <sunc...@apache.org> wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===========================================
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
 
---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
