davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1925131311
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingExecutionContext.scala:
##
@@ -50,12 +51,24 @@ class SqlScriptingExecutionContext {
throw
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1925132255
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingExecutionContext.scala:
##
@@ -111,15 +124,32 @@ class SqlScriptingExecutionFrame(
}
}
MaxGekk commented on code in PR #49598:
URL: https://github.com/apache/spark/pull/49598#discussion_r1925003554
##
common/utils/src/main/resources/error/error-conditions.json:
##
@@ -1229,6 +1229,12 @@
],
"sqlState" : "42710"
},
+ "DUPLICATED_ARTIFACT" : {
Review C
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1925099480
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingExecutionContext.scala:
##
@@ -27,7 +27,8 @@ import
org.apache.spark.sql.scripting.SqlScripting
yaooqinn commented on code in PR #49602:
URL: https://github.com/apache/spark/pull/49602#discussion_r1925101343
##
sql/api/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:
##
@@ -16,7 +16,7 @@
*/
package org.apache.spark.sql
-import java.util
+import java.util.{Lo
HyukjinKwon commented on PR #49602:
URL: https://github.com/apache/spark/pull/49602#issuecomment-2606966479
I used a slightly different approach :-). Thanks for the comments.
HyukjinKwon commented on code in PR #49602:
URL: https://github.com/apache/spark/pull/49602#discussion_r1925146776
##
sql/api/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:
##
@@ -139,7 +139,7 @@ abstract class DataFrameWriter[T] {
*
* @since 1.4.0
*/
-
LuciferYang commented on PR #49599:
URL: https://github.com/apache/spark/pull/49599#issuecomment-2607633085
Thanks @MaxGekk
LuciferYang commented on PR #49592:
URL: https://github.com/apache/spark/pull/49592#issuecomment-2607641305
>
 -> "SparkContext":
raise RuntimeError("Accessing SparkContext is not supported o
cloud-fan commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925348974
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CTESubstitution.scala:
##
@@ -151,7 +157,7 @@ object CTESubstitution extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925339756
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
cloud-fan commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925351425
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
cloud-fan commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925352139
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925363913
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CTESubstitution.scala:
##
@@ -151,7 +157,7 @@ object CTESubstitution extends Rule[LogicalPlan]
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925606130
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925610123
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925616520
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala:
##
@@ -1042,6 +1043,75 @@ trait CheckAnalysis extends PredicateHelper with
dongjoon-hyun commented on PR #49599:
URL: https://github.com/apache/spark/pull/49599#issuecomment-2607713012
Thank you, @LuciferYang and @MaxGekk .
LuciferYang commented on code in PR #49599:
URL: https://github.com/apache/spark/pull/49599#discussion_r1925139317
##
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:
##
@@ -526,10 +526,18 @@ class UtilsSuite extends SparkFunSuite with
ResetSystemProperties {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925274283
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925274874
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CTESubstitution.scala:
##
@@ -423,4 +429,20 @@ object CTESubstitution extends Rule[LogicalPlan]
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925279434
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925280827
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -183,4 +184,52 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925291147
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/cteOperators.scala:
##
@@ -100,12 +100,14 @@ case class CTERelationDef(
override def
milanisvet commented on code in PR #49518:
URL: https://github.com/apache/spark/pull/49518#discussion_r1925294587
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CTESubstitution.scala:
##
@@ -151,7 +157,7 @@ object CTESubstitution extends Rule[LogicalPlan]
LuciferYang commented on code in PR #49602:
URL: https://github.com/apache/spark/pull/49602#discussion_r1925193660
##
sql/api/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:
##
@@ -139,7 +139,7 @@ abstract class DataFrameWriter[T] {
*
* @since 1.4.0
*/
-
zhengruifeng commented on PR #49600:
URL: https://github.com/apache/spark/pull/49600#issuecomment-2607079366
merged to master/4.0
zhengruifeng closed pull request #49600: [SPARK-50948][ML][PYTHON][CONNECT] Add
support for StringIndexer/PCA on Connect
URL: https://github.com/apache/spark/pull/49600
MaxGekk commented on PR #49599:
URL: https://github.com/apache/spark/pull/49599#issuecomment-2607548264
+1, LGTM. Merging to master/4.0.
Thank you, @LuciferYang.
MaxGekk closed pull request #49599: [SPARK-50946][CORE][TESTS] Add version
check for Java 17.0.14 to make `UtilsSuite` test pass
URL: https://github.com/apache/spark/pull/49599
dusantism-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1925532381
##
sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/CatalogManager.scala:
##
@@ -49,6 +49,18 @@ class CatalogManager(
// TODO: create a real S
LuciferYang commented on code in PR #49601:
URL: https://github.com/apache/spark/pull/49601#discussion_r1925237252
##
sql/connect/server/src/main/scala/org/apache/spark/sql/connect/ml/MLUtils.scala:
##
@@ -448,98 +454,149 @@ private[ml] object MLUtils {
// Since we're using r
zhengruifeng commented on code in PR #49601:
URL: https://github.com/apache/spark/pull/49601#discussion_r1925254904
##
sql/connect/server/src/main/scala/org/apache/spark/sql/connect/ml/MLUtils.scala:
##
@@ -448,98 +454,149 @@ private[ml] object MLUtils {
// Since we're using
MaxGekk commented on code in PR #49599:
URL: https://github.com/apache/spark/pull/49599#discussion_r1925067453
##
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:
##
@@ -526,10 +526,18 @@ class UtilsSuite extends SparkFunSuite with
ResetSystemProperties {
// T
cloud-fan commented on code in PR #48818:
URL: https://github.com/apache/spark/pull/48818#discussion_r1925413987
##
sql/api/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##
@@ -776,40 +779,161 @@ abstract class SparkSession extends Serializable with
Closeable {
*
cloud-fan commented on code in PR #48818:
URL: https://github.com/apache/spark/pull/48818#discussion_r1925409046
##
project/MimaExcludes.scala:
##
@@ -205,10 +205,30 @@ object MimaExcludes {
// SPARK-50112: Moving avro files from connector to sql/core
ProblemFilters.ex
cloud-fan commented on PR #49536:
URL: https://github.com/apache/spark/pull/49536#issuecomment-2607406371
thanks, merging to master!
cloud-fan closed pull request #49536: [SPARK-49646][SQL] add spark config for
fixing subquery decorrelation for union/set operations when
parentOuterReferences has references not covered in
collectedChildOuterReferences
URL: https://github.com/apache/spark/pull/49536
LuciferYang commented on PR #48611:
URL: https://github.com/apache/spark/pull/48611#issuecomment-2606699186
> javax
If time permits, I suggest we try to migrate them to jakarta before the Spark 4.0 release.
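For illustration only, a minimal sketch of what such a namespace migration looks like; the servlet API below is a common example, not necessarily one of the javax dependencies PR #48611 actually discusses:
```
// Hypothetical example: only the EE package prefix changes, call sites stay the same.
import jakarta.servlet.http.HttpServletRequest // previously: javax.servlet.http.HttpServletRequest

object AuthUtils {
  // getRemoteUser may return null, so wrap it in an Option at the boundary.
  def remoteUser(req: HttpServletRequest): Option[String] = Option(req.getRemoteUser)
}
```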
stefankandic opened a new pull request, #49603:
URL: https://github.com/apache/spark/pull/49603
Creating a new PR for #49576 against the newly cut 4.0 branch.
### What changes were proposed in this pull request?
Introducing a new interface `DefaultStringProducingExpression` which should
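For context, a heavily hedged sketch of what a marker trait like this could look like; the member shown here is an assumption for illustration, not the PR's actual definition:
```
import org.apache.spark.sql.catalyst.expressions.Expression
import org.apache.spark.sql.types.{DataType, StringType}

// Assumption: expressions mixing this in always produce the default string type,
// regardless of the collation of their inputs.
trait DefaultStringProducingExpression extends Expression {
  override def dataType: DataType = StringType
}
```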
HyukjinKwon opened a new pull request, #49602:
URL: https://github.com/apache/spark/pull/49602
### What changes were proposed in this pull request?
This PR avoids importing the bare `java.util` package in DataFrameWriter.
### Why are the changes needed?
Using `util` in the codebase is confusing.
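A minimal sketch of the style the change moves toward, importing only the needed classes so bare `util.X` references (easily confused with `scala.util` or `org.apache.spark.util`) disappear; `Locale` is just an illustrative pick consistent with the truncated diff above:
```
import java.util.Locale

object WriterOptionKeys {
  // Normalize option keys the same way regardless of the JVM's default locale.
  def normalize(key: String): String = key.toLowerCase(Locale.ROOT)
}
```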
zhengruifeng commented on code in PR #49601:
URL: https://github.com/apache/spark/pull/49601#discussion_r1925372177
##
sql/connect/server/src/test/scala/org/apache/spark/sql/connect/ml/MLHelper.scala:
##
@@ -298,7 +299,7 @@ class MyLogisticRegressionModel(
override val uid:
zhengruifeng commented on code in PR #49601:
URL: https://github.com/apache/spark/pull/49601#discussion_r1925370678
##
sql/connect/server/src/main/scala/org/apache/spark/sql/connect/ml/MLUtils.scala:
##
@@ -448,98 +454,149 @@ private[ml] object MLUtils {
// Since we're using
vicennial opened a new pull request, #49604:
URL: https://github.com/apache/spark/pull/49604
### What changes were proposed in this pull request?
This PR adds a sample project, `server-library-example` (under a new
directory `connect-examples`) to demonstrate the workings of u
cloud-fan commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1925386521
##
sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/CatalogManager.scala:
##
@@ -49,6 +49,18 @@ class CatalogManager(
// TODO: create a real SYST
vladimirg-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1925392103
##
sql/catalyst/src/main/scala/org/apache/spark/sql/connector/catalog/CatalogManager.scala:
##
@@ -49,6 +49,18 @@ class CatalogManager(
// TODO: create a real S
dongjoon-hyun commented on PR #49599:
URL: https://github.com/apache/spark/pull/49599#issuecomment-2607766964
I backported this to `branch-3.5` too.
milanisvet commented on code in PR #49571:
URL: https://github.com/apache/spark/pull/49571#discussion_r1925763851
##
sql/core/src/main/scala/org/apache/spark/sql/execution/basicPhysicalOperators.scala:
##
@@ -714,6 +718,147 @@ case class UnionExec(children: Seq[SparkPlan]) exten
dongjoon-hyun commented on code in PR #49606:
URL: https://github.com/apache/spark/pull/49606#discussion_r1925762757
##
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:
##
@@ -525,10 +525,14 @@ class UtilsSuite extends SparkFunSuite with
ResetSystemProperties {
LuciferYang commented on code in PR #49606:
URL: https://github.com/apache/spark/pull/49606#discussion_r1925760510
##
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:
##
@@ -525,10 +525,14 @@ class UtilsSuite extends SparkFunSuite with
ResetSystemProperties {
jingz-db commented on PR #49156:
URL: https://github.com/apache/spark/pull/49156#issuecomment-2607941901
Sorry @HyukjinKwon, what are the old dependencies you are referring to here? I
think this suite is flaky because of a timing issue. I'll take the ticket and fix
the flakiness soon.
harshmotw-db commented on code in PR #49591:
URL: https://github.com/apache/spark/pull/49591#discussion_r1925867531
##
python/pyspark/sql/types.py:
##
@@ -1478,6 +1478,9 @@ def toInternal(self, obj: Tuple) -> Tuple:
if obj is None:
return
+if isin
davidm-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1925883386
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingExecution.scala:
##
@@ -55,7 +55,15 @@ class SqlScriptingExecution(
}
private val variab
davidm-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1925889523
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingExecution.scala:
##
@@ -55,7 +55,15 @@ class SqlScriptingExecution(
}
private val variab
xinrong-meng commented on PR #49472:
URL: https://github.com/apache/spark/pull/49472#issuecomment-2608154024
Thank you @ueshin for fixing that!
dongjoon-hyun commented on PR #49488:
URL: https://github.com/apache/spark/pull/49488#issuecomment-2608157455
> @jingz-db - let's update the PR description to mention that this only
covers support in Scala. Thx
Just a side question: do you have any Spark committers in mind to get reviews for this PR?
davidm-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1925894712
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SqlScriptingVariableManager.scala:
##
@@ -0,0 +1,25 @@
+/*
+ * Licensed to the Apache Software Foundation
anishshri-db commented on PR #49488:
URL: https://github.com/apache/spark/pull/49488#issuecomment-2608160079
> Do you have some Spark committers in mind to get reviews for this PR?
Yes - cc @HeartSaVioR - PTAL also, thx!
dongjoon-hyun commented on PR #49606:
URL: https://github.com/apache/spark/pull/49606#issuecomment-2608165800
Merged to branch-3.5.
dongjoon-hyun commented on PR #49606:
URL: https://github.com/apache/spark/pull/49606#issuecomment-2607967237
Thank you, @LuciferYang !
dongjoon-hyun closed pull request #49606: [SPARK-50946][CORE][TESTS][3.5] Add
version check for Java 17.0.14 to make `UtilsSuite` test pass
URL: https://github.com/apache/spark/pull/49606
dongjoon-hyun opened a new pull request, #49608:
URL: https://github.com/apache/spark/pull/49608
…
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
dongjoon-hyun commented on PR #49482:
URL: https://github.com/apache/spark/pull/49482#issuecomment-2608097024
Gentle ping, @xinrong-meng .
If this is targeting Apache Spark 4.0, we had better have this before
February 1st.
- https://spark.apache.org/versioning-policy.html
dongjoon-hyun commented on PR #49606:
URL: https://github.com/apache/spark/pull/49606#issuecomment-2608087924
Thank you, @MaxGekk
dongjoon-hyun commented on PR #49488:
URL: https://github.com/apache/spark/pull/49488#issuecomment-2608106885
cc @hvanhovell
anishshri-db commented on PR #49488:
URL: https://github.com/apache/spark/pull/49488#issuecomment-2608109177
@jingz-db - let's update the PR description to mention that this only covers
support in Scala. Thx
dongjoon-hyun closed pull request #49605: [SPARK-50951][BUILD][TESTS] Update
Oracle free version from 23.5 to 23.6
URL: https://github.com/apache/spark/pull/49605
milanisvet commented on code in PR #49571:
URL: https://github.com/apache/spark/pull/49571#discussion_r1925831054
##
sql/core/src/main/scala/org/apache/spark/sql/execution/basicPhysicalOperators.scala:
##
@@ -714,6 +718,147 @@ case class UnionExec(children: Seq[SparkPlan]) exten
dongjoon-hyun closed pull request #49608: [SPARK-50952][BUILD] Include
`jjwt`-related libraries and provide `jjwt-provided` profile
URL: https://github.com/apache/spark/pull/49608
dongjoon-hyun commented on PR #49608:
URL: https://github.com/apache/spark/pull/49608#issuecomment-2608358968
Merged to master/4.0.
miland-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1926044733
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingInterpreter.scala:
##
@@ -63,6 +67,79 @@ case class SqlScriptingInterpreter(session: SparkSessio
miland-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1926048316
##
sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseParser.g4:
##
@@ -79,6 +81,29 @@ setStatementWithOptionalVarKeyword
LEFT_PAREN query R
dongjoon-hyun opened a new pull request, #49606:
URL: https://github.com/apache/spark/pull/49606
### What changes were proposed in this pull request?
This PR adds Java version checks and logic adaptations to `scenario6` in the
test case named "SPARK-35907: createDirectory" within `UtilsSuite`.
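A hedged sketch of the kind of version gate this describes; the real test's structure and expected values in `UtilsSuite` may differ:
```
// Runtime.version() is a Java 9+ API, which is fine on master/4.0 (Java 17+ required).
val v = Runtime.version()
val isJava17_0_14OrLater = v.feature() > 17 || (v.feature() == 17 && v.update() >= 14)
if (isJava17_0_14OrLater) {
  // assert the behavior observed on JDK 17.0.14 and newer
} else {
  // keep the assertion that older JDKs satisfy
}
```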
dongjoon-hyun commented on PR #49599:
URL: https://github.com/apache/spark/pull/49599#issuecomment-2607889075
I made a backport PR to branch-3.5 by replacing `Runtime.Version` (Java 9+):
- https://github.com/apache/spark/pull/49606
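A hedged sketch of a Java-8-compatible alternative to `Runtime.Version`, parsing the `java.version` system property instead; the backport's exact parsing may differ:
```
// Handles both "17.0.14" (Java 9+) and "1.8.0_312" (Java 8) style version strings.
val parts = System.getProperty("java.version").split("[._]")
val feature = parts.headOption.flatMap(p => scala.util.Try(p.toInt).toOption).getOrElse(0)
val update = if (parts.length > 2) scala.util.Try(parts(2).toInt).toOption.getOrElse(0) else 0
val isJava17_0_14OrLater = feature > 17 || (feature == 17 && update >= 14)
```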
milanisvet commented on code in PR #49571:
URL: https://github.com/apache/spark/pull/49571#discussion_r1925698021
##
sql/core/src/main/scala/org/apache/spark/sql/execution/basicPhysicalOperators.scala:
##
@@ -714,6 +718,147 @@ case class UnionExec(children: Seq[SparkPlan]) exten
milanisvet commented on code in PR #49571:
URL: https://github.com/apache/spark/pull/49571#discussion_r1925709323
##
sql/core/src/main/scala/org/apache/spark/sql/execution/basicPhysicalOperators.scala:
##
@@ -714,6 +718,147 @@ case class UnionExec(children: Seq[SparkPlan]) exten
milanisvet commented on code in PR #49571:
URL: https://github.com/apache/spark/pull/49571#discussion_r1925710464
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -4520,6 +4520,31 @@ object SQLConf {
.checkValues(LegacyBehaviorPolicy.values.
stefankandic opened a new pull request, #49607:
URL: https://github.com/apache/spark/pull/49607
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### Ho
dongjoon-hyun commented on PR #49606:
URL: https://github.com/apache/spark/pull/49606#issuecomment-2607887517
cc @LuciferYang and @MaxGekk
wengh commented on code in PR #49535:
URL: https://github.com/apache/spark/pull/49535#discussion_r1925772564
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -3459,6 +3459,15 @@ object SQLConf {
.checkValues(Set("legacy", "row", "dict"))
dongjoon-hyun commented on PR #49606:
URL: https://github.com/apache/spark/pull/49606#issuecomment-2607958939
Here is the manual test.
```
$ java -version
openjdk version "1.8.0_312"
OpenJDK Runtime Environment AppleJDK-8.0.312.7.1 (build 1.8.0_312-b07)
OpenJDK 64-Bit Server VM
LucaCanali commented on PR #49605:
URL: https://github.com/apache/spark/pull/49605#issuecomment-2608217614
Thank you @dongjoon-hyun
dongjoon-hyun commented on PR #49608:
URL: https://github.com/apache/spark/pull/49608#issuecomment-2608241619
Could you review this when you have some time, @viirya ?
HyukjinKwon commented on PR #48665:
URL: https://github.com/apache/spark/pull/48665#issuecomment-2608525172
I am debugging the flakiness, and it seems like this change makes the tests flaky
for some reason. I will revert this first.
anishshri-db commented on PR #49488:
URL: https://github.com/apache/spark/pull/49488#issuecomment-2608542409
LGTM pending green CI
wbo4958 commented on code in PR #49601:
URL: https://github.com/apache/spark/pull/49601#discussion_r1926192056
##
sql/connect/server/src/main/scala/org/apache/spark/sql/connect/ml/MLUtils.scala:
##
@@ -448,98 +454,149 @@ private[ml] object MLUtils {
// Since we're using refle
wengh commented on code in PR #49535:
URL: https://github.com/apache/spark/pull/49535#discussion_r1926225302
##
python/pyspark/util.py:
##
@@ -468,16 +468,19 @@ def handle_worker_exception(e: BaseException, outfile:
IO) -> None:
and exception traceback info to outfile. JVM
HyukjinKwon commented on PR #49591:
URL: https://github.com/apache/spark/pull/49591#issuecomment-2608654196
Merged to master and branch-4.0.
HyukjinKwon closed pull request #49591: [SPARK-50815][FOLLOW-UP] Handle
Variant-related edge-case for createDataFrame
URL: https://github.com/apache/spark/pull/49591
LuciferYang commented on code in PR #49606:
URL: https://github.com/apache/spark/pull/49606#discussion_r1925746802
##
core/src/test/scala/org/apache/spark/util/UtilsSuite.scala:
##
@@ -525,10 +525,14 @@ class UtilsSuite extends SparkFunSuite with
ResetSystemProperties {
ueshin commented on PR #49592:
URL: https://github.com/apache/spark/pull/49592#issuecomment-2608006451
@LuciferYang Thanks for the fix! Let me merge it and rerun tests.
dongjoon-hyun commented on PR #49501:
URL: https://github.com/apache/spark/pull/49501#issuecomment-2608007898
Gentle ping, @Ngone51 ~
HyukjinKwon commented on PR #49602:
URL: https://github.com/apache/spark/pull/49602#issuecomment-2608421968
ah it's done in a different way. I will fix the linter and merge. Thanks
guys!
ctring commented on code in PR #49559:
URL: https://github.com/apache/spark/pull/49559#discussion_r1926074852
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/v2AlterTableCommands.scala:
##
@@ -201,45 +201,52 @@ case class RenameColumn(
copy(table
HyukjinKwon closed pull request #49602: [MINOR][SQL] Avoid importing java.util
at DataFrameWriter
URL: https://github.com/apache/spark/pull/49602