ericm-db commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1917878037
##
python/pyspark/sql/tests/pandas/test_pandas_transform_with_state.py:
##
@@ -1294,6 +1307,208 @@ def
test_transform_with_state_with_timers_single_partition(self):
zhengruifeng commented on code in PR #49525:
URL: https://github.com/apache/spark/pull/49525#discussion_r1917858681
##
dev/sparktestsupport/modules.py:
##
@@ -1116,7 +1116,7 @@ def __hash__(self):
"pyspark.ml.tests.connect.test_connect_classification",
"pyspark
zhengruifeng commented on code in PR #49525:
URL: https://github.com/apache/spark/pull/49525#discussion_r1917860839
##
python/pyspark/ml/tests/test_classification.py:
##
@@ -283,6 +293,318 @@ def test_logistic_regression(self):
except OSError:
pass
+d
yaooqinn commented on code in PR #49506:
URL: https://github.com/apache/spark/pull/49506#discussion_r1917855477
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:
##
@@ -1445,7 +1445,10 @@ object HiveExternalCatalog {
case _: AnsiIntervalType =>
zhengruifeng opened a new pull request, #49525:
URL: https://github.com/apache/spark/pull/49525
### What changes were proposed in this pull request?
Support Tree Classifiers in ML Connect
### Why are the changes needed?
For parity with Spark Classic.
### Does this PR
yaooqinn commented on PR #49506:
URL: https://github.com/apache/spark/pull/49506#issuecomment-2594693848
> Can we also update the comment in HiveClientImpl#getSparkSQLDataType?
I checked the comment and found it still suitable.
--
This is an automated message from the A
zhengruifeng commented on code in PR #49525:
URL: https://github.com/apache/spark/pull/49525#discussion_r1917861976
##
python/pyspark/ml/tests/test_classification.py:
##
@@ -283,6 +293,318 @@ def test_logistic_regression(self):
except OSError:
pass
+d
HeartSaVioR commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1917753742
##
python/pyspark/sql/tests/pandas/test_pandas_transform_with_state.py:
##
@@ -1294,6 +1307,208 @@ def
test_transform_with_state_with_timers_single_partition(self)
zhengruifeng commented on code in PR #49525:
URL: https://github.com/apache/spark/pull/49525#discussion_r1917861976
##
python/pyspark/ml/tests/test_classification.py:
##
@@ -283,6 +293,318 @@ def test_logistic_regression(self):
except OSError:
pass
+d
cloud-fan commented on code in PR #49351:
URL: https://github.com/apache/spark/pull/49351#discussion_r1917867370
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -41,16 +47,113 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
zhengruifeng commented on code in PR #49525:
URL: https://github.com/apache/spark/pull/49525#discussion_r1917861976
##
python/pyspark/ml/tests/test_classification.py:
##
@@ -283,6 +293,318 @@ def test_logistic_regression(self):
except OSError:
pass
+d
cloud-fan commented on code in PR #49351:
URL: https://github.com/apache/spark/pull/49351#discussion_r1917868339
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -41,16 +47,113 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
cloud-fan commented on code in PR #49351:
URL: https://github.com/apache/spark/pull/49351#discussion_r1917871010
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -41,16 +47,113 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
cloud-fan commented on code in PR #49351:
URL: https://github.com/apache/spark/pull/49351#discussion_r1917869246
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveWithCTE.scala:
##
@@ -41,16 +47,113 @@ object ResolveWithCTE extends Rule[LogicalPlan] {
dongjoon-hyun closed pull request #49483: [SPARK-50811] Support enabling JVM
profiler on driver
URL: https://github.com/apache/spark/pull/49483
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the sp
cloud-fan commented on PR #49506:
URL: https://github.com/apache/spark/pull/49506#issuecomment-2594627270
Can we also update the comment in `HiveClientImpl#getSparkSQLDataType`?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub
cloud-fan commented on PR #49452:
URL: https://github.com/apache/spark/pull/49452#issuecomment-2594754634
thanks, merging to master/4.0!
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specif
cloud-fan closed pull request #49452: [SPARK-50792][SQL] Format binary data as
a binary literal in JDBC.
URL: https://github.com/apache/spark/pull/49452
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go
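For context on what "format binary data as a binary literal" means in practice, below is a minimal, hedged Scala sketch of the kind of conversion involved: rendering an `Array[Byte]` as a hex literal such as `X'CAFEBABE'`. The `X'...'` form and the object/method names are illustrative assumptions, not the PR's actual per-dialect implementation.
```scala
// Minimal sketch only: render a byte array as a SQL hex binary literal, e.g. X'CAFEBABE'.
// The X'...' syntax and the helper name are assumptions for illustration; the PR's real
// formatting is dialect-specific and lives in the JDBC dialects.
object BinaryLiteralSketch {
  def format(bytes: Array[Byte]): String =
    bytes.map(b => f"${b & 0xFF}%02X").mkString("X'", "", "'")

  def main(args: Array[String]): Unit = {
    val bytes = Array[Byte](0xCA.toByte, 0xFE.toByte, 0xBA.toByte, 0xBE.toByte)
    println(format(bytes)) // prints X'CAFEBABE'
  }
}
```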
dongjoon-hyun commented on PR #49516:
URL: https://github.com/apache/spark/pull/49516#issuecomment-2593760864
Thank you, @huaxingao .
Merged to master.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL abo
dongjoon-hyun closed pull request #49516: [SPARK-50835][INFRA][FOLLOWUP] Use
Python 3.11 in `branch-4.0` `Python-only` Daily CI
URL: https://github.com/apache/spark/pull/49516
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and u
Ngone51 commented on PR #49413:
URL: https://github.com/apache/spark/pull/49413#issuecomment-2593001456
FYI I created a followup PR (https://github.com/apache/spark/pull/49508) to
use `TaskContext.createResourceUninterruptibly()` where it applies.
--
This is an automated message from the
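As a rough illustration of the idea behind `TaskContext.createResourceUninterruptibly()` (this is not Spark's implementation or signature), here is a self-contained Scala stand-in: resource creation and the interrupting path share a lock, so a task kill cannot interrupt the thread while a resource is half-created and leak it.
```scala
import java.io.Closeable

// Hypothetical stand-in, NOT Spark's API: sketches why creating a resource
// "uninterruptibly" helps. The kill path and the creation path synchronize on
// the same lock, so an interrupt is deferred until creation has completed and
// the resource can be tracked and closed normally.
final class ResourceGuardSketch {
  private val lock = new Object

  def createUninterruptibly[T <: Closeable](build: => T): T = lock.synchronized {
    build // an interrupt requested meanwhile is delivered only after this returns
  }

  def interruptDeferred(taskThread: Thread): Unit = lock.synchronized {
    taskThread.interrupt()
  }
}
```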
milastdbx commented on PR #49452:
URL: https://github.com/apache/spark/pull/49452#issuecomment-2593033934
Can you provide your take on this:
> My understanding is that this never produced any kind of results that customers could have adapted to, hence a flag is not required. Is this your
stevomitric commented on PR #49510:
URL: https://github.com/apache/spark/pull/49510#issuecomment-2593422142
cc @stefankandic and @dejankrak-db
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the
dongjoon-hyun opened a new pull request, #49517:
URL: https://github.com/apache/spark/pull/49517
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### H
Ngone51 opened a new pull request, #49508:
URL: https://github.com/apache/spark/pull/49508
### What changes were proposed in this pull request?
This is a follow-up PR for https://github.com/apache/spark/pull/49413. This
PR intends to apply `TaskContext.createResourceUninte
pan3793 commented on PR #49492:
URL: https://github.com/apache/spark/pull/49492#issuecomment-2593031373
With SPARK-50810, the profiler module compilation is covered by the regular [PR CI](https://github.com/pan3793/spark/actions/runs/12788664164/job/35650454602), and I also verified it in an int
mihailoale-db opened a new pull request, #49509:
URL: https://github.com/apache/spark/pull/49509
### What changes were proposed in this pull request?
Right now the fixed-point resolved plan is returned as the result of analyzing the unresolved plan. In this task we add a flag in order to guard
stefankandic commented on PR #49505:
URL: https://github.com/apache/spark/pull/49505#issuecomment-2593073600
@MaxGekk please take a look when you get the chance
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL a
Ngone51 commented on code in PR #49413:
URL: https://github.com/apache/spark/pull/49413#discussion_r1916744335
##
core/src/main/scala/org/apache/spark/BarrierTaskContext.scala:
##
@@ -273,6 +274,18 @@ class BarrierTaskContext private[spark] (
}
override private[spark] de
sunxiaoguang commented on PR #49452:
URL: https://github.com/apache/spark/pull/49452#issuecomment-2593063119
> Question: Prior to these changes, is there any scenario where some data sources might have returned something, although incorrect? Is there a chance that someone has adapted to thi
Kimahriman commented on PR #49005:
URL: https://github.com/apache/spark/pull/49005#issuecomment-2593356260
Gentle ping for potential inclusion in 4.0
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go
MaxGekk commented on PR #49505:
URL: https://github.com/apache/spark/pull/49505#issuecomment-2593480884
@stefankandic Thanks for the ping. Will look at it soon.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL a
dongjoon-hyun opened a new pull request, #49511:
URL: https://github.com/apache/spark/pull/49511
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### H
ericm-db commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1917042825
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/SchemaHelper.scala:
##
@@ -202,7 +203,7 @@ object SchemaHelper {
}
}
- class Schem
ericm-db commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1917043160
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateSchemaCompatibilityChecker.scala:
##
@@ -206,22 +206,23 @@ class StateSchemaCompatibilit
mridulm commented on code in PR #49479:
URL: https://github.com/apache/spark/pull/49479#discussion_r1917048583
##
common/kvstore/src/main/java/org/apache/spark/util/kvstore/LevelDB.java:
##
@@ -176,11 +176,13 @@ public void writeAll(List values) throws Exception {
final I
dongjoon-hyun opened a new pull request, #49512:
URL: https://github.com/apache/spark/pull/49512
…
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
dongjoon-hyun opened a new pull request, #49514:
URL: https://github.com/apache/spark/pull/49514
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### H
dongjoon-hyun opened a new pull request, #49515:
URL: https://github.com/apache/spark/pull/49515
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### H
MaxGekk commented on code in PR #49505:
URL: https://github.com/apache/spark/pull/49505#discussion_r1917116160
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -882,12 +882,26 @@ object SQLConf {
.booleanConf
.createWithDefault(Util
dusantism-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1917163806
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ColumnResolutionHelper.scala:
##
@@ -266,22 +268,40 @@ trait ColumnResolutionHelper extends L
dusantism-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1917164149
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/VariableManager.scala:
##
@@ -23,8 +23,39 @@ import scala.collection.mutable
import org.apa
dongjoon-hyun commented on code in PR #49501:
URL: https://github.com/apache/spark/pull/49501#discussion_r1916105392
##
sql/core/src/main/scala/org/apache/spark/sql/execution/WholeStageCodegenExec.scala:
##
@@ -485,8 +487,16 @@ trait InputRDDCodegen extends CodegenSupport {
dongjoon-hyun commented on code in PR #49501:
URL: https://github.com/apache/spark/pull/49501#discussion_r1916107165
##
sql/core/src/test/scala/org/apache/spark/sql/execution/HashAggregateCodegenInterruptionSuite.scala:
##
@@ -0,0 +1,100 @@
+/*
+ * Licensed to the Apache Softwar
LuciferYang commented on PR #49502:
URL: https://github.com/apache/spark/pull/49502#issuecomment-2591985909
> The code seems to be correct, but let's submit a temporary code or comment
change to trigger the compilation of the `profiler` module. We can revert the
change after successfully ve
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916278553
##
sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseParser.g4:
##
@@ -79,6 +81,29 @@ setStatementWithOptionalVarKeyword
LEFT_PAREN query R
pan3793 opened a new pull request, #49502:
URL: https://github.com/apache/spark/pull/49502
### What changes were proposed in this pull request?
Fix code change detection for the `profiler` module in GHA workflow scripts.
### Why are the changes needed?
Without thi
HeartSaVioR commented on code in PR #49479:
URL: https://github.com/apache/spark/pull/49479#discussion_r1916151008
##
common/kvstore/src/main/java/org/apache/spark/util/kvstore/LevelDB.java:
##
@@ -176,11 +176,13 @@ public void writeAll(List values) throws Exception {
fin
pan3793 commented on PR #49502:
URL: https://github.com/apache/spark/pull/49502#issuecomment-2591966495
cc @dongjoon-hyun and @LuciferYang, sorry for missing that.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
UR
LuciferYang commented on PR #49502:
URL: https://github.com/apache/spark/pull/49502#issuecomment-2591970640
The code seems to be correct, but let's submit a temporary code or comment
change to trigger the compilation of the `profiler` module. We can revert the
change after successfully veri
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916306452
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala:
##
@@ -159,15 +159,97 @@ class AstBuilder extends DataTypeAstBuilder
script
beliefer commented on code in PR #49452:
URL: https://github.com/apache/spark/pull/49452#discussion_r1916306030
##
sql/core/src/main/scala/org/apache/spark/sql/jdbc/OracleDialect.scala:
##
@@ -61,6 +61,34 @@ private case class OracleDialect() extends JdbcDialect with
SQLConfHel
sunxiaoguang commented on code in PR #49452:
URL: https://github.com/apache/spark/pull/49452#discussion_r1916227472
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/V2JDBCTest.scala:
##
@@ -986,4 +986,39 @@ private[v2] trait V2JDBCTest extends Sh
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916321322
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala:
##
@@ -159,15 +159,97 @@ class AstBuilder extends DataTypeAstBuilder
script
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916326005
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala:
##
@@ -159,15 +159,97 @@ class AstBuilder extends DataTypeAstBuilder
script
sunxiaoguang commented on code in PR #49452:
URL: https://github.com/apache/spark/pull/49452#discussion_r1916238144
##
sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCV2Suite.scala:
##
@@ -3097,4 +3097,19 @@ class JDBCV2Suite extends QueryTest with
SharedSparkSession with
sunxiaoguang commented on code in PR #49452:
URL: https://github.com/apache/spark/pull/49452#discussion_r1916244276
##
sql/core/src/main/scala/org/apache/spark/sql/jdbc/OracleDialect.scala:
##
@@ -61,6 +61,34 @@ private case class OracleDialect() extends JdbcDialect with
SQLCon
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916342658
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SqlScriptingLogicalPlans.scala:
##
@@ -298,3 +303,53 @@ case class ForStatement(
For
miland-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916345852
##
sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseParser.g4:
##
@@ -79,6 +81,29 @@ setStatementWithOptionalVarKeyword
LEFT_PAREN query R
Ngone51 commented on code in PR #49413:
URL: https://github.com/apache/spark/pull/49413#discussion_r1916546391
##
core/src/main/scala/org/apache/spark/TaskContextImpl.scala:
##
@@ -82,6 +83,13 @@ private[spark] class TaskContextImpl(
// If defined, the corresponding task has
davidm-db commented on code in PR #49427:
URL: https://github.com/apache/spark/pull/49427#discussion_r1916568838
##
sql/core/src/main/scala/org/apache/spark/sql/scripting/SqlScriptingExecutionContext.scala:
##
@@ -81,12 +107,79 @@ class SqlScriptingExecutionFrame(
scopes.
LuciferYang closed pull request #49502: [SPARK-50810][BUILD][FOLLOWUP] Fix
code change detection for profiler module
URL: https://github.com/apache/spark/pull/49502
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL
LuciferYang commented on PR #49502:
URL: https://github.com/apache/spark/pull/49502#issuecomment-2592741473
Merged into master. Thanks @pan3793
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to th
jovanpavl-db commented on PR #49510:
URL: https://github.com/apache/spark/pull/49510#issuecomment-2593500342
> LGTM, I would also like a confirmation from @jovanpavl-db just to double
check that there are no blockers for enabling this by default
Yes, ready to go.
--
This is an auto
mridulm commented on code in PR #49479:
URL: https://github.com/apache/spark/pull/49479#discussion_r1917048583
##
common/kvstore/src/main/java/org/apache/spark/util/kvstore/LevelDB.java:
##
@@ -176,11 +176,13 @@ public void writeAll(List values) throws Exception {
final I
dongjoon-hyun commented on PR #49495:
URL: https://github.com/apache/spark/pull/49495#issuecomment-2593466916
Thank you, @HyukjinKwon , @cloud-fan , @LuciferYang , @panbingkun .
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub
asl3 opened a new pull request, #49513:
URL: https://github.com/apache/spark/pull/49513
### What changes were proposed in this pull request?
When storing table metadata in the LinkedHashMap object, we retain the `long` timestamp type to allow flexibility and extensibility
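A tiny, hedged Scala sketch of the design choice described above; the map keys and values here are made up for illustration and are not the PR's actual fields. The point is that the timestamp stays a raw epoch-millis `Long` inside the ordered map and is formatted only at display time.
```scala
import scala.collection.mutable.LinkedHashMap

object TableMetadataSketch {
  // Illustrative only: keys and values are invented for this sketch. Keeping the
  // timestamp as a Long lets callers format, convert, or compare it later instead
  // of receiving a pre-rendered string.
  def main(args: Array[String]): Unit = {
    val tableMetadata = LinkedHashMap[String, Any](
      "Table" -> "sales",
      "Created Time" -> System.currentTimeMillis() // retained as Long
    )
    val createdAt =
      java.time.Instant.ofEpochMilli(tableMetadata("Created Time").asInstanceOf[Long])
    println(s"Created Time: $createdAt") // formatted only when displayed
  }
}
```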
dongjoon-hyun commented on PR #49511:
URL: https://github.com/apache/spark/pull/49511#issuecomment-2593676647
Thank you, @huaxingao !
Merged to branch-4.0
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL
dongjoon-hyun closed pull request #49514: [SPARK-50834][INFRA] Add Daily Github
Action CI to `branch-4.0`
URL: https://github.com/apache/spark/pull/49514
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go
dongjoon-hyun closed pull request #49511: [SPARK-50832][INFRA][4.0] Add GitHub
Action jobs for branch-4.0
URL: https://github.com/apache/spark/pull/49511
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go
dusantism-db commented on code in PR #49445:
URL: https://github.com/apache/spark/pull/49445#discussion_r1917164508
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/VariableManager.scala:
##
@@ -23,8 +23,39 @@ import scala.collection.mutable
import org.apa
dongjoon-hyun closed pull request #49515: [SPARK-50835][INFRA] Add
`Python-only` Daily CI to branch-4.0
URL: https://github.com/apache/spark/pull/49515
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go t
dongjoon-hyun commented on PR #49514:
URL: https://github.com/apache/spark/pull/49514#issuecomment-2593682772
Thank you, @huaxingao !
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific
dongjoon-hyun commented on PR #49515:
URL: https://github.com/apache/spark/pull/49515#issuecomment-2593685181
Thank you, @huaxingao .
Merged to master.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL abo
dongjoon-hyun commented on PR #49514:
URL: https://github.com/apache/spark/pull/49514#issuecomment-2593692440
For the record, this is applied successfully.
- https://github.com/apache/spark/actions/workflows/build_branch40.yml

.version("4.0.0")
.bool
dongjoon-hyun commented on code in PR #49510:
URL: https://github.com/apache/spark/pull/49510#discussion_r1917400275
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -880,7 +880,7 @@ object SQLConf {
)
.version("4.0.0")
.bool