LuciferYang commented on PR #50264:
URL: https://github.com/apache/spark/pull/50264#issuecomment-2720361675
Merged into branch-4.0. Thanks @pan3793
srielau commented on PR #45593:
URL: https://github.com/apache/spark/pull/45593#issuecomment-2720341278
> It could be extended to any comma-separated list, such as the `SELECT`
> clause (rejected here: #48961) and CTEs.
>
> I hope this will be reconsidered. DuckDB, Snowflake but also th
LuciferYang commented on PR #50249:
URL: https://github.com/apache/spark/pull/50249#issuecomment-2720286766
Thank you @dongjoon-hyun and @beliefer
beliefer commented on PR #50265:
URL: https://github.com/apache/spark/pull/50265#issuecomment-2720230178
@wForget @viirya Thanks
Merged into branch-3.5
LuciferYang closed pull request #50264: [SPARK-51466][SQL][HIVE][4.0] Eliminate
Hive built-in UDFs initialization on Hive UDF evaluation
URL: https://github.com/apache/spark/pull/50264
anishshri-db commented on code in PR #50168:
URL: https://github.com/apache/spark/pull/50168#discussion_r1993203757
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStore.scala:
##
@@ -851,14 +853,16 @@ object StateStore extends Logging {
thr
anishshri-db commented on PR #50168:
URL: https://github.com/apache/spark/pull/50168#issuecomment-2720722448
> Looks like the config we are adding is confusing since the user can't reason
> about 5x of the timeout they have set.
@HeartSaVioR - Updated the PR. PTAL, thanks!
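For context, a minimal standalone sketch of the reviewer's point (the config names and numbers below are made up for illustration and are not the PR's actual settings): an absolute timeout is exactly what the user set, while a multiplier config only yields an effective value after an extra computation, which is harder to reason about.

```scala
object TimeoutConfigSketch {
  // Absolute config: the value the user sets is the value that applies.
  val shutdownTimeoutMs: Long = 300000L

  // Multiplier config: the user has to combine it with another setting
  // to know the effective timeout.
  val baseTimeoutMs: Long = 60000L
  val shutdownTimeoutMultiplier: Int = 5
  val effectiveTimeoutMs: Long = baseTimeoutMs * shutdownTimeoutMultiplier

  def main(args: Array[String]): Unit = {
    println(s"absolute: $shutdownTimeoutMs ms")
    println(s"derived:  $effectiveTimeoutMs ms (= $baseTimeoutMs ms * $shutdownTimeoutMultiplier)")
  }
}
```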
beliefer commented on PR #50143:
URL: https://github.com/apache/spark/pull/50143#issuecomment-2723461861
@cloud-fan Thank you!
dongjoon-hyun commented on PR #17:
URL:
https://github.com/apache/spark-connect-swift/pull/17#issuecomment-2723329511
Thank you for helping move this forward, @viirya!
Merged to main.
dongjoon-hyun commented on code in PR #17:
URL:
https://github.com/apache/spark-connect-swift/pull/17#discussion_r1994561481
##
Sources/SparkConnect/SparkConnectClient.swift:
##
@@ -275,9 +275,11 @@ public actor SparkConnectClient {
let expressions: [Spark_Connect_Expressi
cnauroth commented on PR #50223:
URL: https://github.com/apache/spark/pull/50223#issuecomment-2723329648
Thank you for the reviews and the commit, everyone.
dongjoon-hyun commented on code in PR #17:
URL:
https://github.com/apache/spark-connect-swift/pull/17#discussion_r1994641502
##
Sources/SparkConnect/DataFrame.swift:
##
@@ -153,16 +153,35 @@ public actor DataFrame: Sendable {
let arrowResult = ArrowReader.makeArrow
dongjoon-hyun closed pull request #17: [SPARK-51508] Support `collect():
[[String?]]` for `DataFrame`
URL: https://github.com/apache/spark-connect-swift/pull/17
cloud-fan commented on PR #50143:
URL: https://github.com/apache/spark/pull/50143#issuecomment-2723361616
thanks, merging to master!
dongjoon-hyun commented on PR #50274:
URL: https://github.com/apache/spark/pull/50274#issuecomment-2723616696
Thank you, @yaooqinn and @ShreyeshArangath
dongjoon-hyun commented on PR #18:
URL:
https://github.com/apache/spark-connect-swift/pull/18#issuecomment-2723694577
Could you review this test framework PR when you have some time, @LuciferYang?
dongjoon-hyun commented on PR #18:
URL:
https://github.com/apache/spark-connect-swift/pull/18#issuecomment-2723809582
Thank you, @LuciferYang !
dongjoon-hyun closed pull request #18: [SPARK-51510] Add SQL-file based
`SQLTests` suite
URL: https://github.com/apache/spark-connect-swift/pull/18
HeartSaVioR commented on code in PR #50195:
URL: https://github.com/apache/spark/pull/50195#discussion_r1994957715
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -2233,6 +2233,19 @@ object SQLConf {
.intConf
.createWithDefault(10)
HeartSaVioR commented on code in PR #50195:
URL: https://github.com/apache/spark/pull/50195#discussion_r1994957313
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -2233,6 +2233,19 @@ object SQLConf {
.intConf
.createWithDefault(10)
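For readers unfamiliar with the snippet under review, SQLConf entries are declared with the builder pattern visible in the diff above. The following is a hedged sketch of that pattern only; the key, doc text, and version are placeholders rather than the config this PR adds, and such a declaration lives inside `object SQLConf`, which is where `buildConf` is defined.

```scala
// Placeholder SQLConf entry, written in the same builder style as the diff above.
// This belongs inside object SQLConf in SQLConf.scala; buildConf is not public API.
val EXAMPLE_THRESHOLD = buildConf("spark.sql.example.threshold")
  .doc("Placeholder description of what the threshold controls.")
  .version("4.1.0")
  .intConf
  .checkValue(_ > 0, "The threshold must be positive.")
  .createWithDefault(10)
```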
viirya commented on code in PR #17:
URL:
https://github.com/apache/spark-connect-swift/pull/17#discussion_r1994639317
##
Sources/SparkConnect/DataFrame.swift:
##
@@ -153,16 +153,35 @@ public actor DataFrame: Sendable {
let arrowResult = ArrowReader.makeArrowReaderR
dongjoon-hyun commented on code in PR #17:
URL:
https://github.com/apache/spark-connect-swift/pull/17#discussion_r1994640208
##
Sources/SparkConnect/DataFrame.swift:
##
@@ -153,16 +153,35 @@ public actor DataFrame: Sendable {
let arrowResult = ArrowReader.makeArrow
dongjoon-hyun commented on code in PR #49501:
URL: https://github.com/apache/spark/pull/49501#discussion_r1994671983
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -2048,6 +2048,16 @@ object SQLConf {
.checkValue(threshold => threshold > 0
dongjoon-hyun opened a new pull request, #50274:
URL: https://github.com/apache/spark/pull/50274
…
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
dongjoon-hyun commented on PR #50274:
URL: https://github.com/apache/spark/pull/50274#issuecomment-2723425549
Could you review this PR, @yaooqinn? It would be great if we could have this
nice feature in `Spark Master` too.
dongjoon-hyun commented on code in PR #49501:
URL: https://github.com/apache/spark/pull/49501#discussion_r1994669214
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -2048,6 +2048,16 @@ object SQLConf {
.checkValue(threshold => threshold > 0
cloud-fan commented on code in PR #49908:
URL: https://github.com/apache/spark/pull/49908#discussion_r1994682964
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/Mode.scala:
##
@@ -188,7 +188,8 @@ case class Mode(
assert(orderingFilled || (
beliefer commented on code in PR #49908:
URL: https://github.com/apache/spark/pull/49908#discussion_r1994713088
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/Mode.scala:
##
@@ -188,7 +188,8 @@ case class Mode(
assert(orderingFilled || (!
dongjoon-hyun opened a new pull request, #18:
URL: https://github.com/apache/spark-connect-swift/pull/18
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
cloud-fan commented on code in PR #50192:
URL: https://github.com/apache/spark/pull/50192#discussion_r1994663277
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##
@@ -4082,3 +4084,31 @@ object RemoveTempResolvedColumn extends
Rule[LogicalP
HeartSaVioR closed pull request #50168: [SPARK-51397][SS] Fix maintenance pool
shutdown handling issue causing long test times
URL: https://github.com/apache/spark/pull/50168
huaxingao opened a new pull request, #50275:
URL: https://github.com/apache/spark/pull/50275
### What changes were proposed in this pull request?
Since both `commandOptions` and `dsOptions` are `CaseInsensitiveStringMap`
objects, I think we should convert the keys and values to lo
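As background for the comment above, `CaseInsensitiveStringMap` already matches keys case-insensitively; whether values should also be normalized is the separate question raised in the PR. A minimal sketch of the key behavior (the option key and value are made up for illustration):

```scala
import scala.jdk.CollectionConverters._
import org.apache.spark.sql.util.CaseInsensitiveStringMap

object CaseInsensitiveOptionsSketch {
  def main(args: Array[String]): Unit = {
    // Keys are matched case-insensitively, so "PATH", "path", and "PaTh" hit the same entry.
    val options = new CaseInsensitiveStringMap(Map("PATH" -> "/tmp/data").asJava)
    assert(options.get("path") == "/tmp/data")
    assert(options.containsKey("PaTh"))
    // Values are stored as-is; only key matching ignores case.
    println(options.get("PaTh"))
  }
}
```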
LuciferYang commented on PR #50266:
URL: https://github.com/apache/spark/pull/50266#issuecomment-2723363858
Late LGTM, thank you @MaxGekk
cloud-fan closed pull request #50143: [SPARK-51380][SQL] Add visitSQLFunction
and visitAggregateFunction to improve the flexibility of V2ExpressionSQLBuilder
URL: https://github.com/apache/spark/pull/50143
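To give a rough idea of what a `V2ExpressionSQLBuilder` hook looks like, here is a hedged sketch of a dialect-specific subclass overriding a visit method. The method name and `(funcName, inputs)` signature follow the builder's long-standing style and are assumptions; they may not match the overloads this PR actually introduces.

```scala
import org.apache.spark.sql.connector.util.V2ExpressionSQLBuilder

// Hypothetical dialect builder: rewrites one function name and defers everything
// else to the default rendering. Signature assumed, not taken from the merged PR.
class MyDialectSQLBuilder extends V2ExpressionSQLBuilder {
  override def visitSQLFunction(funcName: String, inputs: Array[String]): String =
    funcName match {
      case "CHAR_LENGTH" => s"LENGTH(${inputs.mkString(", ")})"
      case _ => super.visitSQLFunction(funcName, inputs)
    }
}
```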
HeartSaVioR commented on PR #50168:
URL: https://github.com/apache/spark/pull/50168#issuecomment-2723487378
CI has passed. Thanks! Merging to master.