dongjoon-hyun closed pull request #14: [SPARK-51493] Refine `merge_spark_pr.py`
to use `connect-swift-x.y.z` version
URL: https://github.com/apache/spark-connect-swift/pull/14
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go
to the specific comment.
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992404121
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -168,9 +220,99 @@ private class StateStoreCoordinator(ove
dongjoon-hyun commented on code in PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#discussion_r1992468710
##
Tests/SparkConnectTests/DataFrameTests.swift:
##
@@ -81,6 +81,7 @@ struct DataFrameTests {
await spark.stop()
}
+#if !os(Linux)
Review Comment:
dongjoon-hyun opened a new pull request, #15:
URL: https://github.com/apache/spark-connect-swift/pull/15
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
zhengruifeng commented on code in PR #50199:
URL: https://github.com/apache/spark/pull/50199#discussion_r1992347220
##
python/pyspark/ml/classification.py:
##
@@ -909,7 +912,10 @@ def evaluate(self, dataset: DataFrame) ->
"LinearSVCSummary":
if not isinstance(dataset,
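The diff hunk above shows `evaluate` gaining a runtime type check on its `dataset` argument. A minimal sketch of that guard pattern, with a hypothetical stand-in class rather than PySpark's actual `DataFrame`:

```python
class DataFrame:
    """Hypothetical minimal stand-in for pyspark.sql.DataFrame."""

    def __init__(self, rows):
        self.rows = rows


class LinearSVCModelSketch:
    """Hypothetical model exposing an evaluate() with an isinstance guard."""

    def evaluate(self, dataset):
        # Reject anything that is not a DataFrame, mirroring the
        # isinstance check visible in the hunk above.
        if not isinstance(dataset, DataFrame):
            raise TypeError(
                f"dataset must be a DataFrame, got {type(dataset).__name__}")
        return len(dataset.rows)
```

Calling `evaluate` with a plain list then fails fast with a `TypeError` instead of surfacing a confusing error deeper in the evaluation path.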
zhengruifeng commented on code in PR #50199:
URL: https://github.com/apache/spark/pull/50199#discussion_r1992344748
##
python/pyspark/ml/classification.py:
##
@@ -889,7 +889,10 @@ def summary(self) -> "LinearSVCTrainingSummary": # type:
ignore[override]
trained on the
zhengruifeng commented on code in PR #50199:
URL: https://github.com/apache/spark/pull/50199#discussion_r1992343085
##
python/pyspark/ml/util.py:
##
@@ -113,6 +113,11 @@ def invoke_remote_attribute_relation(
methods, obj_ref = _extract_id_methods(instance._java_obj)
me
ebonnal commented on PR #39691:
URL: https://github.com/apache/spark/pull/39691#issuecomment-2719284145
Any plans to finalize this work? The review doesn’t seem to highlight any
major blockers, right? Was it abandoned following an internal discussion?
cc @wangyum 🙏🏻
0xbadidea opened a new pull request, #50247:
URL: https://github.com/apache/spark/pull/50247
### What changes were proposed in this pull request?
Rephrased documentation for configuration.
### Why are the changes needed?
Fixed grammar.
### Does t
aokolnychyi commented on code in PR #50109:
URL: https://github.com/apache/spark/pull/50109#discussion_r1992411229
##
docs/sql-ref-ansi-compliance.md:
##
@@ -648,6 +648,7 @@ Below is a list of all the keywords in Spark SQL.
|PRECEDING|non-reserved|non-reserved|non-reserved|
|P
zhengruifeng opened a new pull request, #50262:
URL: https://github.com/apache/spark/pull/50262
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### Ho
pan3793 commented on PR #50232:
URL: https://github.com/apache/spark/pull/50232#issuecomment-2719619876
@dongjoon-hyun I opened https://github.com/apache/spark/pull/50264 for the 4.0
backport.
pan3793 commented on PR #50222:
URL: https://github.com/apache/spark/pull/50222#issuecomment-2719620776
Closing in favor of SPARK-51449 (https://github.com/apache/spark/pull/50222).
harshmotw-db commented on code in PR #50108:
URL: https://github.com/apache/spark/pull/50108#discussion_r1992384542
##
sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala:
##
@@ -326,7 +326,13 @@ object QueryTest extends Assertions {
// For binary arrays, we conver
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992458658
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -2236,6 +2236,19 @@ object SQLConf {
.booleanConf
.createWithDefault(t
dongjoon-hyun commented on code in PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#discussion_r1992468989
##
Tests/SparkConnectTests/RuntimeConfTests.swift:
##
@@ -31,7 +31,7 @@ struct RuntimeConfTests {
_ = try await client.connect(UUID().uuidString)
dongjoon-hyun commented on code in PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#discussion_r1992469308
##
Tests/SparkConnectTests/SparkSessionTests.swift:
##
@@ -56,7 +56,6 @@ struct SparkSessionTests {
@Test
func conf() async throws {
let spark
github-actions[bot] commented on PR #45802:
URL: https://github.com/apache/spark/pull/45802#issuecomment-2719432206
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #49026:
URL: https://github.com/apache/spark/pull/49026#issuecomment-2719432121
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #49019:
URL: https://github.com/apache/spark/pull/49019#issuecomment-2719432155
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #48967:
URL: https://github.com/apache/spark/pull/48967#issuecomment-2719432174
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
viirya commented on code in PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#discussion_r1992473785
##
Tests/SparkConnectTests/DataFrameTests.swift:
##
@@ -81,6 +81,7 @@ struct DataFrameTests {
await spark.stop()
}
+#if !os(Linux)
Review Comment:
dongjoon-hyun closed pull request #50259: [SPARK-43221][CORE][4.0] Host local
block fetching should use a block status of a block stored on disk
URL: https://github.com/apache/spark/pull/50259
dongjoon-hyun commented on code in PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#discussion_r1992475005
##
Tests/SparkConnectTests/DataFrameTests.swift:
##
@@ -81,6 +81,7 @@ struct DataFrameTests {
await spark.stop()
}
+#if !os(Linux)
Review Comment:
dongjoon-hyun commented on PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#issuecomment-2719437716
Thank you, @viirya . Merged to main.
github-actions[bot] closed pull request #49020: [WIP][SPARK-][Collation]
Prevent Regex with collated strings
URL: https://github.com/apache/spark/pull/49020
dongjoon-hyun commented on PR #15:
URL:
https://github.com/apache/spark-connect-swift/pull/15#issuecomment-2719427616
Could you review this when you have some time, @viirya ?
dongjoon-hyun commented on PR #13:
URL:
https://github.com/apache/spark-connect-swift/pull/13#issuecomment-2718738366
Could you review this when you have some time, please, @huaxingao ?
ahshahid opened a new pull request, #50263:
URL: https://github.com/apache/spark/pull/50263
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How wa
tedyu commented on code in PR #50020:
URL: https://github.com/apache/spark/pull/50020#discussion_r1992516501
##
core/src/main/scala/org/apache/spark/BarrierCoordinator.scala:
##
@@ -134,11 +138,8 @@ private[spark] class BarrierCoordinator(
// Cancel the current active Tim
huaxingao commented on code in PR #50246:
URL: https://github.com/apache/spark/pull/50246#discussion_r1992558449
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/RewriteRowLevelCommand.scala:
##
@@ -273,9 +273,8 @@ trait RewriteRowLevelCommand extends Rule[L
pan3793 closed pull request #50222: [SPARK-51449][BUILD] Restore
hive-llap-common to compile scope
URL: https://github.com/apache/spark/pull/50222
beliefer commented on PR #50223:
URL: https://github.com/apache/spark/pull/50223#issuecomment-2719643055
@jjayadeep06 I'm sorry for the suggestion at
https://github.com/apache/spark/pull/50020#issuecomment-2705780937
I want you to create a backport PR for branch-3.5, since I read the issue w
beliefer closed pull request #50245: [SPARK-48922][SQL] Avoid redundant array
transform of identical expression for map type
URL: https://github.com/apache/spark/pull/50245
beliefer commented on PR #50245:
URL: https://github.com/apache/spark/pull/50245#issuecomment-2719714521
@wForget Could you create a backport PR for branch-3.5?
wForget opened a new pull request, #50265:
URL: https://github.com/apache/spark/pull/50265
### What changes were proposed in this pull request?
Similar to #47843, this patch avoids ArrayTransform in `resolveMapType`
function if the resolution expression is the same as input param.
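The described optimization — skip the element-wise `ArrayTransform` when the resolution expression is just the input parameter — can be sketched outside Catalyst with a hypothetical Python helper (an analogy, not Spark's Scala implementation):

```python
def identity(x):
    """The no-op resolver, analogous to a lambda that returns its variable."""
    return x


def resolve_map_values(values, resolve):
    # When the resolver is literally the identity, rebuilding every
    # element would be redundant, so return the input untouched --
    # the analogue of dropping ArrayTransform in resolveMapType.
    if resolve is identity:
        return values
    return [resolve(v) for v in values]
```

With the identity resolver the original list is returned as-is (no copy, no per-element call); with any other resolver each element is transformed as before.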
LuciferYang commented on PR #50251:
URL: https://github.com/apache/spark/pull/50251#issuecomment-2718510472
Merged into master. Thanks @MaxGekk
beliefer commented on code in PR #49961:
URL: https://github.com/apache/spark/pull/49961#discussion_r1992622771
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/python/PythonDataSourceV2.scala:
##
@@ -52,6 +52,11 @@ class PythonDataSourceV2 extends TableP
beliefer commented on PR #49961:
URL: https://github.com/apache/spark/pull/49961#issuecomment-2719735902
cc @cloud-fan @allisonwang-db
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992483531
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -66,9 +86,9 @@ object StateStoreCoordinatorRef extends Lo
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992486033
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -168,9 +220,99 @@ private class StateStoreCoordinator(ove
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992490417
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -168,9 +220,99 @@ private class StateStoreCoordinator(ove
dongjoon-hyun closed pull request #50261: [SPARK-51494][BUILD] Upgrade to
Apache parent pom 33
URL: https://github.com/apache/spark/pull/50261
dependabot[bot] opened a new pull request, #130:
URL: https://github.com/apache/spark-connect-go/pull/130
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.34.0 to
0.36.0.
Commits:
https://github.com/golang/net/commit/85d1d54551b68719346cb9fec24b911da4e452a1
beliefer commented on code in PR #49961:
URL: https://github.com/apache/spark/pull/49961#discussion_r1985896731
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/python/PythonScanBuilder.scala:
##
@@ -25,6 +27,40 @@ class PythonScanBuilder(
ds: Python
jjayadeep06 commented on PR #50223:
URL: https://github.com/apache/spark/pull/50223#issuecomment-2719772362
> @jjayadeep06 I'm sorry for the suggestion at [#50020
(comment)](https://github.com/apache/spark/pull/50020#issuecomment-2705780937)
I want you to create a backport PR for branch-3.5, s
siying commented on PR #50257:
URL: https://github.com/apache/spark/pull/50257#issuecomment-2719796281
This seems to be lazily evaluated, so it's not a problem.
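The lazy-evaluation point can be illustrated with a hypothetical logging helper: when the message is supplied as a zero-argument function (like a Scala by-name parameter), the expensive concatenation only runs if the message is actually emitted — an analogy, not the FileStreamSource code:

```python
calls = []


def expensive_message() -> str:
    calls.append(1)  # record that the expensive build actually ran
    return "batch: " + ", ".join(str(i) for i in range(1000))


def log_if(enabled: bool, make_message):
    # make_message is only invoked when enabled, so the string
    # concatenation is skipped entirely on the fast path.
    return make_message() if enabled else None


log_if(False, expensive_message)  # disabled: expensive_message never runs
```

After the disabled call, `calls` is still empty; the cost is only paid when the log line is really needed.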
siying closed pull request #50257: [SPARK-51492][SS]FileStreamSource: Avoid
expensive string concatenation if not needed
URL: https://github.com/apache/spark/pull/50257
yaooqinn commented on PR #9:
URL:
https://github.com/apache/spark-connect-swift/pull/9#issuecomment-2719566864
Thank you @dongjoon-hyun
attilapiros commented on PR #50260:
URL: https://github.com/apache/spark/pull/50260#issuecomment-2719656434
Fixed. It was a difference between Scala 2.12 and 2.13.
`Option#zip` gives back an `Iterable` on Scala 2.12 instead of an
`Option`:
```
$ scala
Welcome to Scala 2.12.8

scala> Some(1).zip(Some(2))
res0: Iterable[(Int, Int)] = List((1,2))
```
Nicolas-Parot-Alvarez-Paidy commented on PR #45593:
URL: https://github.com/apache/spark/pull/45593#issuecomment-2719763453
I hope this will be reconsidered.
DuckDB and Snowflake, as well as the latest versions of Java, Python, and
Scala, all support it.
It clearly is a movement in the so
jayadeep-jayaraman commented on code in PR #50020:
URL: https://github.com/apache/spark/pull/50020#discussion_r1992757166
##
core/src/main/scala/org/apache/spark/BarrierCoordinator.scala:
##
@@ -134,11 +138,8 @@ private[spark] class BarrierCoordinator(
// Cancel the curre
cnauroth commented on PR #50261:
URL: https://github.com/apache/spark/pull/50261#issuecomment-2719935801
I appreciate it. Thank you, @dongjoon-hyun !
kazuyukitanimura commented on PR #50245:
URL: https://github.com/apache/spark/pull/50245#issuecomment-2719938188
Thank you @wForget, late LGTM.
beliefer commented on PR #50235:
URL: https://github.com/apache/spark/pull/50235#issuecomment-2719934092
@dongjoon-hyun To be honest, it has extremely low performance overhead.
Let me update the PR's description.
I thought it would be good to cache the string value.
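The caching idea mentioned above can be sketched with `functools.cached_property`: build the string representation once and reuse it on later accesses, instead of recomputing it (the equivalent of calling `toString` every time). The class below is hypothetical, not Spark's `MapKeyDedupPolicy`:

```python
from functools import cached_property


class DedupPolicySketch:
    """Hypothetical policy object whose string form is computed once."""

    def __init__(self, name: str):
        self.name = name
        self.computations = 0  # counts how often the string is built

    @cached_property
    def as_string(self) -> str:
        # Runs only on first access; later reads hit the cached value.
        self.computations += 1
        return self.name.upper()
```

Accessing `as_string` repeatedly performs the computation exactly once, which is the effect of caching the `toString` result.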
beliefer closed pull request #50235: [SPARK-51469][SQL] Improve
MapKeyDedupPolicy so that avoid calling toString
URL: https://github.com/apache/spark/pull/50235
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992793749
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -168,9 +220,99 @@ private class StateStoreCoordinator(ove
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992795230
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -168,9 +220,99 @@ private class StateStoreCoordinator(ove
kazuyukitanimura commented on PR #50265:
URL: https://github.com/apache/spark/pull/50265#issuecomment-2719960307
LGTM. Thank you @wForget.
zecookiez commented on code in PR #50123:
URL: https://github.com/apache/spark/pull/50123#discussion_r1992390682
##
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/StateStoreCoordinator.scala:
##
@@ -168,9 +220,99 @@ private class StateStoreCoordinator(ove