beliefer commented on code in PR #50107:
URL: https://github.com/apache/spark/pull/50107#discussion_r1976295039
##
core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala:
##
@@ -169,6 +171,7 @@ private[spark] class TaskSchedulerImpl(
protected val executorIdTo
beliefer commented on PR #49453:
URL: https://github.com/apache/spark/pull/49453#issuecomment-2692142755
> I assume features like this should go through an RFC procedure. I will try
to figure it out myself. Meanwhile, I would really appreciate it if you could
give me some hints of previous work si
beliefer commented on code in PR #49961:
URL: https://github.com/apache/spark/pull/49961#discussion_r1976389689
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/python/PythonScanBuilder.scala:
##
@@ -25,6 +27,40 @@ class PythonScanBuilder(
ds: Python
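For context on the hunk above: PythonScanBuilder is a DSv2 ScanBuilder, and the natural hook for a change of this shape is the SupportsPushDownFilters interface. Below is a minimal sketch of that interface only, not the PR's actual code; the class name, constructor, and the choice to support only EqualTo are illustrative assumptions.
```scala
import org.apache.spark.sql.connector.read.{Scan, ScanBuilder, SupportsPushDownFilters}
import org.apache.spark.sql.sources.{EqualTo, Filter}

// Sketch: split incoming filters into ones the source claims (pushed) and
// ones Spark must still evaluate (the returned residuals). Supporting only
// EqualTo is an arbitrary choice for this example.
class SketchScanBuilder(mkScan: Array[Filter] => Scan)
  extends ScanBuilder with SupportsPushDownFilters {

  private var pushed: Array[Filter] = Array.empty

  override def pushFilters(filters: Array[Filter]): Array[Filter] = {
    val (supported, residual) = filters.partition(_.isInstanceOf[EqualTo])
    pushed = supported
    residual // Spark re-applies these on top of the scan output
  }

  override def pushedFilters(): Array[Filter] = pushed

  override def build(): Scan = mkScan(pushed)
}
```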
sunxiaoguang commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1976368257
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala:
##
@@ -241,6 +241,84 @@ class MySQLIntegrationSuite
asfgit closed pull request #50023: [SPARK-49960][SQL] Custom ExpressionEncoder
support and TransformingEncoder fixes
URL: https://github.com/apache/spark/pull/50023
beliefer commented on code in PR #50020:
URL: https://github.com/apache/spark/pull/50020#discussion_r1976381083
##
core/src/main/scala/org/apache/spark/BarrierCoordinator.scala:
##
@@ -173,7 +192,8 @@ private[spark] class BarrierCoordinator(
// we may timeout for the sy
beliefer commented on code in PR #50113:
URL: https://github.com/apache/spark/pull/50113#discussion_r1976385073
##
resource-managers/kubernetes/integration-tests/README.md:
##
@@ -199,9 +199,9 @@ to the wrapper scripts and using the wrapper scripts will
simply set these appro
beliefer commented on code in PR #50113:
URL: https://github.com/apache/spark/pull/50113#discussion_r1976382408
##
project/SparkBuild.scala:
##
@@ -1002,10 +1002,9 @@ object KubernetesIntegrationTests {
if (excludeTags.exists(_.equalsIgnoreCase("r"))) {
rDock
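The quoted predicate gates an optional docker image build on the exclude-tag list. A minimal sketch of that pattern in plain Scala; only `excludeTags` and the `equalsIgnoreCase("r")` check come from the hunk, the property name and Dockerfile path are assumptions.
```scala
// Sketch: skip building the optional SparkR image when its test tag is excluded.
val excludeTags: Seq[String] = sys.props
  .get("test.exclude.tags")            // assumed property name
  .map(_.split(",").toSeq)
  .getOrElse(Seq.empty)

// Matches the predicate visible in the hunk: the "r" tag disables the R image.
val buildRImage: Boolean = !excludeTags.exists(_.equalsIgnoreCase("r"))

val extraDockerArgs: Seq[String] =
  if (buildRImage) Seq("-R", "path/to/R/Dockerfile") else Seq.empty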
beliefer commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1976385645
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala:
##
@@ -241,6 +241,84 @@ class MySQLIntegrationSuite exte
beliefer commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1976386495
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala:
##
@@ -241,6 +241,84 @@ class MySQLIntegrationSuite exte
sunxiaoguang commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1976402306
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala:
##
@@ -241,6 +241,84 @@ class MySQLIntegrationSuite
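The ~84 added lines under review are not visible in the digest. For orientation, a minimal sketch of the test style these docker JDBC suites use; `spark`, `jdbcUrl`, and the `test(...)` harness are assumed to come from the suite's base classes, and the query and assertion are illustrative only.
```scala
// Sketch: round-trip a trivial query through the containerized MySQL instance.
test("illustrative MySQL round trip") {
  val df = spark.read
    .format("jdbc")
    .option("url", jdbcUrl)                 // container URL from the harness
    .option("query", "SELECT 'spark' AS c") // no table setup needed here
    .load()
  assert(df.collect().map(_.getString(0)) === Array("spark"))
}
```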
wangyum commented on code in PR #47998:
URL: https://github.com/apache/spark/pull/47998#discussion_r1976206065
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala:
##
@@ -415,8 +415,11 @@ private[client] class Shim_v2_0 extends Shim with Logging {
t
sunxiaoguang commented on PR #49453:
URL: https://github.com/apache/spark/pull/49453#issuecomment-2692188145
> > I assume features like this should go through an RFC procedure. I will
try to figure it out myself. Meanwhile, I would really appreciate it if you
could give me some hints of previous
beliefer commented on code in PR #50107:
URL: https://github.com/apache/spark/pull/50107#discussion_r1976406171
##
core/src/main/scala/org/apache/spark/ui/ConsoleProgressBar.scala:
##
@@ -121,5 +121,8 @@ private[spark] class ConsoleProgressBar(sc: SparkContext)
extends Logging
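The hunk sits near the end of ConsoleProgressBar.scala, where the bar is redrawn and torn down. A minimal sketch of the general pattern that file implements, with illustrative names rather than Spark's exact members: a daemon timer repaints one terminal line, and stop() cancels the timer and wipes the line.
```scala
import java.util.{Timer, TimerTask}

class ProgressBarSketch {
  private val timer = new Timer("refresh progress", true)
  @volatile private var lastLineLength = 0

  timer.schedule(new TimerTask {
    override def run(): Unit = redraw("[=====>     ] 5/12 tasks")
  }, 200L, 200L)

  private def redraw(bar: String): Unit = synchronized {
    // \r returns to column 0 so the next draw overwrites the previous one.
    System.err.print("\r" + bar)
    lastLineLength = bar.length
  }

  def stop(): Unit = synchronized {
    timer.cancel()
    // Blank out whatever is still on the line before giving the console back.
    System.err.print("\r" + " " * lastLineLength + "\r")
  }
}
```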
ericm-db commented on code in PR #49277:
URL: https://github.com/apache/spark/pull/49277#discussion_r1975820792
##
python/pyspark/sql/tests/pandas/test_pandas_transform_with_state.py:
##
@@ -1294,6 +1310,167 @@ def
test_transform_with_state_with_timers_single_partition(self):
attilapiros commented on PR #50122:
URL: https://github.com/apache/spark/pull/50122#issuecomment-2692480193
cc @mridulm, @Ngone51
HyukjinKwon commented on code in PR #50099:
URL: https://github.com/apache/spark/pull/50099#discussion_r1976523920
##
python/pyspark/sql/pandas/serializers.py:
##
@@ -175,6 +178,16 @@ def wrap_and_init_stream():
return super(ArrowStreamUDFSerializer,
self).dump_stream(
github-actions[bot] closed pull request #48108: [SPARK-49644][SQL] Support drop
multi-level partition V2 table with partial partition spec
URL: https://github.com/apache/spark/pull/48108
github-actions[bot] closed pull request #48871: [SPARK-42856][GRAPHX] Break tie
with highest vertex id in label propagation
URL: https://github.com/apache/spark/pull/48871
github-actions[bot] commented on PR #48811:
URL: https://github.com/apache/spark/pull/48811#issuecomment-2692487092
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
eason-yuchen-liu opened a new pull request, #50124:
URL: https://github.com/apache/spark/pull/50124
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
##
sunxiaoguang commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1976547616
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala:
##
@@ -241,6 +241,84 @@ class MySQLIntegrationSuite
HyukjinKwon closed pull request #50116: [MINOR][DOCS] Fix code comments for
Executor
URL: https://github.com/apache/spark/pull/50116
HyukjinKwon commented on PR #50116:
URL: https://github.com/apache/spark/pull/50116#issuecomment-2692512041
Merged to master.
HyukjinKwon closed pull request #50120: [MINOR][TESTS][CONNECT] Fix the
teardown function of test_connect_function.py
URL: https://github.com/apache/spark/pull/50120
HyukjinKwon commented on PR #50120:
URL: https://github.com/apache/spark/pull/50120#issuecomment-2692514862
Merged to master and branch-4.0.
szehon-ho commented on PR #50109:
URL: https://github.com/apache/spark/pull/50109#issuecomment-2692426656
@cloud-fan @aokolnychyi could you take a look when you have a chance? Thanks!
beliefer commented on code in PR #50020:
URL: https://github.com/apache/spark/pull/50020#discussion_r1976539087
##
core/src/main/scala/org/apache/spark/BarrierCoordinator.scala:
##
@@ -122,23 +124,40 @@ private[spark] class BarrierCoordinator(
// Init a TimerTask for a barr
beliefer commented on code in PR #50020:
URL: https://github.com/apache/spark/pull/50020#discussion_r1976539267
##
core/src/main/scala/org/apache/spark/BarrierCoordinator.scala:
##
@@ -122,23 +124,40 @@ private[spark] class BarrierCoordinator(
// Init a TimerTask for a barr
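Both BarrierCoordinator hunks concern the timer that fails a barrier sync when not every task checks in ("we may timeout for the sy..." / "Init a TimerTask for a barr..."). A minimal sketch of that mechanism; the class, its names, and the choice of a ScheduledExecutorService are illustrative, not the PR's actual code.
```scala
import java.util.concurrent.{Executors, ScheduledFuture, TimeUnit}

// Sketch: when the first task of a barrier() call arrives, schedule a timeout
// action; if all tasks sync in time, cancel it before it fires.
class BarrierTimeoutSketch(timeoutSecs: Long) {
  private val scheduler = Executors.newSingleThreadScheduledExecutor()
  private var pendingTimeout: Option[ScheduledFuture[_]] = None

  def onFirstSyncRequest(epoch: Int)(onTimeout: Int => Unit): Unit = synchronized {
    val task: Runnable = () => onTimeout(epoch)
    pendingTimeout = Some(scheduler.schedule(task, timeoutSecs, TimeUnit.SECONDS))
  }

  def onAllTasksSynced(): Unit = synchronized {
    pendingTimeout.foreach(_.cancel(false)) // sync succeeded; drop the timeout
    pendingTimeout = None
  }
}
```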
beliefer commented on code in PR #49453:
URL: https://github.com/apache/spark/pull/49453#discussion_r1976539546
##
connector/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/v2/MySQLIntegrationSuite.scala:
##
@@ -241,6 +241,84 @@ class MySQLIntegrationSuite exte
beliefer commented on PR #49453:
URL: https://github.com/apache/spark/pull/49453#issuecomment-2692553915
> Thanks for the hint. Another question: do we need to write a complete
design document and go through a design review process before actually
writing the implementation?