manuzhang opened a new issue, #2702:
URL: https://github.com/apache/datafusion-comet/issues/2702

   ### Describe the bug
   
   The following test failed in https://github.com/apache/datafusion-comet/actions/runs/19120575376/job/54639992128:
   ```
   2025-11-06T01:01:11.4965915Z [info] AdaptiveQueryExecSuite:
   ....
   2025-11-06T01:01:34.8627570Z [info] - SPARK-37753: Inhibit broadcast in left outer join when there are many empty partitions on outer/left side *** FAILED *** (15 seconds, 225 milliseconds)
   2025-11-06T01:01:34.8638601Z [info]   The code passed to eventually never returned normally. Attempted 20 times over 15.22182114 seconds. Last failure message: List(BroadcastHashJoin [key#80702], [a#80714], LeftOuter, BuildRight, false
   2025-11-06T01:01:34.8640018Z [info]   :- CometSinkPlaceHolder [key#80702, value#80703]
   2025-11-06T01:01:34.8640495Z [info]   :  +- ShuffleQueryStage 0
   2025-11-06T01:01:34.8641401Z [info]   :     +- CometColumnarExchange hashpartitioning(key#80702, 5), ENSURE_REQUIREMENTS, CometColumnarShuffle, [plan_id=461654]
   2025-11-06T01:01:34.8642278Z [info]   :        +- *(1) Filter (isnotnull(value#80703) AND (value#80703 = 1))
   2025-11-06T01:01:34.8644092Z [info]   :           +- *(1) SerializeFromObject [invoke(knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).key()) AS key#80702, static_invoke(UTF8String.fromString(invoke(knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData, true])).value()))) AS value#80703]
   2025-11-06T01:01:34.8645828Z [info]   :              +- Scan[obj#80699]
   2025-11-06T01:01:34.8646273Z [info]   +- CometSinkPlaceHolder [a#80714, b#80715]
   2025-11-06T01:01:34.8646755Z [info]      +- BroadcastQueryStage 2
   2025-11-06T01:01:34.8647548Z [info]         +- CometBroadcastExchange [a#80714, b#80715]
   2025-11-06T01:01:34.8648006Z [info]            +- AQEShuffleRead local
   2025-11-06T01:01:34.8648778Z [info]               +- ShuffleQueryStage 1
   2025-11-06T01:01:34.8649677Z [info]                  +- CometColumnarExchange hashpartitioning(a#80714, 5), ENSURE_REQUIREMENTS, CometColumnarShuffle, [plan_id=461676]
   2025-11-06T01:01:34.8651664Z [info]                     +- *(2) SerializeFromObject [invoke(knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData2, true])).a()) AS a#80714, invoke(knownnotnull(assertnotnull(input[0, org.apache.spark.sql.test.SQLTestData$TestData2, true])).b()) AS b#80715]
   2025-11-06T01:01:34.8653541Z [info]                        +- Scan[obj#80711]
   2025-11-06T01:01:34.8654081Z [info]   ) was not empty. (AdaptiveQueryExecSuite.scala:775)
   2025-11-06T01:01:34.8654772Z [info]   org.scalatest.exceptions.TestFailedDueToTimeoutException:
   ```
   
   It's a flaky test on the Spark side and has been fixed in https://github.com/apache/spark/pull/52388, but since we only test against Spark 4.0, the test might fail again.
   
   ### Steps to reproduce
   
   _No response_
   
   ### Expected behavior
   
   _No response_
   
   ### Additional context
   
   _No response_

