[ 
https://issues.apache.org/jira/browse/SPARK-48163?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-48163:
----------------------------------
    Description: 
This is a long-standing flaky test that has failed intermittently from early 2024 to the present.
- https://github.com/apache/spark/actions/runs/12882534288/job/35914995457 (2025-01-21)

{code}
- SPARK-43923: commands send events ((get_resources_command {
[info] }
[info] ,None)) *** FAILED *** (35 milliseconds)
[info]   VerifyEvents.this.listener.executeHolder.isDefined was false (SparkConnectServiceSuite.scala:873)
{code}

  was:
{code}
- SPARK-43923: commands send events ((get_resources_command {
[info] }
[info] ,None)) *** FAILED *** (35 milliseconds)
[info]   VerifyEvents.this.listener.executeHolder.isDefined was false (SparkConnectServiceSuite.scala:873)
{code}


> Fix Flaky Test: `SparkConnectServiceSuite.SPARK-43923: commands send events - get_resources_command`
> ----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-48163
>                 URL: https://issues.apache.org/jira/browse/SPARK-48163
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Tests
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>              Labels: pull-request-available
>
> This is a long-standing flaky test that has failed intermittently from early 2024 to the present.
> - https://github.com/apache/spark/actions/runs/12882534288/job/35914995457 (2025-01-21)
> {code}
> - SPARK-43923: commands send events ((get_resources_command {
> [info] }
> [info] ,None)) *** FAILED *** (35 milliseconds)
> [info]   VerifyEvents.this.listener.executeHolder.isDefined was false (SparkConnectServiceSuite.scala:873)
> {code}
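
The failing assertion checks `listener.executeHolder.isDefined` immediately after the command is sent, so a listener that has not yet observed the ExecuteHolder will intermittently fail the check. Below is a minimal, self-contained sketch of one common way such a check is hardened, polling with ScalaTest's Eventually instead of asserting once; the `executeHolder` stand-in, the background thread, and the timing values are illustrative assumptions, not the suite's actual code or the fix in the linked run.

{code}
import java.util.concurrent.atomic.AtomicReference
import org.scalatest.concurrent.Eventually._
import org.scalatest.time.SpanSugar._

object FlakyCheckSketch extends App {
  // Stand-in for the suite's listener state: populated asynchronously some time
  // after the command is sent, so an immediate isDefined check can observe None.
  val executeHolder = new AtomicReference[Option[String]](None)

  new Thread(() => {
    Thread.sleep(200) // simulated delay before the listener records the ExecuteHolder
    executeHolder.set(Some("holder"))
  }).start()

  // Poll until the state is populated or the timeout expires, instead of
  // asserting once right after the command is sent (the flaky pattern above).
  eventually(timeout(10.seconds), interval(50.milliseconds)) {
    assert(executeHolder.get().isDefined, "executeHolder should eventually be defined")
  }
  println("executeHolder observed: " + executeHolder.get())
}
{code}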



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
