Thank you, Paddy.

Dongjoon.



On Mon, Jan 20, 2025 at 2:32 AM Paddy Xu <xupa...@gmail.com> wrote:

> I have worked on the tests related to “interrupt”; I am not sure about
> SPARK-50888.
>
> My findings:
> 1. These test failures only occur in the GitHub CI.
> 2. The failures happen because the thread pool we create in CI has only two
> threads, while these tests require three threads running concurrently (see
> the sketch below).
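>
> To illustrate the hang (a simplified, self-contained sketch, not the actual
> suite code; the class and task names are made up): three tasks that must all
> be running at once starve one another on a two-thread pool:
>
>   import java.util.concurrent.{CountDownLatch, Executors, TimeUnit}
>
>   object ThreadStarvationSketch {
>     def main(args: Array[String]): Unit = {
>       val pool = Executors.newFixedThreadPool(2) // the CI-sized pool
>       val allStarted = new CountDownLatch(3)     // the test wants 3 concurrent tasks
>       (1 to 3).foreach { i =>
>         pool.submit(new Runnable {
>           def run(): Unit = {
>             allStarted.countDown()
>             allStarted.await() // block until all three are running
>             println(s"task $i proceeding")
>           }
>         })
>       }
>       // Tasks 1 and 2 occupy both threads and block in await(); task 3
>       // never starts, so the latch never opens and the check times out.
>       val ok = allStarted.await(2, TimeUnit.SECONDS)
>       println(s"all three started concurrently: $ok") // prints: false
>       pool.shutdownNow() // interrupt the blocked tasks so the JVM can exit
>     }
>   }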
>
> To work around this issue, we could split these tests into smaller ones that
> each require only two threads; a second sketch below shows the idea.
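>
> Again just a sketch under the same assumptions (made-up names, not the real
> tests): the same latch check split so that each piece needs only two
> concurrent tasks, which a two-thread pool can satisfy when the pieces run
> one after another:
>
>   import java.util.concurrent.{CountDownLatch, ExecutorService, Executors, TimeUnit}
>
>   object SplitSketch {
>     // One smaller check: only two tasks must overlap, so two threads suffice.
>     def runPair(pool: ExecutorService): Boolean = {
>       val bothStarted = new CountDownLatch(2)
>       (1 to 2).foreach { _ =>
>         pool.submit(new Runnable {
>           def run(): Unit = { bothStarted.countDown(); bothStarted.await() }
>         })
>       }
>       bothStarted.await(2, TimeUnit.SECONDS)
>     }
>
>     def main(args: Array[String]): Unit = {
>       val pool = Executors.newFixedThreadPool(2)
>       println(runPair(pool)) // true
>       println(runPair(pool)) // true: the two checks run sequentially
>       pool.shutdown()
>     }
>   }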
> I’ll see if there’s a volunteer. Otherwise I could take a stab at fixing
> them.
>
> Cheers,
> Paddy
>
> On 2025/01/19 07:52:00 Dongjoon Hyun wrote:
> > Hi, All.
> >
> > This is a kind of heads-up as a part of Apache Spark 4.0.0 preparation.
> >
> > https://issues.apache.org/jira/browse/SPARK-44111
> > (Prepare Apache Spark 4.0.0)
> >
> > It would be great if we are able to fix long-standing `Spark Connect`
> > test flakiness together during the QA period (2025-02-01 ~) in order to
> > make sure that we didn't miss any bugs here.
> >
> > SPARK-48139: Re-enable `SparkSessionE2ESuite.interrupt tag`
> > SPARK-50205: Re-enable
> > `SparkSessionJobTaggingAndCancellationSuite.Cancellation APIs in
> > SparkSession are isolated`
> > SPARK-50748: Fix a flaky test: `SparkSessionE2ESuite.interrupt all -
> > background queries, foreground interrupt`
> > SPARK-50888: Fix Flaky Test: `SparkConnectServiceSuite.SPARK-44776:
> > LocalTableScanExe`
> > SPARK-50889: Fix Flaky Test: `SparkSessionE2ESuite.interrupt operation`
> > (Hang)
> >
> > Since many people consider the `Spark Connect` module important,
> > some of these JIRA issues are currently marked as `Blocker` priority for
> > 4.0.0. So, I hope we can re-enable these tests and make them stable, to
> > be safe.
> >
> > However, we can also simply remove the `Blocker` status before RC1
> > (2025-02-15) and ignore all of those test cases if the `Spark Connect`
> > module experts are sure that there are no `Spark Connect` functional
> > issues behind them. In this case, the above issues will be re-scoped to
> > Spark 4.1.0 with `Minor` priority.
> >
> > Sincerely,
> > Dongjoon.
> >
>
