Time for Spark 3.3.1 release?

2022-09-12 Thread Yuming Wang
Hi, All.



Since the Apache Spark 3.3.0 tag was created (Jun 10), 138 new patches, including 7
correctness fixes, have arrived at branch-3.3.



Shall we make a new release, Apache Spark 3.3.1, as the second release on
branch-3.3? I'd like to volunteer as the release manager for Apache Spark 3.3.1.



All changes:

https://github.com/apache/spark/compare/v3.3.0...branch-3.3



Correctness issues:

SPARK-40149: Propagate metadata columns through Project

SPARK-40002: Don't push down limit through window using ntile

SPARK-39976: ArrayIntersect should handle null in left expression correctly

SPARK-39833: Disable Parquet column index in DSv1 to fix a correctness
issue in the case of overlapping partition and data columns

SPARK-39061: Set nullable correctly for Inline output attributes

SPARK-39887: RemoveRedundantAliases should keep aliases that make the
output of projection nodes unique

SPARK-38614: Don't push down limit through window that's using percent_rank
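For context, SPARK-40002 and SPARK-38614 both concern pushing a limit below a window operator. A minimal Spark SQL sketch of the affected query shape (the table `t` and its `id` column are hypothetical, for illustration only):

```sql
-- Hypothetical example of the query shape addressed by SPARK-40002/SPARK-38614.
-- ntile() and percent_rank() depend on the total row count of the window
-- partition, so applying the LIMIT before the window would compute buckets
-- and ranks over a truncated row set, producing incorrect results.
SELECT *
FROM (
  SELECT id,
         ntile(4)       OVER (ORDER BY id) AS bucket,
         percent_rank() OVER (ORDER BY id) AS pr
  FROM t
) w
LIMIT 10;
```

Because both functions are sensitive to partition size, the fixes disable the limit pushdown for windows that use them.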


Re: Time for Spark 3.3.1 release?

2022-09-12 Thread Dongjoon Hyun
+1

Thanks,
Dongjoon.


Re: Time for Spark 3.3.1 release?

2022-09-12 Thread L. C. Hsieh
+1

Thanks Yuming!

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Time for Spark 3.3.1 release?

2022-09-12 Thread Gengliang Wang
+1.
Thank you, Yuming!


Re: Time for Spark 3.3.1 release?

2022-09-12 Thread Hyukjin Kwon
+1



Re: Time for Spark 3.3.1 release?

2022-09-12 Thread Yang,Jie(INF)
+1

Thanks Yuming ~



Re: Time for Spark 3.3.1 release?

2022-09-12 Thread John Zhuge
+1

> --
John Zhuge