+1

Thanks Yuming ~

From: Hyukjin Kwon <gurwls...@gmail.com>
Date: Tuesday, September 13, 2022, 08:19
To: Gengliang Wang <ltn...@gmail.com>
Cc: "L. C. Hsieh" <vii...@gmail.com>, Dongjoon Hyun <dongjoon.h...@gmail.com>, 
Yuming Wang <wgy...@gmail.com>, dev <dev@spark.apache.org>
Subject: Re: Time for Spark 3.3.1 release?

+1

On Tue, 13 Sept 2022 at 06:45, Gengliang Wang 
<ltn...@gmail.com> wrote:
+1.
Thank you, Yuming!

On Mon, Sep 12, 2022 at 12:10 PM L. C. Hsieh 
<vii...@gmail.com> wrote:
+1

Thanks Yuming!

On Mon, Sep 12, 2022 at 11:50 AM Dongjoon Hyun 
<dongjoon.h...@gmail.com> wrote:
>
> +1
>
> Thanks,
> Dongjoon.
>
> On Mon, Sep 12, 2022 at 6:38 AM Yuming Wang 
> <wgy...@gmail.com> wrote:
>>
>> Hi, All.
>>
>>
>>
>> Since the Apache Spark 3.3.0 tag was created (Jun 10), 138 new patches, 
>> including 7 correctness patches, have arrived at branch-3.3.
>>
>>
>>
>> Shall we make a new release, Apache Spark 3.3.1, as the second release on 
>> branch-3.3? I'd like to volunteer as the release manager for Apache Spark 
>> 3.3.1.
>>
>>
>>
>> All changes:
>>
>> https://github.com/apache/spark/compare/v3.3.0...branch-3.3
>>
>>
>>
>> Correctness issues:
>>
>> SPARK-40149: Propagate metadata columns through Project
>>
>> SPARK-40002: Don't push down limit through window using ntile
>>
>> SPARK-39976: ArrayIntersect should handle null in left expression correctly
>>
>> SPARK-39833: Disable Parquet column index in DSv1 to fix a correctness issue 
>> in the case of overlapping partition and data columns
>>
>> SPARK-39061: Set nullable correctly for Inline output attributes
>>
>> SPARK-39887: RemoveRedundantAliases should keep aliases that make the output 
>> of projection nodes unique
>>
>> SPARK-38614: Don't push down limit through window that's using percent_rank

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
