+1 (non-binding), pending the discussion on the CVEs (not the performance
improvements) in the Apache Parquet version currently used.

On Tue, May 27, 2025 at 11:19 AM L. C. Hsieh <vii...@gmail.com> wrote:

> +1
>
> On Mon, May 26, 2025 at 6:51 PM Wenchen Fan <cloud0...@gmail.com> wrote:
> >
> > +1. When this release is out, let's also update the release process
> > document to introduce the new way of making releases with GitHub Actions
> > jobs.
> >
> > On Tue, May 27, 2025 at 6:22 AM Dongjoon Hyun <dongj...@apache.org>
> wrote:
> >>
> >> +1 from my side.
> >>
> >> Thank you, Hyukjin.
> >>
> >> Dongjoon
> >>
> >> On 2025/05/26 22:19:22 Hyukjin Kwon wrote:
> >> > Thanks, guys. BTW, for clarification, this is preparation for more
> >> > frequent releases so we don't have to wait so long for each release.
> >> > Let's prepare this first, and then roll them out faster.
> >> >
> >> > On Tue, 27 May 2025 at 01:52, Yang Jie <yangji...@apache.org> wrote:
> >> >
> >> > > +1
> >> > >
> >> > > On 2025/05/26 01:10:23 Hyukjin Kwon wrote:
> >> > > > The key issue was fixed.
> >> > > >
> >> > > > On Mon, 26 May 2025 at 10:05, Hyukjin Kwon <gurwls...@apache.org>
> wrote:
> >> > > >
> >> > > > > We should probably avoid backporting it for improvements, but if
> >> > > > > there is a CVE that directly affects Spark, let's upgrade.
> >> > > > >
> >> > > > > On Mon, 26 May 2025 at 00:27, Rozov, Vlad
> <vro...@amazon.com.invalid>
> >> > > > > wrote:
> >> > > > >
> >> > > > >> Should the Parquet version be upgraded to 1.15.1 or 1.15.2?
> >> > > > >> There are 10 CVEs in the current 1.13.1, and even though they
> >> > > > >> may not impact Spark, there are other improvements (better
> >> > > > >> performance) that will benefit Spark users.
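> >> > > > >> For reference, one rough way to double-check which Parquet
> >> > > > >> version the build currently pins (assuming it is still defined
> >> > > > >> as a parquet.version Maven property in the root pom.xml, and
> >> > > > >> using the rc5 tag named in the vote) would be something like:
> >> > > > >>
> >> > > > >> $ # sketch only: property name and tag name are assumptions from this thread
> >> > > > >> $ git clone --depth 1 --branch v3.5.6-rc5 https://github.com/apache/spark.git
> >> > > > >> $ grep -n "parquet.version" spark/pom.xml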
> >> > > > >>
> >> > > > >> Thank you,
> >> > > > >>
> >> > > > >> Vlad
> >> > > > >>
> >> > > > >> On May 24, 2025, at 8:02 PM, Hyukjin Kwon <
> gurwls...@apache.org>
> >> > > wrote:
> >> > > > >>
> >> > > > >> Oh let me check. Thanks for letting me know.
> >> > > > >>
> >> > > > >> On Sun, May 25, 2025 at 12:00 PM Dongjoon Hyun <
> dongj...@apache.org>
> >> > > > >> wrote:
> >> > > > >>
> >> > > > >>> I saw 38 commits to make this work. Thank you for driving
> this,
> >> > > Hyukjin.
> >> > > > >>>
> >> > > > >>> BTW, your key seems to be new and is not in
> >> > > > >>> https://dist.apache.org/repos/dist/dev/spark/KEYS yet. Could
> you
> >> > > > >>> double-check?
> >> > > > >>>
> >> > > > >>> $ curl -LO https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> > > > >>> $ gpg --import KEYS
> >> > > > >>> $ gpg --verify spark-3.5.6-bin-hadoop3.tgz.asc
> >> > > > >>> gpg: assuming signed data in 'spark-3.5.6-bin-hadoop3.tgz'
> >> > > > >>> gpg: Signature made Thu May 22 23:49:54 2025 PDT
> >> > > > >>> gpg:                using RSA key
> >> > > > >>> 0FE4571297AB84440673665669600C8338F65970
> >> > > > >>> gpg:                issuer "gurwls...@apache.org"
> >> > > > >>> gpg: Can't check signature: No public key
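> >> > > > >>> In case it helps, a rough sketch of how the new key could be
> >> > > > >>> published (assuming the RSA key ID shown above and commit
> >> > > > >>> access to the dist dev repo) might look like:
> >> > > > >>>
> >> > > > >>> $ # sketch only: key ID and repo URL are taken from this thread
> >> > > > >>> $ svn checkout --depth files https://dist.apache.org/repos/dist/dev/spark spark-dev-dist
> >> > > > >>> $ cd spark-dev-dist
> >> > > > >>> $ gpg --armor --export 0FE4571297AB84440673665669600C8338F65970 >> KEYS
> >> > > > >>> $ svn commit -m "Add release signing key to KEYS" KEYS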
> >> > > > >>>
> >> > > > >>> Dongjoon.
> >> > > > >>>
> >> > > > >>> On 2025/05/23 17:56:25 Allison Wang wrote:
> >> > > > >>> > +1
> >> > > > >>> >
> >> > > > >>> > On Fri, May 23, 2025 at 10:15 AM Hyukjin Kwon <
> >> > > gurwls...@apache.org>
> >> > > > >>> wrote:
> >> > > > >>> >
> >> > > > >>> > > Oh, it's actually both a test and an actual release. Let
> >> > > > >>> > > me know if you have any concerns!
> >> > > > >>> > >
> >> > > > >>> > > On Fri, May 23, 2025 at 11:25 PM Mridul Muralidharan <
> >> > > > >>> mri...@gmail.com>
> >> > > > >>> > > wrote:
> >> > > > >>> > >
> >> > > > >>> > >> Hi Hyukjin,
> >> > > > >>> > >>
> >> > > > >>> > >>   This thread is to test the automated release, right?
> >> > > > >>> > >> Not to actually release it?
> >> > > > >>> > >>
> >> > > > >>> > >> Regards,
> >> > > > >>> > >> Mridul
> >> > > > >>> > >>
> >> > > > >>> > >> On Fri, May 23, 2025 at 8:26 AM Ruifeng Zheng <
> >> > > ruife...@apache.org>
> >> > > > >>> > >> wrote:
> >> > > > >>> > >>
> >> > > > >>> > >>> +1
> >> > > > >>> > >>>
> >> > > > >>> > >>> On Fri, May 23, 2025 at 5:27 PM Hyukjin Kwon <
> >> > > gurwls...@apache.org
> >> > > > >>> >
> >> > > > >>> > >>> wrote:
> >> > > > >>> > >>>
> >> > > > >>> > >>>> Please vote on releasing the following candidate as
> Apache
> >> > > Spark
> >> > > > >>> > >>>> version 3.5.6.
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> The vote is open until May 27 (PST) and passes if a
> >> > > > >>> > >>>> majority of +1 PMC votes are cast, with a minimum of
> >> > > > >>> > >>>> 3 +1 votes.
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> [ ] +1 Release this package as Apache Spark 3.5.6
> >> > > > >>> > >>>> [ ] -1 Do not release this package because ...
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> To learn more about Apache Spark, please see
> >> > > > >>> https://spark.apache.org/
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> The tag to be voted on is v3.5.6-rc5 (commit
> >> > > > >>> > >>>> 303c18c74664f161b9b969ac343784c088b47593):
> >> > > > >>> > >>>>
> >> > > > >>> > >>>>
> >> > > > >>>
> >> > >
> https://github.com/apache/spark/tree/303c18c74664f161b9b969ac343784c088b47593
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> The release files, including signatures, digests, etc.
> can be
> >> > > > >>> found at:
> >> > > > >>> > >>>>
> https://dist.apache.org/repos/dist/dev/spark/v3.5.6-rc1-bin/
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> Signatures used for Spark RCs can be found in this
> file:
> >> > > > >>> > >>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> The staging repository for this release can be found
> at:
> >> > > > >>> > >>>>
> >> > > > >>>
> >> > >
> https://repository.apache.org/content/repositories/orgapachespark-1495/
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> The documentation corresponding to this release can be
> found
> >> > > at:
> >> > > > >>> > >>>>
> https://dist.apache.org/repos/dist/dev/spark/v3.5.6-rc1-docs/
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> The list of bug fixes going into 3.5.6 can be found at
> the
> >> > > > >>> following
> >> > > > >>> > >>>> URL:
> >> > > > >>> > >>>>
> >> > > https://issues.apache.org/jira/projects/SPARK/versions/12355703
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> FAQ
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> =========================
> >> > > > >>> > >>>> How can I help test this release?
> >> > > > >>> > >>>> =========================
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> If you are a Spark user, you can help us test this
> >> > > > >>> > >>>> release by taking an existing Spark workload, running it
> >> > > > >>> > >>>> on this release candidate, and then reporting any
> >> > > > >>> > >>>> regressions.
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> If you're working in PySpark you can set up a virtual
> >> > > > >>> > >>>> env and install the current RC via "pip install
> >> > > > >>> > >>>> https://dist.apache.org/repos/dist/dev/spark/v3.5.6-rc1-bin/pyspark-3.5.6.tar.gz"
> >> > > > >>> > >>>> and see if anything important breaks.
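> >> > > > >>> > >>>> For example, a minimal sketch (the venv name is just a
> >> > > > >>> > >>>> placeholder, and the local-mode check assumes a JVM is
> >> > > > >>> > >>>> available on the machine):
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> $ # sketch: venv name is arbitrary; the last step needs a local JVM
> >> > > > >>> > >>>> $ python -m venv rc-test && source rc-test/bin/activate
> >> > > > >>> > >>>> $ pip install "https://dist.apache.org/repos/dist/dev/spark/v3.5.6-rc1-bin/pyspark-3.5.6.tar.gz"
> >> > > > >>> > >>>> $ python -c "import pyspark; print(pyspark.__version__)"
> >> > > > >>> > >>>> $ python -c "from pyspark.sql import SparkSession; print(SparkSession.builder.master('local[1]').getOrCreate().range(10).count())"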
> >> > > > >>> > >>>> In Java/Scala, you can add the staging repository to
> >> > > > >>> > >>>> your project's resolvers and test with the RC (make sure
> >> > > > >>> > >>>> to clean up the artifact cache before/after so you don't
> >> > > > >>> > >>>> end up building with an out-of-date RC going forward).
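> >> > > > >>> > >>>> As a rough sbt-flavoured sketch (assuming an existing
> >> > > > >>> > >>>> build.sbt; the resolver name is arbitrary, and Maven
> >> > > > >>> > >>>> users would add a <repository> entry instead):
> >> > > > >>> > >>>>
> >> > > > >>> > >>>> $ cat >> build.sbt <<'EOF'
> >> > > > >>> > >>>> // sketch: resolver name is arbitrary; staging URL taken from this vote email
> >> > > > >>> > >>>> resolvers +=
> >> > > > >>> > >>>>   "apache-spark-3.5.6-rc-staging" at "https://repository.apache.org/content/repositories/orgapachespark-1495/"
> >> > > > >>> > >>>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.6"
> >> > > > >>> > >>>> EOF
> >> > > > >>> > >>>> $ sbt clean update test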
> >> > > > >>> > >>>>
> >> > > > >>> > >>>
> >> > > > >>> >
> >> > > > >>>
> >> > > > >>>
> >> > > > >>>
> >> > > > >>>
> >> > > > >>
> >> > > >
> >> > >
> >> > >
> >> > >
> >> > >
> >> >
> >>
> >>
>
>
>
