Thank you all for your new feedback.

Unfortunately, I am concluding this RC2 vote as not passed, again, because it
did not meet the requirements.

Dongjoon Hyun.

On 2025/12/02 02:25:07 Jules Damji wrote:
> +1 (non-binding)  
> 
> —
> 
> Sent from my iPhone
> 
> Pardon the dumb thumb typos :)
> 
> > On Dec 1, 2025, at 1:55 AM, Manu Zhang <[email protected]> wrote:  
> >
> > +1 (non-binding)
> >
> > All tests passed now for [Spark 4.1 support in Iceberg](https://github.com/apache/iceberg/pull/14155).
> >
> >> FYI, RC2 has 41 commits after RC2.
> >
> > You mean "after RC1", right? ;)
> >
> > Regards,
> >
> > Manu
> >
> > On Mon, Dec 1, 2025 at 8:46 AM Dongjoon Hyun
> > <[[email protected]](mailto:[email protected])> wrote:  
> >
> >> I'll start with my +1 as a release manager.  
> >  
> >  FYI, RC2 has 41 commits after RC2.  
> >  
> >  $ git log --oneline v4.1.0-rc1...v4.1.0-rc2 | wc -l  
> >        41  
> >  
> >  Thank you in advance for all your feedback.  
> >  
> >  Dongjoon Hyun.  
> >  
> >  On 2025/12/01 00:42:51 [[email protected]](mailto:[email protected]) wrote:  
> >  > Please vote on releasing the following candidate as Apache Spark version 4.1.0.  
> >  >  
> >  > The vote is open until Wed, 03 Dec 2025 17:42:51 PST and passes if a majority  
> >  > of +1 PMC votes are cast, with a minimum of 3 +1 votes.  
> >  >  
> >  > [ ] +1 Release this package as Apache Spark 4.1.0  
> >  > [ ] -1 Do not release this package because ...  
> >  >  
> >  > To learn more about Apache Spark, please see <https://spark.apache.org/>  
> >  >  
> >  > The tag to be voted on is v4.1.0-rc2 (commit 560c52d5dff):  
> >  > <https://github.com/apache/spark/tree/v4.1.0-rc2>  
> >  >  
> >  > The release files, including signatures, digests, etc. can be found at:  
> >  > <https://dist.apache.org/repos/dist/dev/spark/v4.1.0-rc2-bin/>  
> >  >  
> >  > Signatures used for Spark RCs can be found in this file:  
> >  > <https://downloads.apache.org/spark/KEYS>  
> >  >  
> >  > The staging repository for this release can be found at:  
> >  > <https://repository.apache.org/content/repositories/orgapachespark-1507/>  
> >  >  
> >  > The documentation corresponding to this release can be found at:  
> >  > <https://dist.apache.org/repos/dist/dev/spark/v4.1.0-rc2-docs/>  
> >  >  
> >  > The list of bug fixes going into 4.1.0 can be found at the following URL:  
> >  > <https://issues.apache.org/jira/projects/SPARK/versions/12355581>  
> >  >  
> >  > FAQ  
> >  >  
> >  > =========================  
> >  > How can I help test this release?  
> >  > =========================  
> >  >  
> >  > If you are a Spark user, you can help us test this release by taking  
> >  > an existing Spark workload and running it on this release candidate, then  
> >  > reporting any regressions.  
> >  >  
> >  > If you're working in PySpark, you can set up a virtual env and install  
> >  > the current RC via "pip install  
> >  > <https://dist.apache.org/repos/dist/dev/spark/v4.1.0-rc2-bin/pyspark-4.1.0.tar.gz>"  
> >  > and see if anything important breaks.  
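> >  >  
> >  > For example, a rough smoke test along these lines (the venv path and the  
> >  > tiny local job below are only illustrative):  
> >  >  
> >  >  # create an isolated environment and install the RC into it
> >  >  $ python -m venv /tmp/spark-4.1.0-rc2-venv
> >  >  $ source /tmp/spark-4.1.0-rc2-venv/bin/activate
> >  >  $ pip install https://dist.apache.org/repos/dist/dev/spark/v4.1.0-rc2-bin/pyspark-4.1.0.tar.gz
> >  >  # confirm the installed version and run a trivial local job
> >  >  $ python -c "import pyspark; print(pyspark.__version__)"
> >  >  $ python -c "from pyspark.sql import SparkSession; s = SparkSession.builder.master('local[2]').getOrCreate(); print(s.range(10).count()); s.stop()"
> >  >  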
> >  > In Java/Scala, you can add the staging repository to your project's  
> >  > resolvers and test with the RC (make sure to clean up the artifact cache  
> >  > before/after so you don't end up building with an out-of-date RC going  
> >  > forward).  
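> >  >  
> >  > For instance, one way to exercise the staged artifacts from the command  
> >  > line (the Maven coordinates below are only an example) is:  
> >  >  
> >  >  # drop any previously cached Spark artifacts (Ivy and Maven local caches)
> >  >  $ rm -rf ~/.ivy2/cache/org.apache.spark ~/.m2/repository/org/apache/spark
> >  >  # resolve a staged artifact directly from the staging repository
> >  >  $ mvn dependency:get -Dartifact=org.apache.spark:spark-sql_2.13:4.1.0 \
> >  >      -DremoteRepositories=https://repository.apache.org/content/repositories/orgapachespark-1507/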
> >  >  

---------------------------------------------------------------------
To unsubscribe e-mail: [email protected]
