Hi all,

I'm integrating Spark 4.1 (preview2-rc1) with Iceberg [1] and had to change
the target bytecode level from 11 to 17; otherwise, the Scala compile fails
with the following error. It compiled fine with preview1.

"spark/v4.1/spark/src/main/scala/org/apache/spark/sql/stats/ThetaSketchAgg.scala:52:12:
Class java.lang.Record not found"
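
The change I made is roughly the following (a minimal Gradle Kotlin DSL
sketch for illustration, not the exact diff in [1]; version numbers are
placeholders):

  // build.gradle.kts (sketch): compile against the Java 17 API
  plugins {
      scala
  }

  repositories {
      mavenCentral()
  }

  dependencies {
      implementation("org.scala-lang:scala-library:2.13.16")
  }

  java {
      toolchain {
          // Build with a Java 17 toolchain
          languageVersion.set(JavaLanguageVersion.of(17))
      }
  }

  tasks.withType<ScalaCompile>().configureEach {
      // Was "-release", "11": against the Java 11 platform API, scalac
      // cannot resolve java.lang.Record (finalized in Java 16), which
      // the Spark 4.1.0-preview2 class files now appear to reference.
      scalaCompileOptions.additionalParameters = listOf("-release", "17")
  }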

May I know which change between preview1 and preview2-rc1 could have caused
this?

1. https://github.com/apache/iceberg/pull/14155

Thanks,
Manu


On Fri, Sep 26, 2025 at 8:54 AM Szehon Ho <[email protected]> wrote:

> +1 (non-binding).
>
> Checked signature, checksum, basic sql query.
>
> Thanks
> Szehon
>
> On Thu, Sep 25, 2025 at 8:31 AM Rozov, Vlad <[email protected]>
> wrote:
>
>> +1 (non-binding)
>>
>>
>>
>> Thank you,
>>
>>
>>
>> Vlad
>>
>>
>>
>> *From: *Peter Toth <[email protected]>
>> *Date: *Thursday, September 25, 2025 at 7:14 AM
>> *To: *"[email protected]" <[email protected]>
>> *Subject: *RE: [VOTE] Release Spark 4.1.0-preview2 (RC1)
>>
>>
>>
>> +1 (non-binding)
>>
>>
>>
>> On Thu, Sep 25, 2025 at 7:37 AM Yuming Wang <[email protected]> wrote:
>>
>> +1
>>
>>
>>
>> On Thu, Sep 25, 2025 at 10:34 AM Yang Jie <[email protected]> wrote:
>>
>> +1
>>
>> Thank you Hyukjin.
>>
>> On 2025/09/25 02:17:10 Jungtaek Lim wrote:
>> > +1 (non-binding)
>> >
>> > Thanks Hyukjin!
>> >
>> > On Thu, Sep 25, 2025 at 3:36 AM John Zhuge <[email protected]> wrote:
>> >
>> > > +1 Thanks Hyukjin!
>> > >
>> > > On Wed, Sep 24, 2025 at 10:53 AM huaxin gao <[email protected]>
>> > > wrote:
>> > >
>> > >> +1 Thanks Hyukjin for driving the release!
>> > >>
>> > >>> On Wed, Sep 24, 2025 at 9:45 AM L. C. Hsieh <[email protected]>
>> > >>> wrote:
>> > >>
>> > >>> +1
>> > >>>
>> > >>> Thanks Hyukjin.
>> > >>>
>> > >>> On Wed, Sep 24, 2025 at 9:18 AM Dongjoon Hyun <[email protected]>
>> > >>> wrote:
>> > >>> >
>> > >>> > +1
>> > >>> >
>> > >>> > Thank you, Hyukjin.
>> > >>> >
>> > >>> > Dongjoon
>> > >>> >
>> > >>> > On 2025/09/24 12:48:48 Wenchen Fan wrote:
>> > >>> > > +1
>> > >>> > >
>> > >>> > > On Wed, Sep 24, 2025 at 7:29 PM <[email protected]> wrote:
>> > >>> > >
>> > >>> > > > Please vote on releasing the following candidate as Apache
>> > >>> > > > Spark version 4.1.0-preview2.
>> > >>> > > >
>> > >>> > > > The vote is open until Sat, 27 Sep 2025 05:26:22 PDT and
>> > >>> > > > passes if a majority of +1 PMC votes are cast, with a
>> > >>> > > > minimum of 3 +1 votes.
>> > >>> > > >
>> > >>> > > > [ ] +1 Release this package as Apache Spark 4.1.0-preview2
>> > >>> > > > [ ] -1 Do not release this package because ...
>> > >>> > > >
>> > >>> > > > To learn more about Apache Spark, please see
>> > >>> > > > https://spark.apache.org/
>> > >>> > > >
>> > >>> > > > The tag to be voted on is v4.1.0-preview2-rc1 (commit c5ff48cc2b2):
>> > >>> > > > https://github.com/apache/spark/tree/v4.1.0-preview2-rc1
>> > >>> > > >
>> > >>> > > > The release files, including signatures, digests, etc. can be
>> > >>> > > > found at:
>> > >>> > > > https://dist.apache.org/repos/dist/dev/spark/v4.1.0-preview2-rc1-bin/
>> > >>> > > >
>> > >>> > > > Signatures used for Spark RCs can be found in this file:
>> > >>> > > > https://downloads.apache.org/spark/KEYS
>> > >>> > > >
>> > >>> > > > The staging repository for this release can be found at:
>> > >>> > > > https://repository.apache.org/content/repositories/orgapachespark-1503/
>> > >>> > > >
>> > >>> > > > The documentation corresponding to this release can be found
>> > >>> > > > at:
>> > >>> > > > https://dist.apache.org/repos/dist/dev/spark/v4.1.0-preview2-rc1-docs/
>> > >>> > > >
>> > >>> > > > The list of bug fixes going into 4.1.0-preview2 can be found
>> > >>> > > > at the following URL:
>> > >>> > > > https://issues.apache.org/jira/projects/SPARK/versions/12355581
>> > >>> > > >
>> > >>> > > > FAQ
>> > >>> > > >
>> > >>> > > > =========================
>> > >>> > > > How can I help test this release?
>> > >>> > > > =========================
>> > >>> > > >
>> > >>> > > > If you are a Spark user, you can help us test this release by
>> > >>> > > > taking an existing Spark workload and running it on this
>> > >>> > > > release candidate, then reporting any regressions.
>> > >>> > > >
>> > >>> > > > If you're working in PySpark, you can set up a virtual env and
>> > >>> > > > install the current RC via "pip install
>> > >>> > > > https://dist.apache.org/repos/dist/dev/spark/v4.1.0-preview2-rc1-bin/pyspark-4.1.0.dev2.tar.gz"
>> > >>> > > > and see if anything important breaks.
>> > >>> > > > In Java/Scala, you can add the staging repository to your
>> > >>> > > > project's resolvers and test with the RC (make sure to clean
>> > >>> > > > up the artifact cache before/after so you don't end up
>> > >>> > > > building with an out-of-date RC going forward).
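>> > >>> > > >
>> > >>> > > > For example, with a Gradle Kotlin DSL build (a minimal
>> > >>> > > > sketch; the URL is the staging repository listed above):
>> > >>> > > >
>> > >>> > > >   repositories {
>> > >>> > > >       maven {
>> > >>> > > >           // RC artifacts staged for this vote
>> > >>> > > >           url = uri("https://repository.apache.org/content/repositories/orgapachespark-1503/")
>> > >>> > > >       }
>> > >>> > > >       mavenCentral()
>> > >>> > > >   }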
>> > >>> > > >
>> > >>> > > >
>> > >>> > > > ---------------------------------------------------------------------
>> > >>> > > > To unsubscribe e-mail: [email protected]
>> > >>> > > >
>> > >>> > > >
>> > >>> > >
>> > >>> >
>> > >>> >
>> > >>> > ---------------------------------------------------------------------
>> > >>> > To unsubscribe e-mail: [email protected]
>> > >>> >
>> > >>>
>> > >>>
>> > >>> ---------------------------------------------------------------------
>> > >>> To unsubscribe e-mail: [email protected]
>> > >>>
>> > >>>
>> > >
>> > > --
>> > > John Zhuge
>> > >
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: [email protected]
>>
>>
