Hi Ryan,

Yes, it makes sense. The way we discuss and decide the Spark versions
is totally fine.

My proposal was more to clearly announce the Spark/Flink/Java/Python
versions supported by each Iceberg release. I know it's already visible
in the artifact names (which contain the Spark/Flink versions) that we
share on https://iceberg.apache.org/releases/.
The idea is just to anticipate a bit and inform our users/community,
for instance with a clear table of the supported layers (a bit like
on https://karaf.apache.org/download.html or
https://kafka.apache.org/downloads).

Thanks!
Regards
JB

On Thu, Sep 21, 2023 at 5:40 PM Ryan Blue <b...@tabular.io> wrote:
>
> JB, I don't think that we need a policy on which Spark versions we intend to 
> keep. Having discussions like this is more effective. Just look at the 
> support for Spark 2.4, which we kept for a lot longer to help people 
> transition. Policy is a way of making decisions by algorithm, and I don't 
> think we want to do that here.
>
> On Thu, Sep 21, 2023 at 1:48 AM Jean-Baptiste Onofré <j...@nanthrax.net> 
> wrote:
>>
>> Just to elaborate a bit :)
>>
>> - As Iceberg 1.4.0 is a new "major" release, it's a good time to
>> deprecate/remove old version support (of Spark and other things)
>> - Spark 3.2 users can still use previous Iceberg versions
>> - I will start a discussion about an LTS policy with a clear "target"
>> support matrix for our users (something like the table you can see at
>> https://karaf.apache.org/download.html); we can list the supported
>> Java, Python, Spark, Flink, .... versions
>>
>> Regards
>> JB
>>
>> On Thu, Sep 21, 2023 at 12:01 AM Anton Okolnychyi
>> <aokolnyc...@apple.com.invalid> wrote:
>> >
>> > Shall we consider deprecating our Spark 3.2 support? That Spark version is 
>> > no longer maintained by the Spark community and is not under active 
>> > development in Iceberg. It was released in October 2021 and has passed the 
>> > 18-month maintenance mark in Spark.
>> >
>> > - Anton
>
>
>
> --
> Ryan Blue
> Tabular