Hi everyone,

I'd like to bring up the topic of deprecating Spark 3.4 support in an upcoming release. Anton initially suggested this during our previous dev list discussion <https://lists.apache.org/thread/t0b5dgk1brjx5vs8mogzm2g6kt3byly2> about maintaining feature parity across the Spark versions we support for 1.10. Currently, we support two Spark 3.x versions: 3.4 and 3.5.
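Concretely, each supported Spark version ships as its own runtime artifact, so removing 3.4 support would end new releases of the 3.4 artifact. As a minimal sbt sketch of what users pull in today (coordinates follow the usual iceberg-spark-runtime naming on Maven Central; the 1.9.0 version string is just illustrative):

  // One Iceberg runtime artifact per supported Spark version; the Spark
  // and Scala versions are encoded in the artifact name.
  libraryDependencies += "org.apache.iceberg" % "iceberg-spark-runtime-3.4_2.12" % "1.9.0"  // no further releases after removal
  libraryDependencies += "org.apache.iceberg" % "iceberg-spark-runtime-3.5_2.12" % "1.9.0"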
Spark 3.4's last maintenance release (3.4.4) was in October 2024 <https://spark.apache.org/releases/spark-release-3-4-4.html>, and it is now considered end-of-life <https://endoflife.date/apache-spark>. What are your thoughts on marking Spark 3.4 support as deprecated in 1.11 and removing it in 1.12? For reference, here's the previous discussion thread <https://lists.apache.org/thread/plhxdxjty2w3gdg2fzg5dvvv28y20n3g> on deprecating Spark 3.3.

Best,
Kevin Liu