Hi, Bjørn

Currently, I don't think we can consider upgrading Guava for the following 
reasons:

1. Although Spark 3.4.0 will no longer publish a hadoop-2 distribution, the 
hadoop-2 build and test pipelines are still running. We need to keep them 
working, and we should not consider upgrading Guava until hadoop-2 support is 
actually removed.

2. As far as I remember, Hive 2.3 still depends on Guava 14.0.1. Someone tried 
to resolve this before, but the work was never completed, so this is another 
issue we need to solve before upgrading Guava (see the sketch after this list 
for one way to check which Guava version is actually resolved).
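
For reference, here is a minimal sketch (not from this thread, just an 
illustration; the object name GuavaVersionCheck is made up) of how one could 
check which Guava jar actually ends up on the classpath:

  object GuavaVersionCheck {
    def main(args: Array[String]): Unit = {
      // Locate the jar that provides a core Guava class on the current classpath.
      val source = classOf[com.google.common.base.Preconditions]
        .getProtectionDomain.getCodeSource
      // Prints something like .../guava-14.0.1.jar while the Hive 2.3 constraint applies.
      println(s"Guava loaded from: ${Option(source).map(_.getLocation).getOrElse("unknown")}")
    }
  }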


YangJie


From: Bjørn Jørgensen <bjornjorgen...@gmail.com>
Date: Sunday, November 6, 2022, 22:17
To: dev <dev@spark.apache.org>
Subject: Upgrade guava to 31.1-jre and remove hadoop2

Hi, has anyone tried upgrading Guava now that we have stopped supporting hadoop2?
And is there a plan for removing the hadoop2 code from the code base?
