Not an easy task, I guess, but I'm totally for it too.

The issue SPARK-49910 <https://issues.apache.org/jira/browse/SPARK-49910>
is related to this.
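For context, Spark already lets a deployment point its built-in Hive client at a different metastore version through configuration, which is one path users take today while a default change is discussed. A minimal sketch, assuming a locally installed Hive 4 jar directory; the config keys are real Spark SQL options, but whether the `4.0.0` version string is accepted depends on the Spark build:

```shell
# Sketch: run a job against a newer Hive metastore client.
# spark.sql.hive.metastore.version / .jars / .jars.path are existing Spark
# configs; the version value and jar path below are illustrative assumptions.
spark-submit \
  --conf spark.sql.hive.metastore.version=4.0.0 \
  --conf spark.sql.hive.metastore.jars=path \
  --conf spark.sql.hive.metastore.jars.path=/opt/hive-4/lib/* \
  my_job.py
```

Compiling Spark against Hive 4.x and making it the default, as proposed below, would remove the need for this kind of per-deployment override.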

On Tue, 11 Mar 2025 at 23:06, Mich Talebzadeh (<mich.talebza...@gmail.com>)
wrote:

> Yes, I am all for it, as I use Hive with Oracle as its metastore
> extensively.
>
> Case in point, on 6th March a Hive user
> <https://lists.apache.org/thread/vhgxt1cj2ppc862j0lwxl63j6nfc7khh>
> alluded to it, and I quote:
>
> "I just wanted to highlight that Hive 3.x line is EOL. It has various
> known security vulnerabilities, many serious bugs (including wrong results
> and data corruption), and lacks lots of improvements and major features
> that are available in Hive 4. Upgrading is the right path forward."
>
> In summary, Hive 4.x likely includes performance improvements, new
> features, and bug fixes. Compiling against it would allow Spark to take
> advantage of these. In addition, using the latest versions of both Spark
> and Hive is important for maintaining a secure data platform.
>
> HTH
>
> Dr Mich Talebzadeh,
> Architect | Data Science | Financial Crime | Forensic Analysis | GDPR
>
>    View my LinkedIn profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>
>
>
> On Tue, 11 Mar 2025 at 19:08, Rozov, Vlad <vro...@amazon.com.invalid>
> wrote:
>
>> Hi All,
>>
>> As Apache Hive announced EOL for Hive 2.x [1] and 3.x [2], should Spark
>> be compiled against Hive 4.x and use it as default?
>>
>> Thank you,
>>
>> Vlad
>>
>> [1] https://lists.apache.org/thread/4ctrzfw60jkhc0hq2xoh1jpqxgt2zd93
>> [2] https://lists.apache.org/thread/99h6wr7nk4684r6tkcbm8ydfytgqy6f3
>> [3] https://github.com/apache/spark/pull/50213
>>
>