I think the thread owner's point is valid. Spark's default use of the Hive
Metastore makes it important to address this Hive vulnerability for the
security and reliability of Spark applications. I use Hive as the default
metastore for Spark as well. Unless you are on a platform such as
Databricks with a unified catalog, Spark relies heavily on the Hive
Metastore to manage critical metadata such as table schemas, data
locations and access control. That dependency means any vulnerability in
the Hive Metastore can indirectly affect the security and stability of
Spark applications, so it is worth addressing.
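
As an illustration, below is a minimal Scala sketch of how a Spark 3.x
session picks up the Hive Metastore, and how the metastore client can be
pointed at a newer Hive release instead of the bundled
hive-metastore-2.3.x jars. The 3.1.3 version string and the jar path are
assumptions for the example; the exact client versions supported depend on
your Spark release.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-metastore-example")
  // Use the Hive Metastore as Spark's catalog for table schemas and locations
  .enableHiveSupport()
  // Point the metastore client at a newer Hive release rather than the
  // built-in hive-metastore-2.3.x jars shipped with Spark
  .config("spark.sql.hive.metastore.version", "3.1.3")
  .config("spark.sql.hive.metastore.jars", "path")
  .config("spark.sql.hive.metastore.jars.path", "/opt/hive-3.1.3/lib/*.jar")  // hypothetical path
  .getOrCreate()

// Metadata operations such as this go through the Hive Metastore
spark.sql("SHOW DATABASES").show()

With the default "builtin" setting, Spark talks to the metastore through
the bundled Hive 2.3.x client, which is why hive-metastore-2.3.x.jar shows
up in vulnerability scans of a Spark distribution.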

HTH

Mich Talebzadeh,
Architect | Data Science | Financial Crime | Forensic Analysis | GDPR

   view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

On Mon, 27 Jan 2025 at 13:37, Sean Owen <sro...@gmail.com> wrote:

> It looks like that affects Hive, and not the metastore. I do not see that
> it is relevant to Spark at first glance.
>
>
> On Mon, Jan 27, 2025 at 1:21 AM Balaji Sudharsanam V
> <balaji.sudharsa...@ibm.com.invalid> wrote:
>
>> Hi All,
>>
>> A vulnerability with ‘High’ severity has been found in the *Apache Spark
>> 3.x and 4.0.0 preview (2) releases,* in the hive-metastore-2.3.x.jar.
>> It is described here: Apache Hive security bypass CVE-2021-34538
>> Vulnerability Report
>> <https://exchange.xforce.ibmcloud.com/vulnerabilities/231404>
>>
>>
>>
>> The recommendation is to upgrade to the latest version of Apache
>> Hive (*3.1.3, 4.0 or later*), available from the Apache Web site.
>>
>>
>>
>> Can we expect this to be fixed in the Apache Spark 4.0 GA?
>>
>> Thanks,
>>
>> Balaji
>>
>>
>>
>>
>
