Hello Community, could any of you shed some light on the questions below, please?

On Apr 15, 2024, at 9:02 PM, Prem Sahoo <prem.re...@gmail.com> wrote:
Any specific reason Spark does not support, or the community doesn't want to move to, Parquet V2, which is more optimized and much faster to read and write (from another component I am using)?

Spark will read data written with v2 encodings just fine. You just don't need to worry about making Spark produce v2, and you should probably also not produce v2 encodings from other systems.
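For concreteness, this is roughly what "making Spark produce v2" would look like. It is only a sketch: it assumes the bundled parquet-mr honours the standard parquet.writer.version property when it is passed through the write options, and the DataFrame and output path are purely illustrative. Per the reply above, this is not recommended for production use.

import org.apache.spark.sql.SparkSession

// Sketch only: Spark forwards data source options into the Hadoop configuration
// used by parquet-mr, so the parquet-mr property "parquet.writer.version" should
// select the v2 writer ("v1"/"v2" or "PARQUET_1_0"/"PARQUET_2_0" are accepted).
val spark = SparkSession.builder().appName("parquet-v2-sketch").getOrCreate()
import spark.implicits._

val df = Seq((1, "a"), (2, "b")).toDF("id", "value")   // illustrative data

df.write
  .option("parquet.writer.version", "PARQUET_2_0")     // ask parquet-mr for v2 pages/encodings
  .mode("overwrite")
  .parquet("/tmp/parquet_v2_sketch")                   // illustrative output path

// Setting the same key on spark.sparkContext.hadoopConfiguration is a common alternative.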
Oops, so Spark does not support Parquet V2 at the moment? We have a use case where we need Parquet V2, as one of our components uses it.
Hi Prem,
Parquet v1 is the default because v2 has not been finalized and adopted by the community. I highly recommend not using v2 encodings at this time.
Ryan
I am using Spark 3.2.0, but my Spark package comes with parquet-mr 1.2.1, which writes Parquet version 1, not version 2 :(. So I was looking into how to write Parquet version 2.
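As an aside, one way to confirm which parquet-mr a given Spark distribution actually bundles is to print the version string that parquet-mr embeds in its own jar; a minimal spark-shell sketch, assuming org.apache.parquet.Version from parquet-common is on the driver classpath (the sample output is only what a stock 3.2.0 build would be expected to report, presumably a 1.12.x rather than 1.2.1):

// Sketch for spark-shell: parquet-mr stamps its build version into
// org.apache.parquet.Version, the same string it writes into each file's
// "created_by" footer field.
println(org.apache.parquet.Version.FULL_VERSION)
// expected to be along the lines of: parquet-mr version 1.12.x (build ...)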
Sorry, you have a point there. It was released in version 3.0.0. What version of Spark are you using?
Thank you so much for the info! But do we have any release notes that say Spark 2.4.0 onwards supports Parquet version 2? I was under the impression it started being supported from Spark 3.0 onwards.
Well if I am correct, Parquet version 2 support was introduced in Spark version 2.4.0. Therefore, any version of Spark starting from 2.4.0 supports Parquet version 2. Assuming that you are using Spark version 2.4.0 or later, you should be able to take advantage of Parquet version 2 features.
HTH
Mich Talebzadeh, Technologist | Solutions Architect | Data Engineer | Generative AI
London
United Kingdom
view my Linkedin profile
https://en.everybodywiki.com/Mich_Talebzadeh
Disclaimer: The information provided is correct to the best of my knowledge but of course cannot be guaranteed. It is essential to note that, as with any advice, "one test result is worth one-thousand expert opinions" (Werner von Braun).
Thank you for the information! So I can use any version of parquet-mr to produce a Parquet file.
Regarding the 2nd question: which version of Spark supports Parquet version 2? May I get the release notes where the Parquet versions are mentioned?
Parquet-mr is a Java library that provides functionality for working with Parquet files; it is geared towards the Hadoop ecosystem, particularly MapReduce jobs. There is no definitive way to check the exact compatible versions within the library itself. However, you can have a look at this
HTH
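Outside of Spark, parquet-mr itself lets the caller choose the format version explicitly on its writer builder. A minimal Scala sketch against parquet-avro, assuming parquet-avro and Avro are on the classpath; the schema, record and output path are illustrative, not from the thread:

import org.apache.avro.{Schema, SchemaBuilder}
import org.apache.avro.generic.{GenericData, GenericRecord}
import org.apache.hadoop.fs.Path
import org.apache.parquet.avro.AvroParquetWriter
import org.apache.parquet.column.ParquetProperties
import org.apache.parquet.hadoop.ParquetWriter

// Illustrative two-column schema.
val schema: Schema = SchemaBuilder.record("Example").fields()
  .requiredInt("id")
  .requiredString("value")
  .endRecord()

// Selecting PARQUET_2_0 on the builder is what asks parquet-mr for v2 pages/encodings.
val writer: ParquetWriter[GenericRecord] = AvroParquetWriter
  .builder[GenericRecord](new Path("/tmp/example_v2.parquet"))   // illustrative path
  .withSchema(schema)
  .withWriterVersion(ParquetProperties.WriterVersion.PARQUET_2_0)
  .build()

val record = new GenericData.Record(schema)
record.put("id", 1)
record.put("value", "a")
writer.write(record)
writer.close()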
Hello Team, May I know how to check which version of Parquet is supported by parquet-mr 1.2.1?
Which version of parquet-mr supports Parquet version 2 (V2)?
Which version of Spark supports Parquet version 2? May I get the release notes where Parquet versions are mentioned?
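One practical, if unofficial, way to answer these questions for a concrete file is to read its footer with parquet-mr and look at the writer string and the column-chunk encodings: DELTA_BINARY_PACKED, DELTA_BYTE_ARRAY and RLE_DICTIONARY generally indicate v2 encodings, while PLAIN, PLAIN_DICTIONARY and RLE are the classic v1 set. A hedged Scala sketch; the input path is illustrative:

import scala.collection.JavaConverters._
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.parquet.hadoop.ParquetFileReader
import org.apache.parquet.hadoop.util.HadoopInputFile

// Sketch: print the "created_by" writer string and the encodings of every column chunk.
val inputFile = HadoopInputFile.fromPath(new Path("/tmp/example_v2.parquet"), new Configuration())
val reader = ParquetFileReader.open(inputFile)
try {
  val footer = reader.getFooter
  println(s"created_by: ${footer.getFileMetaData.getCreatedBy}")
  for {
    block  <- footer.getBlocks.asScala
    column <- block.getColumns.asScala
  } println(s"${column.getPath}: ${column.getEncodings.asScala.mkString(", ")}")
} finally {
  reader.close()
}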