Re: Re: [Spark SQL] use zstd, No enum constant parquet.hadoop.metadata.CompressionCodecName.ZSTD

2018-12-20 Thread 大啊
I think your Hive table is using CompressionCodecName, but the parquet-hadoop-bundle.jar on your Spark classpath is not a correct version.

At 2018-12-21 12:07:22, "Jiaan Geng" wrote:
>I think your Hive table is using CompressionCodecName, but the
>parquet-hadoop-bundle.jar on your Spark classpath is

Re: [Spark SQL] use zstd, No enum constant parquet.hadoop.metadata.CompressionCodecName.ZSTD

2018-12-20 Thread Jiaan Geng
I think your Hive table is using CompressionCodecName, but the parquet-hadoop-bundle.jar on your Spark classpath is not a correct version. -- Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
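For context on the suggested fix: the ZSTD constant only exists in newer parquet-mr releases (it is absent from the old parquet-hadoop-bundle that ships with Hive), so an old bundle loaded first on the classpath cannot resolve it. Assuming a Parquet version that does include ZSTD is on the classpath, the codec is typically selected through Spark configuration. A minimal sketch (property name is the standard Spark SQL option; placement in spark-defaults.conf is one common choice):

```
# spark-defaults.conf -- request zstd for Parquet output
# (assumes the Parquet jars on the classpath actually define the ZSTD codec)
spark.sql.parquet.compression.codec    zstd
```

The same option can be set per session, e.g. `SET spark.sql.parquet.compression.codec=zstd;` in spark-sql, but it only works once the classpath carries a Parquet build that knows the ZSTD enum constant.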

[Spark SQL] use zstd, No enum constant parquet.hadoop.metadata.CompressionCodecName.ZSTD

2018-12-19 Thread 李斌松
The parquet-hadoop-bundle jar is imported into the Spark Hive project. When you compress data using zstd, the class may be loaded preferentially from parquet-hadoop-bundle, and then the enum constant parquet.hadoop.metadata.CompressionCodecName.ZSTD can't be found:

> 18/12/20 10:35:28 ERROR Executor: Excep
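The failure mechanism described above can be illustrated in isolation: when a class name string is resolved against an enum type that lacks that constant, `Enum.valueOf` throws exactly this kind of "No enum constant" error. The sketch below uses a hypothetical cut-down enum standing in for the old bundled `CompressionCodecName` (the listed values are illustrative, not the bundle's exact contents):

```java
// Demonstrates why an old parquet-hadoop-bundle on the classpath produces
// "No enum constant ... CompressionCodecName.ZSTD": the older enum class
// simply does not define ZSTD, so valueOf("ZSTD") throws.
public class EnumLookupDemo {
    // Stand-in for the enum shipped in an old Hive parquet bundle
    // (illustrative subset; the real class lives in parquet.hadoop.metadata)
    enum OldCompressionCodecName { UNCOMPRESSED, SNAPPY, GZIP, LZO }

    public static void main(String[] args) {
        try {
            // Spark asks the (wrong, old) class for the ZSTD codec
            OldCompressionCodecName.valueOf("ZSTD");
        } catch (IllegalArgumentException e) {
            // Message reads like: "No enum constant ...OldCompressionCodecName.ZSTD"
            System.out.println(e.getMessage());
        }
    }
}
```

This is why the thread's answer points at the jar version rather than the Spark code: the fix is ensuring a Parquet build whose `CompressionCodecName` actually defines ZSTD is the one the classloader picks up.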