[ https://issues.apache.org/jira/browse/HUDI-2426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17472388#comment-17472388 ]
Raymond Xu commented on HUDI-2426:
----------------------------------
Also verified that the issue is not present in version 0.10.1 with Spark 3.1 and 3.0.
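For reference, a minimal sketch of the verification (the Spark install path and the 0.10.1 bundle artifact name below are assumptions; adjust for the Spark version under test):

bash-4.2$ ./spark3.1.2/bin/spark-shell \
  --packages org.apache.hudi:hudi-spark3.1.2-bundle_2.12:0.10.1 \
  --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'

scala> spark.table("default.test_hudi_table").show
// expected on 0.10.1: the table contents print instead of the UnsupportedOperationException below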
> spark sql extensions breaks read.table from metastore
> -----------------------------------------------------
>
> Key: HUDI-2426
> URL: https://issues.apache.org/jira/browse/HUDI-2426
> Project: Apache Hudi
> Issue Type: Bug
> Components: Spark Integration
> Reporter: nicolas paris
> Assignee: Yann Byron
> Priority: Critical
> Labels: sev:critical, user-support-issues
> Fix For: 0.10.1
>
>
> When the Hudi Spark SQL support is enabled, it breaks the ability to read a
> Hudi table from the metastore in Spark:
> bash-4.2$ ./spark3.0.2/bin/spark-shell \
>   --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.1.2 \
>   --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
>   --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
>
> scala> spark.table("default.test_hudi_table").show
> java.lang.UnsupportedOperationException: Unsupported parseMultipartIdentifier method
>   at org.apache.spark.sql.parser.HoodieCommonSqlParser.parseMultipartIdentifier(HoodieCommonSqlParser.scala:65)
>   at org.apache.spark.sql.SparkSession.table(SparkSession.scala:581)
>   ... 47 elided
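>
> Per the stack trace, spark.table(name) resolves its argument through the
> session parser's parseMultipartIdentifier, which the Hudi extension parser
> does not implement in 0.9.0. The failure can be triggered directly against
> that parser (a sketch, fails with the same UnsupportedOperationException):
>
> scala> spark.sessionState.sqlParser.parseMultipartIdentifier("default.test_hudi_table")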
>
> Removing the config makes the Hive table readable again from Spark.
> This affects at least Spark 3.0.x and 3.1.x.
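>
> As a workaround under 0.9.0, read paths that do not go through
> parseMultipartIdentifier should still work, e.g. plain SQL (which goes
> through parsePlan, implemented by the Hudi parser) or the DataFrame reader
> (a sketch; the warehouse path below is an assumption):
>
> scala> spark.sql("select * from default.test_hudi_table").show
> scala> spark.read.format("hudi").load("/user/hive/warehouse/test_hudi_table").show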