firecast commented on issue #954:  
org.apache.hudi.org.apache.hadoop_hive.metastore.api.NoSuchObjectException: 
<hivedb.tableName> table not found
URL: https://github.com/apache/incubator-hudi/issues/954#issuecomment-544157071
 
 
   I also faced the same issue, although I was using a remote Hive instance 
instead of the AWS Glue Data Catalog. As a quick fix, I made the following 
changes:
   1. Change 
https://github.com/apache/incubator-hudi/blob/ed745dfdbf254bfc2ec6d9c7baed8ccbf571abab/hudi-spark/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala#L169
 to:
   ```scala
   syncHive(basePath, fs, parameters, sqlContext)
   ```
   2. Change 
https://github.com/apache/incubator-hudi/blob/ed745dfdbf254bfc2ec6d9c7baed8ccbf571abab/hudi-spark/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala#L231
 to:
   ```scala
   private def syncHive(basePath: Path, fs: FileSystem, parameters: Map[String, 
String], sqlContext: SQLContext): Boolean = {
   ```
   3. Add the following lines before this line 
https://github.com/apache/incubator-hudi/blob/ed745dfdbf254bfc2ec6d9c7baed8ccbf571abab/hudi-spark/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala#L235
 (this needs `import org.apache.hadoop.hive.conf.HiveConf.ConfVars` at the top of the file):
   ```scala
    // Copy the thrift URI from the Spark session config into the HiveConf
    val hiveMetastoreURIs =
      sqlContext.sparkSession.conf.get(ConfVars.METASTOREURIS.varname)
    hiveConf.setVar(ConfVars.METASTOREURIS, hiveMetastoreURIs)
   ```
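   Taken together, the patched `syncHive` ends up looking roughly like the sketch below. This is untested against other Hudi versions; the `buildSyncConfig` helper and `HiveSyncTool` call are taken from the linked file at that commit, and only the two marked lines are new.
   ```scala
   import org.apache.hadoop.fs.{FileSystem, Path}
   import org.apache.hadoop.hive.conf.HiveConf
   import org.apache.hadoop.hive.conf.HiveConf.ConfVars
   import org.apache.spark.sql.SQLContext

   private def syncHive(basePath: Path, fs: FileSystem,
                        parameters: Map[String, String],
                        sqlContext: SQLContext): Boolean = {
     val hiveSyncConfig = buildSyncConfig(basePath, parameters)
     val hiveConf: HiveConf = new HiveConf()
     hiveConf.addResource(fs.getConf)
     // New: propagate the thrift URI from the Spark session into the
     // HiveConf so HiveSyncTool talks to the remote metastore.
     val hiveMetastoreURIs =
       sqlContext.sparkSession.conf.get(ConfVars.METASTOREURIS.varname)
     hiveConf.setVar(ConfVars.METASTOREURIS, hiveMetastoreURIs)
     new HiveSyncTool(hiveSyncConfig, hiveConf, fs).syncHoodieTable()
     true
   }
   ```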
   
   What this does is add the thrift URI you set while creating the Spark 
session to the Hive configuration used for syncing. It's only a temporary 
workaround, for anyone with a Spark configuration similar to mine.
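   For context, this workaround assumes the metastore URI was supplied when building the Spark session, along the lines of the sketch below (the app name, host, and port are placeholders; `ConfVars.METASTOREURIS.varname` resolves to `hive.metastore.uris`):
   ```scala
   import org.apache.spark.sql.SparkSession

   // Hypothetical session setup: the patched syncHive reads
   // "hive.metastore.uris" back out of this session's config.
   val spark = SparkSession.builder()
     .appName("hudi-writer")
     .config("hive.metastore.uris", "thrift://hive-metastore-host:9083")
     .enableHiveSupport()
     .getOrCreate()
   ```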

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]
