Hi Joyan,

Spark uses its own metastore by default. To make Ranger authorization apply, you need to go through the Hive Metastore: point Spark at the Hive Metastore and use HiveContext in your Spark code.
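As a minimal sketch of what that looks like (not from the original thread): on Spark 2.x and later, SparkSession with enableHiveSupport() replaces HiveContext. This assumes a hive-site.xml pointing at your Hive Metastore is on the Spark classpath, and the table name "db.sales" is only a placeholder.

    // Sketch: read a Hive-managed table through the Hive Metastore from Spark.
    // Assumes hive-site.xml (with the metastore URI) is on the classpath.
    import org.apache.spark.sql.SparkSession

    object HiveTableRead {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-metastore-example")
          .enableHiveSupport()   // route table metadata through the Hive Metastore
          .getOrCreate()

        // spark.sql now resolves tables via the Hive catalog; on Spark 1.x the
        // equivalent would be new HiveContext(sc).sql(...)
        spark.sql("SELECT * FROM db.sales LIMIT 10").show()

        spark.stop()
      }
    }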
Br,
Dennis

Sent from my iPhone

> On 23.11.2020 at 19:04, joyan sil <joyan....@gmail.com> wrote:
>
> Hi,
>
> We have Ranger policies defined on the Hive table, and authorization works as
> expected when we use the Hive CLI and Beeline. But when we access those Hive
> tables using spark-shell or spark-submit, it does not work.
>
> Any suggestions to make Ranger work with Spark?
>
> Regards
> Joyan