You should put hive-site.xml in SPARK_CONF_DIR; the "cannot find file" error is due to a Spark bug:
https://issues.apache.org/jira/browse/SPARK-18160
https://issues.cloudera.org/browse/LIVY-298
I have one workaround for you: install Spark on all the nodes and put hive-site.xml in SPARK_CONF_DIR on each of them.
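To verify the file is being picked up, here is a minimal sketch, assuming you build the session yourself (Livy/Zeppelin normally create it for you):

import org.apache.spark.sql.SparkSession

// enableHiveSupport() only finds the metastore if hive-site.xml is on
// SPARK_CONF_DIR (or the driver classpath) of the node running the driver
val spark = SparkSession.builder()
  .appName("hive-site-check")
  .enableHiveSupport()
  .getOrCreate()

// should list the databases from the Hive metastore, not just "default"
spark.sql("SHOW DATABASES").show()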
I know - this is driving me crazy. It was working fine, and without me touching any of it (Zeppelin/Livy/Spark/YARN) it broke, with no errors in Spark, YARN, or Livy. I see a warning in the Livy log about hive-site.xml not being found. In the interpreter configuration I have tried setting
I want to know if this is possible. It works great for a single user, but in a multi-user environment we need more granular control over who can do what. The readers permission is not useful, because the user cannot execute anything or even change the display type.
Please share your experience with how you are using it in a multi-user setup.
Hi All,
I have the following code.
val ds = sparkSession.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)
  .option("subscribe", topicName)
  .option("checkpointLocation", hdfsCheckPointDir) // note: checkpointing is normally set on the writer
  .load()