Hello all,
I am trying to set up a Spark interpreter in Zeppelin that runs against YARN. We
have both HDP and standalone Spark running.
Standalone Spark works fine, but the interpreter fails when its master is set
to yarn (from the YARN log):
Error: Could not find or load main class
org.apache.spark.deploy.yarn.ExecutorL
Thanks. I can run spark-submit, the PySpark console, and spark-shell
correctly. I'll do some more digging.
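One sanity check along those lines is to submit a trivial job to YARN with the exact Spark build Zeppelin points at; the paths below are assumptions for a typical HDP layout, not taken from this thread.

```shell
# Assumed HDP-style paths; adjust to your install.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_HOME=/usr/hdp/current/spark-client

# If this fails with the same "Could not find or load main class" error,
# the problem is in the Spark/YARN setup itself rather than in Zeppelin.
"$SPARK_HOME"/bin/spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/lib/spark-examples*.jar 10
```

If spark-submit succeeds here but the Zeppelin interpreter still fails, the two are likely picking up different SPARK_HOME or HADOOP_CONF_DIR values.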
On Sun, Dec 11, 2016 at 4:43 PM, Jianfeng (Jeff) Zhang <
jzh...@hortonworks.com> wrote:
> Can you run spark-shell correctly?
> If yes, then I suspect you may have set them on the Zeppelin side.
Can you check the interpreter log? Zeppelin has two kinds of logs: one for the
Zeppelin server (which you pasted above), and another for the interpreter.
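To locate that second log: Zeppelin writes its logs under ZEPPELIN_LOG_DIR, by default the logs/ directory in the Zeppelin home. The file-name patterns below are the usual defaults; exact names depend on your user and host.

```shell
# Server log vs. interpreter log (default naming pattern):
#   logs/zeppelin-<user>-<host>.log                    -> zeppelin server
#   logs/zeppelin-interpreter-spark-<user>-<host>.log  -> spark interpreter
tail -n 100 logs/zeppelin-interpreter-spark-*.log
```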
Russell Jurney wrote on Wed, Dec 14, 2016 at 8:22 AM:
> Thanks. I can run spark-submit, the PySpark console, and spark-shell
> correctly. I'll do some more digging.
Hello, Saif.
I believe the pre-built Spark binary includes YARN support.
Please refer to [0] for the YARN settings; if that doesn't work, please share
your *conf/zeppelin-env.sh* and logs.
[0]
http://zeppelin.apache.org/docs/0.7.0-SNAPSHOT/install/spark_cluster_mode.html#4-configure-spark-interpreter-in-zeppelin
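For reference, the linked page boils down to pointing Zeppelin at an existing Spark build and Hadoop configuration. A minimal *conf/zeppelin-env.sh* sketch, assuming HDP-style paths (adjust for your install):

```shell
# Minimal YARN setup for Zeppelin's Spark interpreter (paths are assumptions).
export SPARK_HOME=/usr/hdp/current/spark-client
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

With these exported, set the interpreter's master property to yarn-client in the Zeppelin interpreter settings and restart the interpreter.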
Hi Saif,
HDP works well for me. What versions of HDP and Zeppelin do you use? And
could you paste the full interpreter log?
Hyung Sung Shim wrote on Wed, Dec 14, 2016 at 8:36 AM:
> Hello, Saif.
>
> I believe the pre-built Spark binary includes YARN support.
> Please refer to [0] for the YARN settings; if that doesn't work, please
> share your *conf/zeppelin-env.sh* and logs.