Hello, Saif.

I believe the pre-built Spark binary includes YARN support.
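One quick way to confirm this on your installation is to check whether the YARN launcher class is actually present in the Spark assembly jar. A sketch (the jar path is an assumption; in Spark 1.x pre-built distributions the assembly usually lives under lib/, and the exact filename depends on your Spark and Hadoop versions):

```shell
# List the assembly jar contents and look for the YARN ExecutorLauncher class.
# Adjust SPARK_HOME and the jar pattern to match your installation.
jar tf "$SPARK_HOME"/lib/spark-assembly-*.jar | grep 'deploy/yarn/ExecutorLauncher'
```

If nothing is printed, the "Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher" error would be expected, since that build of Spark lacks the YARN classes.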
Please refer to [0] for the YARN settings; if that doesn't work, please share
your *conf/zeppelin-env.sh* and logs.

[0]
http://zeppelin.apache.org/docs/0.7.0-SNAPSHOT/install/spark_cluster_mode.html#4-configure-spark-interpreter-in-zeppelin
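In short, the settings in [0] come down to pointing Zeppelin at your Spark build and your Hadoop configuration. A minimal sketch of *conf/zeppelin-env.sh*, assuming typical HDP paths (adjust both paths to your environment):

```shell
# conf/zeppelin-env.sh -- minimal YARN setup (paths below are examples)
export SPARK_HOME=/usr/hdp/current/spark-client   # assumed HDP client layout
export HADOOP_CONF_DIR=/etc/hadoop/conf           # directory containing yarn-site.xml
```

Then set the interpreter's master property to yarn-client in the Spark interpreter settings in the Zeppelin UI and restart the interpreter.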

2016-12-14 7:32 GMT+09:00 <saif.a.ell...@wellsfargo.com>:

> Hello all,
>
> I am trying to set up a Spark interpreter that hits YARN on Zeppelin. We
> have both HDP and standalone Spark running.
>
> Standalone Spark works fine, but it fails if the interpreter master is set
> to yarn (YARN log):
>
> Error: Could not find or load main class org.apache.spark.deploy.yarn.
> ExecutorLauncher
>
> HDP Spark does not work at all when used within Zeppelin (zeppelin
> interpreter log) :
>
> ERROR [2016-12-13 14:25:54,971] ({pool-2-thread-2}
> Utils.java[instantiateClass]:80) -
> java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>
> One of my questions is: does the pre-built Spark binary distribution include
> YARN? I want to confirm whether the first error is caused by missing YARN
> libraries in Spark.
>
> Other than that, I am quite at a loss on this one; any help is appreciated.
>
> Saif
>
>
>
