Hi there,

On HDP 2.4, I've installed Zeppelin 0.6.1 with the Spark interpreter built
against Scala 2.10. (The Spark version is 1.6.1.)

All other interpreters work well, but the Spark interpreter fails. The error
in the log is:

> ERROR [2016-12-07 15:57:40,512] ({pool-2-thread-2} Job.java[run]:189) - Job failed
> java.lang.IncompatibleClassChangeError: Implementing class

(The full stack trace is here:
https://gist.github.com/vando/50bd0dbb970d0c2bd2fe13a6344109b8.)

In the zeppelin-env.sh file, the environment variables are:

> export MASTER=yarn-client
> export HADOOP_CONF_DIR="/etc/hadoop/conf"
> export ZEPPELIN_JAVA_OPTS="-Dhdp.version=2.4.2.0-258 -Dspark.yarn.queue=default"
> export SPARK_HOME="/usr/hdp/current/spark-client"
> export PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip"
> export SPARK_YARN_USER_ENV="PYTHONPATH=${PYTHONPATH}"
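
In case it helps with the diagnosis, here is how the Spark client's Spark/Scala
versions and the py4j zip referenced in PYTHONPATH can be double-checked on the
node (just a quick sketch, using the same HDP paths as above):

    # Should report Spark 1.6.1 built with Scala 2.10.x
    /usr/hdp/current/spark-client/bin/spark-submit --version

    # Should list py4j-0.8.2.1-src.zip (the file hard-coded in PYTHONPATH)
    ls /usr/hdp/current/spark-client/python/lib/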

Do you have any idea how to fix this error?

Thanks in advance.

Best,
