All:
Just trying to get 0.9.0 to work, and I'm running into all sorts of issues.
Previously I had SPARK_MASTER set to yarn-client so the interpreter would use
my existing YARN cluster. That threw an error about yarn-client being
deprecated as of Spark 2.0, so I switched it to local.
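For reference, my understanding is that the non-deprecated way to point the
interpreter at YARN is to set the master and the deploy mode separately, e.g.
in the Spark interpreter settings, roughly like the sketch below (I have not
verified these exact property names on 0.9.0, so treat them as an assumption):

    spark.master             yarn
    spark.submit.deployMode  client
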
With the master set to local, I now get the error about the interpreter not
starting, and the following output in the note:
org.apache.zeppelin.interpreter.InterpreterException: java.io.IOException:
Fail to launch interpreter process: Interpreter launch command:
/opt/spark/spark-current/bin/spark-submit
  --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
  --driver-class-path ":/opt/zeppelin/zeppelin-current/interpreter/spark/*::/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/hadoop/hadoop-current/share/hadoop/common/sources/:/opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT-shaded.jar
  /opt/zeppelin/zeppelin-current/interpreter/zeppelin-interpreter-shaded-0.9.0-SNAPSHOT.jar:/opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar:/opt/hadoop/hadoop-current/etc/hadoop"
  --driver-java-options " -Dfile.encoding=UTF-8
    -Dlog4j.configuration='file:///opt/zeppelin/zeppelin-current/conf/log4j.properties'
    -Dlog4j.configurationFile='file:///opt/zeppelin/zeppelin-current/conf/log4j2.properties'
    -Dzeppelin.log.file='/opt/zeppelin/zeppelin-current/logs/zeppelin-interpreter-spark-dspc_demo-zeppelin-dspcnode11.dspc.incadencecorp.com.log'"
  --driver-memory 4G --executor-memory 6G
  --conf spark\.serializer\=org\.apache\.spark\.serializer\.KryoSerializer
  --conf spark\.executor\.memory\=1G
  --conf spark\.app\.name\=Zeppelin
  --conf spark\.executor\.instances\=5
  --conf spark\.master\=local\[\*\]
  --conf spark\.sql\.crossJoin\.enabled\=true
  --conf spark\.cores\.max\=10
  /opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar
  10.1.50.111 33591 "spark-dspc_demo" :
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/zeppelin/zeppelin-0.9.0-SNAPSHOT/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/spark/spark-2.4.3.bdp-1-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.open(RemoteInterpreter.java:134)
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:281)
  at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:412)
  at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:72)
  at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
  at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
  at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:180)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Fail to launch interpreter process: Interpreter
launch command: [same spark-submit command and SLF4J output as above; snipped]
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreterManagedProcess.start(RemoteInterpreterManagedProcess.java:126)
  at org.apache.zeppelin.interpreter.ManagedInterpreterGroup.getOrCreateInterpreterProcess(ManagedInterpreterGroup.java:67)
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getOrCreateInterpreterProcess(RemoteInterpreter.java:110)
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.internal_create(RemoteInterpreter.java:160)
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.open(RemoteInterpreter.java:131)
  ... 13 more
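
The SLF4J lines look like the usual duplicate-binding warning rather than the
actual failure, so I assume the real error only shows up in the interpreter-side
log named in -Dzeppelin.log.file above. In case it helps with diagnosis, that
log can be inspected and the same launch reproduced by hand, roughly as below
(paths and arguments copied from the command above; the remaining
--driver-*/--conf flags are elided for brevity, so this is only a sketch):

    # interpreter-side log named in -Dzeppelin.log.file
    tail -n 200 /opt/zeppelin/zeppelin-current/logs/zeppelin-interpreter-spark-dspc_demo-zeppelin-dspcnode11.dspc.incadencecorp.com.log

    # re-run the launcher by hand to capture stderr directly
    /opt/spark/spark-current/bin/spark-submit \
      --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer \
      --conf spark.master='local[*]' \
      /opt/zeppelin/zeppelin-current/interpreter/spark/spark-interpreter-0.9.0-SNAPSHOT.jar \
      10.1.50.111 33591 "spark-dspc_demo"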
This all worked with my 0.8.2 build. Am I missing something?
Note, I have to use a custom build because we have a PKI-based login mechanism
through nginx that passes authentication/authorization tokens in the request
headers, so I can't use the out-of-the-box build.
--
========= mailto:[email protected] ============
David W. Boyd
VP, Data Solutions
10432 Balls Ford, Suite 240
Manassas, VA 20109
office: +1-703-552-2862
cell: +1-703-402-7908
============== http://www.incadencecorp.com/ ============
ISO/IEC JTC1 SC42/WG2, editor ISO/IEC 20546, ISO/IEC 20547-1
Chair INCITS TG Big Data
Co-chair NIST Big Data Public Working Group Reference Architecture
First Robotic Mentor - FRC, FTC - www.iliterobotics.org
Board Member- USSTEM Foundation - www.usstem.org