Hi Michael-
I'm not sure what values I'm supposed to be comparing. The
interpreter is running; saving and restarting it still gives the same result.
On 10/06/2017 10:41 AM, Michael Segel wrote:
What do you see when you check the Spark interpreter? Something
with %spark, or %spark.sql? (Sorry, going from memory.) It may
also have to do with the Spark interpreter not running, so if you
manually restart the interpreter and then re-run the notebook, it
should work.
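For example, something as small as this (just a sanity check I'm sketching
from memory, assuming the default %spark binding) should come back with a
result once the interpreter is up:

%spark
// If the SparkContext was created, this prints the Spark version.
println(sc.version)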
HTH
On Oct 6, 2017, at 9:35 AM, Terry Healy <the...@bnl.gov> wrote:
Using Zeppelin 0.7.3, Spark 2.1.0-mapr-1703 / Scala 2.11.8
I had previously run the demo and successfully set up the MongoDB and
JDBC interpreters for Impala under v0.7.2. Since upgrading to 0.7.3,
everything broke. I am down to a complete re-install (several, in
fact) and get a response like the one below for almost everything I
try (focusing just on %spark for now). I apparently have something
very basic wrong, but I'll be damned if I can find it. The same
example works fine in spark-shell.
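For illustration, even a trivial paragraph along these lines (my own
stand-in, not the actual tutorial code) is the kind of thing that hits the
error below under %spark, while the same Scala runs fine in spark-shell:

%spark
// Build a tiny RDD and count it; this should not be able to fail
// if the SparkContext comes up properly.
val nums = sc.parallelize(1 to 100)
println(s"count = ${nums.count()}")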
Any suggestions for a new guy are very much appreciated.
I found [ZEPPELIN-2475] <https://issues.apache.org/jira/browse/ZEPPELIN-2475>
and [ZEPPELIN-1560] <https://issues.apache.org/jira/browse/ZEPPELIN-1560>,
which seem to be the same, or similar, but I did not understand what
to change where.
This is from "Zeppelin Tutorial/Basic Features (Spark)".
java.lang.NullPointerException
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
    at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:398)
    at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:387)
    at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:843)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)