Hi Stefano,
Can you tell me which Spark version you are using?
If you have set the CDH Spark as SPARK_HOME, could you confirm that you
installed the spark-pyspark package?
Let me know.
On Wed, Sep 28, 2016 at 12:06 AM, Stefano Ortolani wrote:
> Hi,
>
> I have been using Zeppelin for quite a while without issues
Mich, thanks for the suggestion. I tried your settings, but they did not
solve the problem.
I'm running in yarn-client mode, not local or standalone, so the resources
in the Spark cluster (which is very large) should not be an issue. Right?
The problem seems to be that Zeppelin is not submitting
Hi Mark,
you may want to check the Spark interpreter settings. In the most recent
version of Zeppelin you can set it to shared, isolated, or scoped.
Shared: a single interpreter and SparkContext for all notebooks (hence the queuing you see)
Isolated: every notebook has its own interpreter process and SparkContext
Scoped: every notebook has its own interpreter instance, but they run in the same process and share a single SparkContext
That was it! Thanks so much, Andreas. I can't believe I had overlooked that
drop-down in the interpreter settings. Mohit and Mich probably assumed I
had tried that already.
Thanks everyone.
Mark
On Thu, Oct 6, 2016 at 8:35 AM, Andreas Lang wrote:
> Hi Mark,
>
> you may want to check the spark int
Hello again,
On "longer" running jobs (I'm using yarn-client mode), I sometimes get RPC
timeouts. Seems like Zeppelin is losing connectivity with the Spark
cluster. I can deal with that.
But my notebook has sections stuck in the "Cancel" state, and I can't get
them out. When I re-click on cancel,
Actually, it's stuck in the Running state. Trying to cancel it causes a
"No active SparkContext" error to appear in the log. Seems like a bug.
On Thu, Oct 6, 2016 at 9:06 AM, Mark Libucha wrote:
> Hello again,
>
> On "longer" running jobs (I'm using yarn-client mode), I sometimes get RPC
> timeouts. S
I get this error while running a simple Spark SQL query. I only get it when
running the entire notebook at once. If I run each paragraph one by one, it
disappears!
Caused by: java.lang.NoSuchFieldException: MODULE$
at java.lang.Class.getField(Class.java:1703)
at scala.Enumeration.readResolve(Enumeration.
I forgot to mention another scenario where I get the same error. It happens
when using `org.apache.spark.mllib.evaluation.MulticlassMetrics`.
It fails on the following call:
val predictionMatrix = new MulticlassMetrics(predictionAndLabels)
predictionMatrix.fMeasure(3.0) // Fails here
ERROR [2016-10-06 11:58:5
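[Editor's note: for context, `fMeasure(label)` on `MulticlassMetrics` reports the per-class F-measure (F1 by default). A minimal plain-Python sketch of the same calculation, no Spark required; the `f_measure` helper and the sample `pairs` data are hypothetical, for illustration only:]

```python
def f_measure(pairs, label, beta=1.0):
    """Per-class F-measure from (prediction, label) pairs,
    mirroring what MulticlassMetrics.fMeasure(label) reports."""
    tp = sum(1 for p, l in pairs if p == label and l == label)  # true positives
    fp = sum(1 for p, l in pairs if p == label and l != label)  # false positives
    fn = sum(1 for p, l in pairs if p != label and l == label)  # false negatives
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical prediction/label pairs: one correct "3.0", one false
# positive, one miss, and one unrelated correct prediction.
pairs = [(3.0, 3.0), (3.0, 1.0), (1.0, 3.0), (1.0, 1.0)]
print(f_measure(pairs, 3.0))  # 0.5 (precision and recall are both 0.5)
```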
Hi community,
When I build ggplots and display them within Zeppelin, they always display
at 504x504 pixels. When I increase the paragraph width, it just stretches
the image.
I've tried adding {"imageWidth": "1000px"}, but that didn't actually help.
Any advice?
Thanks,
Kevin