Hi,
My goal is to get Zeppelin 0.6.0 working with a remote Spark 1.6.1 and
Cassandra 3.4.
The connection between Zeppelin and Spark works. Currently I'm stuck on a
Guava error, more specifically in the connection between Spark and
Cassandra:
Caused by: java.lang.IllegalStateException: Detected Guava [...]
It's not a configuration error but a well-known conflict between Guava 12
in Spark and Guava 16 in the Spark Cassandra driver. You can find some
workarounds on the Spark Cassandra Connector mailing list.
My workaround in Zeppelin is to load the Guava 16 jar through the Zeppelin
dependency loader (on the Spark interpreter configuration web page).
Easy work-around: in the $ZEPPELIN_HOME/interpreter/cassandra/lib folder, add
the guava-16.0.1.jar file and it's done.
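For anyone hitting the same thing, a minimal sketch of that jar-drop workaround
(the download URL and the restart step are assumptions; adjust paths to your install):

  cd $ZEPPELIN_HOME/interpreter/cassandra/lib
  wget https://repo1.maven.org/maven2/com/google/guava/guava/16.0.1/guava-16.0.1.jar
  # restart so the interpreter picks up the new jar
  $ZEPPELIN_HOME/bin/zeppelin-daemon.sh restart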
On Wed, Apr 13, 2016 at 1:37 PM, vincent gromakowski <
vincent.gromakow...@gmail.com> wrote:
> It's not a configuration error but a well-known conflict between Guava 12
> in Spark and Guava 16 in the Spark Cassandra driver. [...]
Rocking! Vincent's suggestion worked.
I tried a %dep in the notebook first; that did not work.
The $ZEPPELIN_HOME/interpreter/cassandra directory does not have a lib folder,
but is itself filled with jars, among others guava-16.0.1.jar. No changes seem
necessary there.
On Wed, Apr 13, 2016 at 1:37 PM, vincent gromakowski wrote:
> [...]
Ah yes, I forgot that you're using the 0.6.0 build. The Guava jar was
missing in the 0.5.5 release.
On Wed, Apr 13, 2016 at 2:03 PM, Sanne de Roever
wrote:
> Rocking! Vincent's suggestion worked.
>
> I tried a %dep in the notebook first; that did not work.
>
> The $ZEPPELIN_HOME/interpreter/cassandra directory [...]
Hi,
I have been struggling with the R interpreter / SparkR interpreter. Is the
command below the right one to build Zeppelin with the R interpreter / SparkR
interpreter?
mvn clean package -Pspark-1.6 -Phadoop-2.6 -Pyarn -Ppyspark -Psparkr
BR,
Patcharee
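For what it's worth, the profile list above looks right for SparkR; 0.6.0-era
builds also had a separate -Pr profile for the native R interpreter (this is an
assumption based on the build options of that time), so something along these
lines:

  mvn clean package -Pspark-1.6 -Phadoop-2.6 -Pyarn -Ppyspark -Psparkr -Pr -DskipTests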
Hi,
When I ran the R notebook example, I got these errors in the logs:
- Caused by: org.apache.zeppelin.interpreter.InterpreterException:
sparkr is not responding
- Caused by: org.apache.thrift.transport.TTransportException
I did not configure SPARK_HOME so far, and intended to use the embedded
Spark.
Is this a specific Docker decision or a Zeppelin-on-Docker decision? I am
curious about the amount of network traffic Zeppelin actually generates. I
could be wrong, but I made the assumption that most of the network traffic
with Zeppelin consists of results coming from the various endpoints (Spark,
JDBC, Elasticsearch, ...). [...]
It's a global decision on our SMACK stack platform, but maybe we will go for
Docker only for the applications (Spark clients), for devops reasons. For
Zeppelin I don't see the need (no devops).
On 13 Apr 2016 4:05 PM, "John Omernik" wrote:
> Is this a specific Docker decision or a Zeppelin-on-Docker decision? [...]
Can you post the full stacktrace you have (look also at the log file)?
Did you install R on your machine?
SPARK_HOME is optional.
On 13/04/16 15:39, Patcharee Thongtra wrote:
> Hi,
> When I ran the R notebook example, I got these errors in the logs:
> - Caused by: org.apache.zeppelin.interpreter.InterpreterException:
>   sparkr is not responding [...]
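For reference, if you do want to point Zeppelin at an external Spark rather
than the embedded one, a minimal zeppelin-env.sh sketch (the paths and master
URL here are assumptions; adjust to your install):

  # in $ZEPPELIN_HOME/conf/zeppelin-env.sh
  export SPARK_HOME=/opt/spark-1.6.1
  export MASTER=spark://spark-master:7077   # or yarn-client, local[*], ...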
Hi,
I'm trying to build/install Zeppelin 0.6.0 (version 0.5.6 also has the
same symptoms) on a new CDH cluster running Hadoop 2.6.0-cdh5.7.0 and
Spark 1.6.0, but I'm getting this error when I use SPARK_HOME to point
to the /opt/cloudera/parcels/CDH/lib/spark directory in zeppelin-env.sh:
java.lang.NoSuchMethodException [...]
Hi Scott,
Vendor-repo would be the way to go. It is possible that in this case CDH's
Spark 1.6 has some incompatible API changes, though I couldn't find any yet.
Do you have more from the logs on that NoSuchMethodException?
________________________________
From: Scott Zelenka
Sent: Wednesday, April 13, 2016 [...]
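A hedged sketch of a build against the Cloudera artifacts (the -Pvendor-repo
profile and the exact version properties are assumptions based on the CDH
release mentioned above):

  mvn clean package -Pspark-1.6 -Dspark.version=1.6.0 \
    -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.7.0 \
    -Pvendor-repo -Pyarn -Ppyspark -DskipTests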
Indeed, one of the notebooks' JSON files was corrupt... not sure how it
happened, but it wasn't an important one, so I just deleted it.
Thanks!!
--
Chris Miller
On Mon, Apr 11, 2016 at 9:06 PM, Hyung Sung Shim wrote:
> Hello.
> It seems like one of your notebook JSON files has a problem.
> Could you [...]
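A quick way to find which note is the corrupt one (a sketch; it assumes the
default local notebook storage under $ZEPPELIN_HOME/notebook and a python on
the PATH):

  for f in $ZEPPELIN_HOME/notebook/*/note.json; do
    python -m json.tool "$f" > /dev/null 2>&1 || echo "corrupt: $f"
  done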