Ahh yes, I forgot that you're using the 0.6.0 build. The Guava jar was missing in the 0.5.5 release.
On Wed, Apr 13, 2016 at 2:03 PM, Sanne de Roever <sanne.de.roe...@gmail.com> wrote:
> Rocking! Vincent's suggestion worked.
>
> I tried a %dep in the notebook first; this did not work.
>
> The $ZEPPELIN_HOME/interpreter/cassandra directory does not have a lib folder, but
> is filled with jars itself, among others guava-16.0.1.jar. No changes seem
> necessary there.
>
> On Wed, Apr 13, 2016 at 1:37 PM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> It's not a configuration error but a well-known conflict between Guava 12
>> in Spark and Guava 16 in the Spark Cassandra driver. You can find some
>> workarounds on the Spark Cassandra mailing list.
>>
>> My workaround in Zeppelin is to load the Guava 16 lib in the Zeppelin
>> dependency loader (Spark interpreter config web page). It's a big conflict
>> that will probably be resolved in Spark 2.0.
>>
>> 2016-04-13 13:32 GMT+02:00 Sanne de Roever <sanne.de.roe...@gmail.com>:
>>
>>> Hi,
>>>
>>> My goal is to get Zeppelin 0.6.0 working with a remote Spark 1.6.1 and
>>> Cassandra 3.4.
>>>
>>> The connection between Zeppelin and Spark works. Currently I'm stuck on
>>> a Guava error, more specifically in the connection between Spark and
>>> Cassandra:
>>>
>>> Caused by: java.lang.IllegalStateException: Detected Guava issue #1635
>>> which indicates that a version of Guava less than 16.01 is in use. This
>>> introduces codec resolution issues and potentially other incompatibility
>>> issues in the driver. Please upgrade to Guava 16.01 or later.
>>> at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
>>> at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
>>> at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:67)
>>>
>>> A related issue has appeared earlier in Zeppelin:
>>> https://issues.apache.org/jira/browse/ZEPPELIN-620
>>>
>>> I'm configuring the Cassandra driver by setting the spark.jars property
>>> in spark-defaults.conf:
>>>
>>> spark.jars
>>> /u01/app/zeppelin/spark-cassandra-libs/spark-core_2.10-1.6.1.jar,/u01/app/zeppelin/spark-cassandra-libs/joda-convert-1.8.1.jar,/u01/app/zeppelin/spark-cassandra-libs/cassandra-thrift-3.4.jar,/u01/app/zeppelin/spark-cassandra-libs/joda-time-2.9.3.jar,/u01/app/zeppelin/spark-cassandra-libs/spark-cassandra-connector-java_2.10-1.6.0-M1.jar,/u01/app/zeppelin/spark-cassandra-libs/spark-cassandra-connector-1.6.0-M1-s_2.10.jar,/u01/app/zeppelin/spark-cassandra-libs/guava-19.0.jar,/u01/app/zeppelin/spark-cassandra-libs/cassandra-driver-core-3.0.0.jar
>>>
>>> (There are no external connections in the data center.)
>>>
>>> Is this a configuration error?
>>>
>>> Cheers,
>>>
>>> Sanne
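
For reference, a minimal sketch of the two workarounds discussed above. It assumes the Guava 16.0.1 artifact (com.google.guava:guava:16.0.1) is resolvable from Maven Central and that the jar paths match the installation described in the thread; note that the %dep variant was the one that did not work in Sanne's setup, while loading Guava 16 through the interpreter dependency settings did.

%dep paragraph (run in a notebook cell before the Spark interpreter starts):

    %dep
    z.reset()                                  // clear previously loaded artifacts
    z.load("com.google.guava:guava:16.0.1")    // put Guava 16 on the interpreter classpath

spark-defaults.conf alternative (a commonly cited workaround: push the newer Guava jar to the front of the driver and executor classpaths):

    spark.driver.extraClassPath   /u01/app/zeppelin/spark-cassandra-libs/guava-19.0.jar
    spark.executor.extraClassPath /u01/app/zeppelin/spark-cassandra-libs/guava-19.0.jar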