I'm using Zeppelin with DSE 4.7 with zero issues. One idea: try building Zeppelin with:

    mvn clean package -Pcassandra-spark-1.1 -Dhadoop.version=2.2.0 -Phadoop-2.2 -DskipTests
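The root cause in the trace below is that the Akka jar on the interpreter's classpath (2.3.4, pulled in by the Zeppelin build) does not match the Akka config version the Spark/DSE side provides (2.2.3). A minimal sketch for confirming which Akka versions each side ships, assuming default install paths (ZEPPELIN_HOME and DSE_HOME are placeholders, adjust them to your machine):

```shell
# Quick diagnostic sketch, not a definitive fix: list the akka-actor jars
# that Zeppelin and DSE each bundle. The file names carry the Akka
# version, e.g. akka-actor_2.10-2.2.3.jar, so a 2.3.x jar on the
# Zeppelin side next to a 2.2.x jar on the DSE side confirms the
# mismatch reported in the stack trace.
ZEPPELIN_HOME="${ZEPPELIN_HOME:-/opt/zeppelin}"   # assumption: adjust
DSE_HOME="${DSE_HOME:-/usr/share/dse}"            # assumption: adjust

for dir in "$ZEPPELIN_HOME" "$DSE_HOME"; do
  echo "Akka jars under $dir:"
  # suppress errors if the directory does not exist on this host
  find "$dir" -name 'akka-actor*.jar' 2>/dev/null
done
```

If the two sides report different Akka minor versions, rebuilding Zeppelin against the matching Spark profile (as above) is the fix.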
And retest. Your issue is purely a jar version mismatch.

On Wed, Jun 24, 2015 at 4:22 PM, Gabriel Ciuloaica <gciuloa...@gmail.com> wrote:

> Hi,
>
> I have spent most of the day today trying to make Zeppelin connect to an
> existing Spark master. I'm using the Datastax solution.
> I did a custom build with Spark 1.1.0 and Hadoop 1.0.4. It did not work:
> I got connection issues to the Spark master, and the stack trace showed
> Thrift-related exceptions. Telnet to the Spark master node on port 7077
> works just fine, and I have been using this Spark cluster in production
> for some time.
> I also tried to use the Spark jars from the DSE distribution, which seem
> to be customised by Datastax (they are versioned 1.1.0.4), but I was not
> able to compile Zeppelin against that version (compilation errors).
> After that, I recompiled with Spark 1.1.1, but this time I'm getting:
>
> 5/06/24 17:05:19 ERROR thrift.ProcessFunction: Internal error processing getProgress
> org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException:
> akka.ConfigurationException: Akka JAR version [2.3.4] does not match the provided config version [2.2.3]
>     at org.apache.zeppelin.interpreter.ClassloaderInterpreter.getProgress(ClassloaderInterpreter.java:131)
>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getProgress(LazyOpenInterpreter.java:110)
>     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.getProgress(RemoteInterpreterServer.java:299)
>     at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:938)
>     at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$getProgress.getResult(RemoteInterpreterService.java:923)
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:744)
> Caused by: org.apache.zeppelin.interpreter.InterpreterException: akka.ConfigurationException:
> Akka JAR version [2.3.4] does not match the provided config version [2.2.3]
>     at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:75)
>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
>     at org.apache.zeppelin.spark.SparkSqlInterpreter.getSparkInterpreter(SparkSqlInterpreter.java:101)
>     at org.apache.zeppelin.spark.SparkSqlInterpreter.getProgress(SparkSqlInterpreter.java:157)
>     at org.apache.zeppelin.interpreter.ClassloaderInterpreter.getProgress(ClassloaderInterpreter.java:129)
>     ... 10 more
> Caused by: akka.ConfigurationException: Akka JAR version [2.3.4] does not match the provided config version [2.2.3]
>     at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:209)
>     at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504)
>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
>     at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
>     at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>     at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>     at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1765)
>     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1756)
>     at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:222)
>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:240)
>     at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:276)
>     at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:149)
>     at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:398)
>     at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:73)
>     ... 14 more
>
> Has anybody succeeded in using Zeppelin with DSE?
>
> Thanks,
> --
> Gabriel Ciuloaica