See https://issues.apache.org/jira/browse/ZEPPELIN-324 which was just merged. We switched the Karma JS unit tests to port 9002 since it is less commonly used than the previous default of 8080.
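One quick way to check whether something is already listening on port 9002 — a sketch; the bash-only `/dev/tcp` probe below is an assumption about your shell, and `lsof` or `netstat` work just as well:

```shell
# Probe localhost:9002 to see whether anything is already listening there.
# bash-specific: the /dev/tcp redirection only succeeds if a listener accepts.
if (exec 3<>/dev/tcp/127.0.0.1/9002) 2>/dev/null; then
  echo "port 9002 is in use"
else
  echo "port 9002 looks free"
fi
```

On systems with lsof installed, `lsof -iTCP:9002 -sTCP:LISTEN` additionally shows which process holds the port.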
If you're getting this error, can you check that you don't have another service already using port 9002?

On Thu, Oct 1, 2015 at 3:09 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> Ignoring this exception. As graphs are being rendered.
>
> On Thu, Oct 1, 2015 at 11:47 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:
>
>> I have a spark 1.4.1 and Zeppelin from GitHub built using
>>
>> mvn clean package -Pspark-1.4 -Dspark.version=1.4.1 -Dhadoop.version=2.7.0 -Phadoop-2.6 -Pyarn -DskipTests
>>
>> and i see this in logs
>>
>> BlockManagerId(2, datanode-2-9429.phx01.dev.ebayc3.com, 60713)
>> INFO [2015-10-01 11:43:50,967] ({sparkDriver-akka.actor.default-dispatcher-18} Logging.scala[logInfo]:59) - Registering block manager datanode-1-8428.phx01.dev.ebayc3.com:36087 with 265.1 MB RAM, BlockManagerId(1, datanode-1-8428.phx01.dev.ebayc3.com, 36087)
>> INFO [2015-10-01 11:43:51,062] ({pool-2-thread-3} Logging.scala[logInfo]:59) - Initializing execution hive, version 0.13.1
>> INFO [2015-10-01 11:43:51,498] ({pool-2-thread-3} HiveMetaStoreClient.java[open]:297) - Trying to connect to metastore with URI thrift://hive-metastore-8611.phx01.dev.ebayc3.com:9083
>> INFO [2015-10-01 11:43:51,954] ({pool-2-thread-3} HiveMetaStoreClient.java[open]:385) - Connected to metastore.
>> WARN [2015-10-01 11:43:52,025] ({pool-2-thread-3} SparkInterpreter.java[getSQLContext]:216) - Can't create HiveContext. Fallback to SQLContext
>> java.lang.reflect.InvocationTargetException
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>     at org.apache.zeppelin.spark.SparkInterpreter.getSQLContext(SparkInterpreter.java:211)
>>     at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:476)
>>     at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
>>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
>>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
>>     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:276)
>>     at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
>>     at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
>>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
>>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:353)
>>     at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:116)
>>     at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
>>     at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
>>     at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
>>     ... 19 more
>> Caused by: java.lang.ClassNotFoundException: org.apache.tez.dag.api.SessionNotRunning
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>     ... 24 more
>> INFO [2015-10-01 11:43:59,552] ({pool-2-thread-3} Logging.scala[logInfo]:59) - ensureFreeSpace(211776) called with curMem=0, maxMem=515553361
>> INFO [2015-10-01 11:43:59,555] ({pool-2-thread-3} Logging.scala[logInfo]:59) - Block broadcast_0 stored as values in memory (estimated size 206.8 KB, free 491.5 MB)
>>
>> Hadoop Version: 2.7.x
>> Spark: 1.4.1
>>
>> Hadoop installation is done using Ambari.
>>
>> --
>> Deepak
>
> --
> Deepak
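On the HiveContext fallback in the quoted logs: `java.lang.ClassNotFoundException: org.apache.tez.dag.api.SessionNotRunning` typically means the hive-site.xml that Spark picked up requests the Tez execution engine (Ambari-managed installs often set this) while no Tez jars are on the Zeppelin/Spark classpath. A diagnostic sketch, assuming the stock Ambari config path; if the engine turns out to be `tez`, switching that copy to `mr`, or putting the Tez jars on the interpreter classpath, is a common workaround:

```shell
# Path to the hive-site.xml Spark/Zeppelin actually reads; the Ambari
# default location below is an assumption, adjust for your install.
HIVE_SITE=/etc/hive/conf/hive-site.xml
# Print the configured execution engine; "tez" here, combined with no Tez
# jars on the classpath, matches the NoClassDefFoundError in the trace.
grep -A1 hive.execution.engine "$HIVE_SITE" 2>/dev/null \
  || echo "hive.execution.engine not set (Hive defaults to mr)"
```

Note the exception is non-fatal here: as the logs show, Zeppelin catches it and falls back to a plain SQLContext, which matches the "Ignoring this exception" follow-up above.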