Hi Richard,

Is it possible for you to upgrade from 0.6.2 to 0.7.1? 0.6.0 has some critical issues.
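If you want a quick way to confirm which build the notebook is actually running against before and after the switch, a paragraph like the rough sketch below should do it. This is only an illustration: it assumes the default SparkContext `sc` that Zeppelin's %spark interpreter injects into Scala paragraphs.

  // Rough sketch for a %spark (Scala) paragraph in Zeppelin.
  // Assumes the injected SparkContext `sc`; prints the Spark and Scala
  // versions the interpreter is actually running, as a sanity check
  // before and after switching Zeppelin builds.
  println("Spark version: " + sc.version)
  println("Scala version: " + scala.util.Properties.versionNumberString)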
Richard Xin <richardxin...@yahoo.com> wrote on Thu, Jun 15, 2017 at 5:02 AM:

> and I also saw a stack overflow issue:
>
> Caused by: java.lang.StackOverflowError
>     at scala.tools.nsc.transform.LambdaLift$$anon$1.apply(LambdaLift.scala:30)
>     at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:110)
>     at scala.tools.nsc.transform.LambdaLift$$anon$1.apply(LambdaLift.scala:30)
>     at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:110)
>     at scala.tools.nsc.transform.LambdaLift$$anon$1.apply(LambdaLift.scala:30)
>     at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:110)
>     ... (the same two frames repeat for the remainder of the trace)
>
> On Wednesday, June 14, 2017, 12:38:30 PM PDT, Richard Xin <richardxin...@yahoo.com> wrote:
>
> It has happened several times already; it worked again after restarting Zeppelin.
>
> I consistently see a similar error when it dies:
>
> ERROR [2017-06-14 17:59:59,705] ({pool-2-thread-2} SparkInterpreter.java[putLatestVarInResourcePool]:1253) - java.lang.NullPointerException
>     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
>     at org.apache.zeppelin.spark.SparkInterpreter.getLastObject(SparkInterpreter.java:1114)
>     at org.apache.zeppelin.spark.SparkInterpreter.putLatestVarInResourcePool(SparkInterpreter.java:1249)
>     at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:1232)
>     at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1144)
>     at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1137)
>     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:95)
>     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:490)
>     at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>     at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)