Yes.

To reproduce: add com.databricks:spark-avro_2.11:3.2.0 as a dependency to a 
Spark interpreter, then use that interpreter to run any code in a notebook.
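
Concretely, the steps look like this (assuming a stock 0.8.x install, with the 
artifact added through the interpreter’s Dependencies section):

    1. Interpreters page -> spark -> edit -> Dependencies:
         artifact: com.databricks:spark-avro_2.11:3.2.0
    2. Restart the interpreter and run any paragraph with it, e.g.:

         %spark
         sc.version

The interpreter then fails to start with the NumberFormatException quoted below.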

For the internal library that was causing it, I went through and excluded each 
of that library’s dependencies one at a time. It turned out to be pulling in 
org.scala-lang:scala-library, and excluding that seems to have resolved the 
issue for our internal package.
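
In the interpreter’s Dependencies section that means putting 
org.scala-lang:scala-library in the exclude field next to the artifact. For 
anyone managing dependencies in a build instead, the equivalent sbt exclusion 
would look roughly like this (com.placeiq %% internal-lib is a placeholder, 
not our real artifact):

    // sbt: drop the transitive scala-library pulled in by a dependency
    libraryDependencies += ("com.placeiq" %% "internal-lib" % "1.0.0")
      .exclude("org.scala-lang", "scala-library")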



PAUL BRENNER
Sr. Data Scientist
pbren...@placeiq.com | (217) 390-3033 | www.placeiq.com
twitter @placeiq linkedin /placeiq
On Apr 20, 2019, 3:30 AM -0400, Jeff Zhang <zjf...@gmail.com>, wrote:
> Hi Paul,
>
> Could you describe how to reproduce it?
>
> > Paul Brenner <pbren...@placeiq.com> wrote on Saturday, April 20, 2019 at 12:30 AM:
> > > I’m trying to move my company up from Zeppelin 0.8 to 0.8.1 or 0.8.2. 
> > > However, we find that when we include certain dependencies on the newer 
> > > versions of Zeppelin, Spark interpreters won’t start and instead throw 
> > > the error I’ll paste below. This is with zeppelin.spark.useNew set to 
> > > true; setting it to false just gives a NullPointerException instead. One 
> > > example of a dependency that causes this is 
> > > com.databricks:spark-avro_2.11:3.2.0, but it also happens with some 
> > > internal packages of ours in which we can’t find anything that should 
> > > be causing this conflict.
> > >
> > > I found someone with a similar problem in ZEPPELIN-4074, but no one 
> > > seems to have any idea what to do. Any ideas? It sounds like 
> > > isJavaAtLeast was fixed to work with JDK 9 in Scala 2.12 (see here), 
> > > but who invited JDK 9 to this party? I certainly didn’t.
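> > >
> > > For reference, the old check looks roughly like this (my simplified 
> > > reconstruction of the pre-2.11.11 scala.util.Properties code, not the 
> > > exact source):
> > >
> > >     // Old isJavaAtLeast helper: it assumes a dotted "major.minor"
> > >     // version string. Newer Scala code calls isJavaAtLeast("9"), and
> > >     // "9" has no dot, so an old scala-library on the classpath throws.
> > >     def parts(x: String): (String, String) = {
> > >       val i = x.indexOf('.')
> > >       if (i < 0) throw new NumberFormatException("Not a version: " + x)
> > >       (x.substring(0, i), x.substring(i + 1))
> > >     }
> > >     parts("1.8") // ("1", "8")
> > >     parts("9")   // throws NumberFormatException: Not a version: 9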
> > >
> > > Zeppelin 0.8 is really struggling for us, so we are desperate to 
> > > upgrade, but this is holding us back. We would really appreciate any 
> > > help at all!
> > >
> > > java.lang.NumberFormatException: Not a version: 9
> > >     at scala.util.PropertiesTrait$class.parts$1(Properties.scala:184)
> > >     at scala.util.PropertiesTrait$class.isJavaAtLeast(Properties.scala:187)
> > >     at scala.util.Properties$.isJavaAtLeast(Properties.scala:17)
> > >     at scala.tools.util.PathResolverBase$Calculated$.javaBootClasspath(PathResolver.scala:276)
> > >     at scala.tools.util.PathResolverBase$Calculated$.basis(PathResolver.scala:283)
> > >     at scala.tools.util.PathResolverBase$Calculated$.containers$lzycompute(PathResolver.scala:293)
> > >     at scala.tools.util.PathResolverBase$Calculated$.containers(PathResolver.scala:293)
> > >     at scala.tools.util.PathResolverBase.containers(PathResolver.scala:309)
> > >     at scala.tools.util.PathResolver.computeResult(PathResolver.scala:341)
> > >     at scala.tools.util.PathResolver.computeResult(PathResolver.scala:332)
> > >     at scala.tools.util.PathResolverBase.result(PathResolver.scala:314)
> > >     at scala.tools.nsc.backend.JavaPlatform$class.classPath(JavaPlatform.scala:28)
> > >     at scala.tools.nsc.Global$GlobalPlatform.classPath(Global.scala:115)
> > >     at scala.tools.nsc.Global.scala$tools$nsc$Global$$recursiveClassPath(Global.scala:131)
> > >     at scala.tools.nsc.Global$GlobalMirror.rootLoader(Global.scala:64)
> > >     at scala.reflect.internal.Mirrors$Roots$RootClass.<init>(Mirrors.scala:307)
> > >     at scala.reflect.internal.Mirrors$Roots.RootClass$lzycompute(Mirrors.scala:321)
> > >     at scala.reflect.internal.Mirrors$Roots.RootClass(Mirrors.scala:321)
> > >     at scala.reflect.internal.Mirrors$Roots$EmptyPackageClass.<init>(Mirrors.scala:330)
> > >     at scala.reflect.internal.Mirrors$Roots.EmptyPackageClass$lzycompute(Mirrors.scala:336)
> > >     at scala.reflect.internal.Mirrors$Roots.EmptyPackageClass(Mirrors.scala:336)
> > >     at scala.reflect.internal.Mirrors$Roots.EmptyPackageClass(Mirrors.scala:276)
> > >     at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:250)
> > >     at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:73)
> > >     at scala.tools.nsc.Global.rootMirror(Global.scala:71)
> > >     at scala.tools.nsc.Global.rootMirror(Global.scala:39)
> > >     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
> > >     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
> > >     at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1390)
> > >     at scala.tools.nsc.Global$Run.<init>(Global.scala:1242)
> > >     at scala.tools.nsc.interpreter.IMain.scala$tools$nsc$interpreter$IMain$$_initialize(IMain.scala:139)
> > >     at scala.tools.nsc.interpreter.IMain.initializeSynchronous(IMain.scala:161)
> > >     at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:85)
> > >     at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
> > >     at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
> > >     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
> > >     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
> > >     at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
> > >     at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
> > >     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> > >     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> > >     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> > >     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> > >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> > >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> > >     at java.lang.Thread.run(Thread.java:748)
> > >
> > >
> > >
> > > PAUL BRENNER
> > > Sr. Data Scientist
> > > pbren...@placeiq.com | (217) 390-3033 | www.placeiq.com
> > > twitter @placeiq linkedin /placeiq
>
>
> --
> Best Regards
>
> Jeff Zhang
