Everything worked fine on Spark 1.1.0 until we upgraded to 1.1.1. Some of
our unit tests now fail with the following exception. Any idea how to solve it?
Thanks!
java.lang.NoClassDefFoundError: io/netty/util/TimerTask
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:72)
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:168)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
        at spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)
        at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:255)
        at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:104)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
        ...
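A `NoClassDefFoundError` like this usually means the Netty 4 jar (`io.netty` package) that Spark links against is missing from, or shadowed on, the test classpath after the upgrade. As a first diagnostic step, a small sketch like the following (the object name `NettyClasspathCheck` is just an illustration, not part of Spark or jobserver) can show which jar, if any, actually provides the class at runtime:

```scala
// Sketch: locate which classpath entry supplies a given .class resource.
object NettyClasspathCheck {
  /** Returns the URL of the classpath entry providing the resource, if any. */
  def locate(resourcePath: String): Option[String] =
    Option(getClass.getClassLoader.getResource(resourcePath)).map(_.toString)

  def main(args: Array[String]): Unit =
    // Prints the jar URL if Netty 4 is present, otherwise a "not found" note.
    println(locate("io/netty/util/TimerTask.class")
      .getOrElse("io.netty.util.TimerTask is NOT on the classpath"))
}
```

If it reports "NOT on the classpath" (or points at an unexpected jar), the build's dependency resolution is the place to look, e.g. an evicted or conflicting Netty version pulled in by another dependency.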
