I am running into this problem as well. I am trying to move from Akka 2.2.x to
2.3.x in order to port to Scala 2.11, since only Akka 2.3.x is available for
Scala 2.11. Every Akka 2.2.x version works fine, and every Akka 2.3.x version
throws the following exception in "new SparkContext". Still investigating why:
java.util.
I was annoyed by this as well.
It appears that just permuting the order of dependency inclusion solves this
problem: first Spark, then your CDH Hadoop distro.
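In sbt terms that just means declaring Spark before the Hadoop client from your
distro, roughly like this (the versions and the CDH coordinate below are only
placeholders for whatever release you are actually on):

    libraryDependencies ++= Seq(
      // Spark first ...
      "org.apache.spark" %% "spark-core" % "0.9.1",
      // ... then the CDH Hadoop client
      "org.apache.hadoop" % "hadoop-client" % "2.0.0-cdh4.6.0"
    )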
HTH,
Pierre
How did you finally deal with this problem? I have run into it as well.
Best regards,
How did you deal with this problem? I have been running into it these past few
days. God bless me.
Best regards,
I have found a workaround: if you add Akka 2.2.4 to your dependencies,
everything works, probably because Akka 2.2.4 brings in a newer version of
Netty.
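Concretely, something along these lines in the build (akka-remote is the module
that pulls in Netty; treat the exact module choice as my assumption):

    libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.2.4"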
By the way, this is the underlying error for me:
java.lang.VerifyError: (class: org/jboss/netty/channel/socket/nio/NioWorkerPool,
method: createWorker signature:
(Ljava/util/concurrent/Executor;)Lorg/jboss/netty/channel/socket/nio/AbstractNioWorker;)
Wrong return type in function
        at akka.r
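I think a VerifyError like this means two different Netty 3.x jars ended up on
the classpath. A quick way to check which jar the offending class is actually
loaded from (plain JDK reflection, nothing Spark-specific; just a diagnostic
sketch):

    // Prints the jar that NioWorkerPool was loaded from; if it is not the
    // Netty version Akka expects, there is a dependency conflict.
    // Note: getCodeSource can be null for boot-classpath classes.
    val loc = classOf[org.jboss.netty.channel.socket.nio.NioWorkerPool]
      .getProtectionDomain.getCodeSource.getLocation
    println(loc)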
I also hit this error.