Hi,

I sometimes get these random init failures in test and prod. Is there a known scenario that could lead to these errors?
For example: not enough cores? The driver and workers not being on the same LAN? etc.

I'm running Spark 1.5.1. Retrying solves it (a sketch of that retry workaround is below the stack trace).

Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at akka.remote.Remoting.start(Remoting.scala:179)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
        at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
        at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
        at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:253)
        at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:450)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
        at ...
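
For context, "retrying" here means recreating the context from scratch, which has worked every time so far. A minimal sketch of the kind of wrapper we use around JavaSparkContext creation (the class and method names are made up for illustration, not a Spark API):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class RetryingContextFactory {

    // Hypothetical helper (not a Spark API): retry JavaSparkContext creation,
    // since in our case the remoting startup timeout above is transient.
    public static JavaSparkContext createWithRetries(SparkConf conf, int maxAttempts) {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return new JavaSparkContext(conf);
            } catch (Exception e) {
                last = e;
                try {
                    // Simple linear back-off between attempts.
                    Thread.sleep(5000L * attempt);
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    break;
                }
            }
        }
        throw new RuntimeException(
                "Could not create JavaSparkContext after " + maxAttempts + " attempts", last);
    }
}

Called as: JavaSparkContext jsc = RetryingContextFactory.createWithRetries(conf, 3);
It papers over the failure rather than explaining it, which is why I'm asking what conditions can trigger the timeout in the first place.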
