Hey Sourav, are you able to run a simple shuffle in a spark-shell?
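If spark-shell is handy, a minimal shuffle check might look like the sketch below (a hedged example: `sc` is the SparkContext the shell provides, and the partition count of 4 is an arbitrary choice):

```scala
// Paste into spark-shell: reduceByKey forces a shuffle stage.
// `sc` is provided by the shell; 4 partitions is arbitrary.
val sums = sc.parallelize(1 to 1000, 4)
  .map(x => (x % 10, x))   // key each number by its last digit
  .reduceByKey(_ + _)      // shuffle: aggregate per key
  .collect()
println(sums.map(_._2).sum) // totals across keys should equal 500500
```

If even this hangs or dies with the same TimeoutException, the problem is independent of Spark Streaming and points at the cluster setup.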
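Also: the "Futures timed out after [30 seconds]" in the trace matches the default driver-ask timeout in Spark 1.1 (`spark.akka.askTimeout`). If the driver is merely slow to answer (GC pauses, loaded network) rather than unreachable, raising the Akka timeouts is one thing to try. A hedged `spark-defaults.conf` sketch; the property names come from the Spark 1.1 configuration docs, and the values are only illustrative:

```
spark.akka.askTimeout     120
spark.akka.lookupTimeout  120
spark.akka.timeout        300
```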

2014-12-05 1:20 GMT-08:00 Shao, Saisai <saisai.s...@intel.com>:

>  Hi,
>
>
>
> I don’t think this is a Spark Streaming problem; judging from the call
> stack, it occurs while the BlockManager is initializing itself. Would you
> mind checking your Spark configuration, hardware, and deployment? Most
> likely it is not a problem in Spark itself.
>
>
>
> Thanks
>
> Saisai
>
>
>
> From: Sourav Chandra [mailto:sourav.chan...@livestream.com]
> Sent: Friday, December 5, 2014 4:36 PM
> To: user@spark.apache.org
> Subject: Spark streaming for v1.1.1 - unable to start application
>
>
>
> Hi,
>
>
>
> I am getting the error below, and because of it no stages complete; they
> are all left waiting.
>
>
>
> 14/12/05 03:31:59 WARN AkkaUtils: Error sending message in 1 attempts
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread$$anon$3.block(ThreadPoolBuilder.scala:169)
>         at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
>         at akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread.blockOn(ThreadPoolBuilder.scala:167)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
>         at org.apache.spark.storage.BlockManagerMaster.askDriverWithReply(BlockManagerMaster.scala:213)
>         at org.apache.spark.storage.BlockManagerMaster.tell(BlockManagerMaster.scala:203)
>         at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:47)
>         at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:177)
>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:147)
>         at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:168)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>         at org.apache.spark.executor.Executor.<init>(Executor.scala:78)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receiveWithLogging$1.applyOrElse(CoarseGrainedExecutorBackend.scala:60)
>         at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
>         at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
>         at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
>         at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
>         at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
>         at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
>         at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
>         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
>         at akka.actor.ActorCell.invoke(ActorCell.scala:456)
>         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>
>
>
>
>
> Could you please let me know the cause and a fix for this? The Spark
> version is 1.1.1.
>
>
>
> --
>
> Sourav Chandra
>
> Senior Software Engineer
>
> · · · · · · · · · · · · · · · · · · · · · · · · · · · · · · · · ·
>
> sourav.chan...@livestream.com
>
> o: +91 80 4121 8723
>
> m: +91 988 699 3746
>
> skype: sourav.chandra
>
> Livestream
>
> "Ajmera Summit", First Floor, #3/D, 68 Ward, 3rd Cross, 7th C Main, 3rd
> Block, Koramangala Industrial Area,
>
> Bangalore 560034
>
> www.livestream.com
>
>
>
