Thank you for the response, Dean. There are two worker nodes with 8 cores
total attached to the stream. I have the following settings applied:

spark.executor.memory 21475m
spark.cores.max 16
spark.driver.memory 5235m
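
For context, this is roughly how those settings are wired into the driver.
The app name, master URL, and batch interval below are placeholders rather
than the exact values from my job:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Applies the settings listed above; names/URLs are illustrative only.
val conf = new SparkConf()
  .setAppName("rabbitmq-streaming-job")        // placeholder app name
  .setMaster("spark://spark-master:7077")      // placeholder master URL
  .set("spark.executor.memory", "21475m")
  .set("spark.cores.max", "16")
  .set("spark.driver.memory", "5235m")

// 10-second batches, purely illustrative
val ssc = new StreamingContext(conf, Seconds(10))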


On Thu, Apr 2, 2015 at 11:50 AM, Dean Wampler <deanwamp...@gmail.com> wrote:

> Are you allocating 1 core per input stream plus additional cores for the
> rest of the processing? Each input stream Reader requires a dedicated core.
> So, if you have two input streams, you'll need "local[3]" at least.
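>
> To make the core accounting concrete, here is a minimal local-mode sketch
> with two input streams (the socket sources, hosts, and ports are just
> placeholders, not the actual RabbitMQ receivers):
>
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
>
> // Two receivers occupy two cores full-time, so at least one more
> // core is needed for the batch processing itself: hence local[3].
> val conf = new SparkConf().setAppName("two-streams").setMaster("local[3]")
> val ssc = new StreamingContext(conf, Seconds(10))
>
> val streamA = ssc.socketTextStream("hostA", 9999)  // receiver #1
> val streamB = ssc.socketTextStream("hostB", 9999)  // receiver #2
> streamA.union(streamB).count().print()
>
> ssc.start()
> ssc.awaitTermination()
>
> The same accounting applies on a cluster: the cores granted to the app
> have to cover one core per receiver plus whatever is left for processing.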
>
> Dean Wampler, Ph.D.
> Author: Programming Scala, 2nd Edition
> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
> Typesafe <http://typesafe.com>
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
>
> On Thu, Apr 2, 2015 at 11:45 AM, byoung <bill.yo...@threatstack.com>
> wrote:
>
>> I am running a Spark Streaming application on a standalone cluster, connected to
>> RabbitMQ endpoint(s). The application runs for 20-30 minutes before failing with
>> the following error:
>>
>> WARN 2015-04-01 21:00:53,944 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 22 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:53,944 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 23 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:53,951 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 20 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:53,951 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 19 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:53,952 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 18 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:53,952 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 17 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:53,952 org.apache.spark.storage.BlockManagerMaster.logWarning.71: Failed to remove RDD 16 - Ask timed out on [Actor[akka.tcp://sparkExecutor@10.1.242.221:43018/user/BlockManagerActor1#-1913092216]] after [30000 ms]}
>> WARN 2015-04-01 21:00:54,151 org.apache.spark.streaming.scheduler.ReceiverTracker.logWarning.71: Error reported by receiver for stream 0: Error in block pushing thread - java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>>     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>>     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>>     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>     at scala.concurrent.Await$.result(package.scala:107)
>>     at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:166)
>>     at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:127)
>>     at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:112)
>>     at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:182)
>>     at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:155)
>>     at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:87)
>>
>>
>> Has anyone run into this before?
>>
>


-- 
Bill Young
Threat Stack | Senior Infrastructure Engineer
http://www.threatstack.com
