I am running the FlumeEventCount program on CDH 5.0.1, which ships with Spark 0.9.0. The program runs fine in local mode as well as in standalone cluster mode. However, it fails in YARN mode with the following error:

INFO scheduler.DAGScheduler: Stage 2 (runJob at NetworkInputTracker.scala:182) finished in 0.215 s
INFO spark.SparkContext: Job finished: runJob at NetworkInputTracker.scala:182, took 0.224696381 s
ERROR scheduler.NetworkInputTracker: De-registered receiver for network stream 0 with message org.jboss.netty.channel.ChannelException: Failed to bind to: xxxxx/xx.xxx.x.xx:41415

Is there a workaround for this?
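For context, below is a minimal sketch (not the exact FlumeEventCount source, and the host/port names are placeholders) of how a push-based Flume stream is created in Spark Streaming 0.9.x. The point is that FlumeUtils.createStream starts a receiver that must bind the given host:port on whichever worker the receiver task is scheduled on; on YARN that node may not own the address passed in, which is what the ChannelException above is complaining about.

    // Sketch of a FlumeEventCount-style job, assuming a push-based Flume avro
    // sink pointed at <host>:<port>. Values taken from args are placeholders.
    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.flume.FlumeUtils

    object FlumeEventCountSketch {
      def main(args: Array[String]): Unit = {
        val Array(host, port) = args                  // e.g. "xxxxx" "41415"
        val conf = new SparkConf().setAppName("FlumeEventCount")
        val ssc  = new StreamingContext(conf, Seconds(2))

        // The receiver listens on host:port for events pushed by Flume.
        // It must be able to bind that address on the executor it runs on.
        val stream = FlumeUtils.createStream(ssc, host, port.toInt,
          StorageLevel.MEMORY_AND_DISK_SER_2)

        stream.count().map(c => "Received " + c + " flume events.").print()

        ssc.start()
        ssc.awaitTermination()
      }
    }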