Hi,

I am testing a Kafka-Spark streaming application which throws the error below after a few seconds. The following configuration is used for the Spark Streaming test environment.
Kafka version: 0.8.1
Spark version: 1.0.1

SPARK_MASTER_MEMORY="1G"
SPARK_DRIVER_MEMORY="1G"
SPARK_WORKER_INSTANCES="1"
SPARK_EXECUTOR_INSTANCES="1"
SPARK_WORKER_MEMORY="1G"
SPARK_EXECUTOR_MEMORY="1G"
SPARK_WORKER_CORES="2"
SPARK_EXECUTOR_CORES="1"

*ERROR:*

14/09/12 17:30:23 WARN TaskSetManager: Loss was due to java.lang.Exception
java.lang.Exception: *Could not compute split, block input-4-1410542878200 not found*
        at org.apache.spark.rdd.BlockRDD.compute(BlockRDD.scala:51)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        at org.apache.spark.rdd.UnionPartition.iterator(UnionRDD.scala:33)
        at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:74)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:77)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:227)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.Task.run(Task.scala:51)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

Please suggest a solution.

Regards,
Rafeeq S

*("What you do is what matters, not what you think or say or plan.")*
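For reference, here is how these settings are laid out in the environment file of the test deployment (a sketch assuming a Spark standalone cluster, where such variables are conventionally placed in conf/spark-env.sh; the values are exactly those listed above):

```shell
# conf/spark-env.sh -- Spark Streaming test environment
# One worker with 2 cores and 1G, running a single 1G executor with 1 core.
SPARK_MASTER_MEMORY="1G"
SPARK_DRIVER_MEMORY="1G"
SPARK_WORKER_INSTANCES="1"
SPARK_EXECUTOR_INSTANCES="1"
SPARK_WORKER_MEMORY="1G"
SPARK_EXECUTOR_MEMORY="1G"
SPARK_WORKER_CORES="2"
SPARK_EXECUTOR_CORES="1"
```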