Hi, we are using Spark Streaming 1.1.0 and we are hitting this same issue.

From the job output, I saw the following sequence: the input block is added to executor memory, it is then removed as free memory on that executor drops (from 443.2 MB to 15.6 MB), and the tasks that need it afterwards fail with "Could not compute split, block not found":

14/10/07 18:09:59 INFO storage.BlockManagerInfo: Added input-0-1412705397200 in memory on ip-10-4-62-85.ec2.internal:59230 (size: 5.3 MB, free: 443.2 MB)
...
14/10/07 18:10:26 INFO storage.BlockManagerInfo: Removed input-0-1412705397200 on ip-10-4-62-85.ec2.internal:59230 in memory (size: 5.3 MB, free: 15.6 MB)
...
14/10/07 18:10:27 WARN scheduler.TaskSetManager: Lost task 45.0 in stage 12.0 (TID 129, domU-12-31-39-04-60-07.compute-1.internal): java.lang.Exception: Could not compute split, block input-0-1412705397200 not found
        org.apache.spark.rdd.BlockRDD.compute(BlockRDD.scala:51)
        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        org.apache.spark.rdd.FlatMappedRDD.compute(FlatMappedRDD.scala:33)
        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
        org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
        org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
14/10/07 18:10:27 INFO scheduler.TaskSetManager: Starting task 45.1 in stage 12.0 (TID 130, domU-12-31-39-04-60-07.compute-1.internal, RACK_LOCAL, 1146 bytes)
14/10/07 18:10:27 INFO scheduler.TaskSetManager: Lost task 45.1 in stage 12.0 (TID 130) on executor domU-12-31-39-04-60-07.compute-1.internal: java.lang.Exception (Could not compute split, block input-0-1412705397200 not found) [duplicate 1]
14/10/07 18:10:27 INFO scheduler.TaskSetManager: Starting task 45.2 in stage 12.0 (TID 131, domU-12-31-39-04-60-07.compute-1.internal, RACK_LOCAL, 1146 bytes)
14/10/07 18:10:27 INFO scheduler.TaskSetManager: Lost task 45.2 in stage 12.0 (TID 131) on executor domU-12-31-39-04-60-07.compute-1.internal: java.lang.Exception (Could not compute split, block input-0-1412705397200 not found) [duplicate 2]
14/10/07 18:10:27 INFO scheduler.TaskSetManager: Starting task 45.3 in stage 12.0 (TID 132, domU-12-31-39-04-60-07.compute-1.internal, RACK_LOCAL, 1146 bytes)
14/10/07 18:10:27 INFO scheduler.TaskSetManager: Lost task 45.3 in stage 12.0 (TID 132) on executor domU-12-31-39-04-60-07.compute-1.internal: java.lang.Exception (Could not compute split, block input-0-1412705397200 not found) [duplicate 3]
14/10/07 18:10:27 ERROR scheduler.TaskSetManager: Task 45 in stage 12.0 failed 4 times; aborting job
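
For reference, here is a stripped-down sketch of the kind of receiver-based streaming job that can hit this. The socket receiver, host/port, and storage levels below are illustrative, not our actual code; the point is only that a received block kept with a memory-only storage level can be evicted under memory pressure before the batch that reads it runs, while a level that replicates and can spill to disk keeps the block recoverable.

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._

object BlockEvictionSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("block-eviction-sketch")
    val ssc  = new StreamingContext(conf, Seconds(1))

    // Illustrative receiver (hypothetical source at localhost:9999).
    // With a memory-only storage level, the block manager may evict the
    // received block when executor memory fills up (note the free memory
    // dropping from 443.2 MB to 15.6 MB in the log above), which later
    // surfaces as "Could not compute split, block input-... not found".
    val lines = ssc.socketTextStream("localhost", 9999,
      StorageLevel.MEMORY_ONLY_SER)

    // A replicated level with a disk fallback avoids losing the block:
    // val lines = ssc.socketTextStream("localhost", 9999,
    //   StorageLevel.MEMORY_AND_DISK_SER_2)

    lines.flatMap(_.split(" "))
      .map(word => (word, 1L))
      .reduceByKey(_ + _)
      .print()

    ssc.start()
    ssc.awaitTermination()
  }
}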