As Jerry said, this is not related to shuffle file consolidation.

The unusual thing about this problem is that it is failing to find a file
while trying to _write_ to it, in append mode. The simplest explanation
would be that the file is deleted in between some check for its existence
and the attempt to open it for append.
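
To make that concrete, here is a minimal sketch of the check-then-open
pattern (illustrative only; this is not Spark's shuffle-writer code, and
the object and method names are made up):

import java.io.{File, FileOutputStream}

// Classic check-then-act window: if the file is removed after the exists()
// check but before the open, the write path itself fails to find the file.
object AppendRaceSketch {
  def appendBytes(path: String, bytes: Array[Byte]): Unit = {
    val f = new File(path)
    if (!f.exists()) {
      Option(f.getParentFile).foreach(_.mkdirs())
      f.createNewFile()
    }
    // If another thread or a shutdown hook deletes the file (or its parent
    // directory) right here, the constructor below throws
    // FileNotFoundException even though the intent was to write.
    val out = new FileOutputStream(f, true) // true = append mode
    try out.write(bytes) finally out.close()
  }
}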

Deletion of such files racing with the writes to them (on the map side)
would be most easily explained by a JVM shutdown, for instance one
triggered by a fatal error such as OutOfMemoryError. So, as Ilya said,
please look for another exception that may precede this one in the logs.
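
For illustration, a rough sketch of why that race exists. The paths and
names here are assumptions rather than Spark internals: executor temp
directories (the .../appcache/... files) are typically cleaned up by a JVM
shutdown hook, and a fatal error forces shutdown while map tasks may still
be appending to files under those directories.

import java.io.File

// Sketch under assumptions: a shutdown hook deletes the local temp dir
// tree. An OutOfMemoryError starts JVM shutdown, the hook runs, and any
// task still appending to a file under the dir fails with
// FileNotFoundException.
object ShutdownCleanupSketch {
  def deleteRecursively(f: File): Unit = {
    Option(f.listFiles()).foreach(_.foreach(deleteRecursively))
    f.delete()
  }

  def registerCleanup(localDir: File): Unit = {
    sys.addShutdownHook {
      // Runs concurrently with any writer threads still holding paths
      // under localDir.
      deleteRecursively(localDir)
    }
  }
}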

On Sat, Jan 10, 2015 at 12:16 PM, lucio raimondo <luxmea...@hotmail.com>
wrote:

> Hey,
>
> I am having a "similar" issue, did you manage to find a solution yet?
> Please
> check my post below for reference:
>
>
> http://apache-spark-user-list.1001560.n3.nabble.com/IOError-Errno-2-No-such-file-or-directory-tmp-spark-9e23f17e-2e23-4c26-9621-3cb4d8b832da-tmp3i3xno-td21076.html
>
> Thank you,
> Lucio
