Hi Oleg,
Did you ever figure this out? I'm observing the same exception in 0.9.1 as
well, and I think it may be related to setting spark.speculation=true. My
theory is that multiple attempts at the same task start; the first finishes
and cleans up the _temporary directory, and then the second fails.
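If that theory holds, turning speculative execution off should avoid the race. A minimal sketch of doing so programmatically (assuming the SparkConf API introduced in 0.9.x; the application name is just a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Disable speculative task re-execution so only one attempt per task
// writes to (and cleans up) the _temporary output directory.
val conf = new SparkConf()
  .setAppName("no-speculation-example") // placeholder name
  .set("spark.speculation", "false")

val sc = new SparkContext(conf)
```

Equivalently, the property can be passed as a system property (`-Dspark.speculation=false`) when launching the driver, which avoids touching application code.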
Hi All,
After a few simple transformations I am trying to save to a local file
system. The code works in local mode but fails on a standalone cluster. The
directory *1.txt/_temporary* does exist after the exception.
I would appreciate any suggestions.
*scala> d3.sample(false,0.01,1).map( pair