It appears that file paths are resolved differently when running Spark in
local and cluster mode. When running Spark without --master, the path given
to the pipe command is resolved relative to the local machine. When running
Spark with --master, the path given to the pipe command is resolved relative
to each executor's working directory (./).

This is what finally worked. I still don't understand why; it's not
documented, and a cursory glance at the source code didn't help.
sc.addFile("./ProgramSIM");
...
JavaRDD<String> output = data.pipe(SparkFiles.get("./ProgramSIM"));
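
For completeness, the whole job looks roughly like the sketch below. This is
a minimal, untested outline: the class name, app name, and input/output paths
are placeholders I made up, and ProgramSIM is assumed to be an executable in
the driver's working directory that reads lines on stdin and writes lines on
stdout.

import org.apache.spark.SparkConf;
import org.apache.spark.SparkFiles;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class PipeSim {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("PipeSim");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Ship the executable so every node gets its own copy.
        sc.addFile("./ProgramSIM");

        // Placeholder input path.
        JavaRDD<String> data = sc.textFile("hdfs:///path/to/input");

        // SparkFiles.get() turns the name into the absolute path of the
        // copy that addFile() distributed, which is what made this work
        // in cluster mode.
        JavaRDD<String> output = data.pipe(SparkFiles.get("./ProgramSIM"));

        // Placeholder output path.
        output.saveAsTextFile("hdfs:///path/to/output");
        sc.stop();
    }
}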

On Thu, Feb 19, 2015 at 5:41 PM, Imran Rashid <iras...@cloudera.com> wrote:

> The error message is telling you the exact problem: it can't find
> "ProgramSIM", the program you are trying to run:
>
> Lost task 3520.3 in stage 0.0 (TID 11, compute3.research.dev):
> java.io.IOException: Cannot run program "ProgramSIM": error=2, No such
> file or directory
>
>
> On Thu, Feb 19, 2015 at 5:52 PM, athing goingon <athinggoin...@gmail.com>
> wrote:
>
>> Hi, I'm trying to figure out why the following job is failing on a pipe:
>> http://pastebin.com/raw.php?i=U5E8YiNN
>>
>> With this exception:
>> http://pastebin.com/raw.php?i=07NTGyPP
>>
>> Any help is welcome. Thank you.
>>
>
>
