Run Spark with the --verbose flag to see what it actually read for that path.
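For example (spark-submit usage; the master URL, class name, and jar below are
just placeholders for your own):

  spark-submit --verbose --master spark://<master-host>:7077 --class MyApp C:/Users/jbeaudan/myapp.jar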

I guess on Windows, if you are using backslashes, you need two of them (\\), or
you can just use forward slashes everywhere.
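For example, the same directory written both ways (the doubled form matters in
config files or shells that treat \ as an escape character):

  C:\\Users\\jbeaudan\\Spark\\spark-1.3.1-bin-hadoop2.4
  C:/Users/jbeaudan/Spark/spark-1.3.1-bin-hadoop2.4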

On Fri, Jul 17, 2015 at 2:40 PM, Julien Beaudan <jbeau...@stottlerhenke.com>
wrote:

> Hi,
>
> I'm running a stand-alone cluster on Windows 7, and when I try to run any
> worker on the machine, I get the following error:
>
> 15/07/17 14:14:43 ERROR ExecutorRunner: Error running executor
> java.io.IOException: Cannot run program
> "C:\cygdrive\c\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4/bin/compute-classpath.cmd"
> (in directory "."): CreateProcess error=2, The system cannot find the file
> specified
>         at java.lang.ProcessBuilder.start(Unknown Source)
>         at org.apache.spark.util.Utils$.executeCommand(Utils.scala:1067)
>         at
> org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:1084)
>         at
> org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:112)
>         at
> org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
>         at
> org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:47)
>         at
> org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:132)
>         at
> org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
> Caused by: java.io.IOException: CreateProcess error=2, The system cannot
> find the file specified
>         at java.lang.ProcessImpl.create(Native Method)
>         at java.lang.ProcessImpl.<init>(Unknown Source)
>         at java.lang.ProcessImpl.start(Unknown Source)
>         ... 8 more
>
>
> I'm pretty sure the problem is that Spark is looking for the following
> path, which mixes forward slashes and backslashes:
>
>
> C:\cygdrive\c\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4/bin/compute-classpath.cmd
>
> Is there any way to fix this?
>
> (I have also tried running this from a normal terminal instead of from
> Cygwin, and I get the same issue, except this time the path is:
> C:\Users\jbeaudan\Spark\spark-1.3.1-bin-hadoop2.4\bin../bin/compute-classpath.cmd
> )
>
> Thank you!
>
> Julien
>


-- 

Best regards,
Elkhan Dadashov
