Hello,

  I'm running Spark on Windows 7 in standalone mode, with everything on the same 
machine and no Hadoop installed. My app throws an exception and the worker reports:

  Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

  I had the same problem earlier when deploying in local mode. I understand this is 
a known bug (https://issues.apache.org/jira/browse/SPARK-2356), and I tried the 
workaround described at 
http://qnalist.com/questions/4994960/run-spark-unit-test-on-windows-7, which works 
for local deployment but not for standalone. I also tried setting the Hadoop home 
directory via SPARK_DAEMON_JAVA_OPTS and restarted everything, but nothing changed.
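
  For reference, in local mode the workaround amounted to pointing hadoop.home.dir 
at a directory containing bin\winutils.exe before creating the SparkContext, 
roughly as sketched below. The C:\hadoop path and the master URL are just 
placeholders for my setup, not anything prescribed by Spark:

    import org.apache.spark.{SparkConf, SparkContext}

    object WinutilsWorkaround {
      def main(args: Array[String]): Unit = {
        // Assumption: winutils.exe has been copied to C:\hadoop\bin (placeholder path).
        // Hadoop's Shell class reads hadoop.home.dir (or HADOOP_HOME) when it is first
        // loaded, so this must run before any Spark/Hadoop classes are touched.
        System.setProperty("hadoop.home.dir", "C:\\hadoop")

        val conf = new SparkConf()
          .setAppName("winutils-workaround")
          .setMaster("spark://localhost:7077") // standalone master URL for my machine

        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 10).sum()) // trivial job just to exercise the worker
        sc.stop()
      }
    }

  The SPARK_DAEMON_JAVA_OPTS attempt was along the lines of 
-Dhadoop.home.dir=C:\hadoop in conf\spark-env.cmd, applied to both master and worker 
before restarting them.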

  Any idea how to cure this by setting Java properties or otherwise? Thanks!

     Best, Oliver

Oliver Ruebenacker | Solutions Architect

Altisource(tm)
290 Congress St, 7th Floor | Boston, Massachusetts 02210
P: (617) 728-5582 | ext: 275585
oliver.ruebenac...@altisource.com | 
www.Altisource.com

