Not sure, so I downloaded release 1.4.1 again with the "Pre-built for Hadoop 2.6
and later" package option from http://spark.apache.org/downloads.html (assuming
that version is consistent) and ran the following on Windows 10:

c:\spark-1.4.1-bin-hadoop2.6>bin\run-example HdfsTest <local_file>

I still got a similar exception, shown below. (I heard there is a permission
configuration for HDFS; if so, how do I set it up?)

15/09/29 13:03:26 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:873)
        at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:853)
        at org.apache.spark.util.Utils$.fetchFile(Utils.scala:465)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:398)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:390)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:390)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:724)
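
Looking at the trace, the failure happens inside org.apache.hadoop.util.Shell
while FileUtil.chmod is running; from what I have read, on Windows that call
shells out to winutils.exe, and a NullPointerException at ProcessBuilder.start
can mean winutils.exe was not found where Hadoop expects it. One thing I plan
to try, roughly like this (c:\hadoop is just a placeholder directory, and the
winutils.exe build would need to match the Hadoop 2.6 line this Spark package
was built against):

rem put a Hadoop 2.6-compatible winutils.exe into %HADOOP_HOME%\bin first
c:\spark-1.4.1-bin-hadoop2.6>set HADOOP_HOME=c:\hadoop
c:\spark-1.4.1-bin-hadoop2.6>set PATH=%HADOOP_HOME%\bin;%PATH%
c:\spark-1.4.1-bin-hadoop2.6>bin\run-example HdfsTest <local_file>

Is HADOOP_HOME/winutils.exe what the "permission config" refers to, or is there
a separate HDFS-side setting I need?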

On Mon, Sep 28, 2015 at 4:39 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> What version of Hadoop are you using?
>
> Is that version consistent with the one that was used to build Spark
> 1.4.0?
>
> Cheers
>
> On Mon, Sep 28, 2015 at 4:36 PM, Renyi Xiong <renyixio...@gmail.com>
> wrote:
>
>> I tried to run the HdfsTest sample on Windows with spark-1.4.0:
>>
>> bin\run-example org.apache.spark.examples.HdfsTest <file>
>>
>> but got the exception below. Does anybody have any idea what went wrong here?
>>
>> 15/09/28 16:33:56.565 ERROR SparkContext: Error initializing SparkContext.
>> java.lang.NullPointerException
>>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
>>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
>>         at org.apache.hadoop.util.Shell.run(Shell.java:418)
>>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
>>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
>>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
>>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
>>         at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:467)
>>         at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:130)
>>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:515)
>>         at org.apache.spark.examples.HdfsTest$.main(HdfsTest.scala:32)
>>         at org.apache.spark.examples.HdfsTest.main(HdfsTest.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>
>
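
PS: in the older trace quoted above, the error comes from
EventLoggingListener.start calling RawLocalFileSystem.setPermission, which as
far as I can tell only runs when spark.eventLog.enabled is true. If event
logging is not needed locally, switching it off in conf\spark-defaults.conf
might sidestep that particular call, though if winutils is the real problem the
failure probably just moves elsewhere (as it did for me above):

# conf\spark-defaults.conf (copy from conf\spark-defaults.conf.template)
spark.eventLog.enabled    false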
