QQ - did you download the Spark 1.1 binaries that include Hadoop? Does this happen if you use the Spark 1.1 binaries that do not include the Hadoop jars?
On Wed, Oct 29, 2014 at 11:31 AM, Ron Ayoub <ronalday...@live.com> wrote:
> Apparently Spark does require Hadoop even if you do not intend to use
> Hadoop. Is there a workaround for the below error I get when creating the
> SparkContext in Scala?
>
> I will note that I didn't have this problem yesterday when creating the
> Spark context in Java as part of the getting-started app. It could be
> because I was using a Maven project to manage dependencies and that did
> something for me, or else JavaSparkContext has some different code.
>
> I would say that, in order for Spark to be general purpose, this is a
> pretty big bug, since it now appears Spark depends upon Hadoop.
>
> "Could not locate executable null\bin\winutils.exe in the Hadoop binaries"
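
For reference, the "null\bin\winutils.exe" in the error message comes from Spark's Hadoop client code expanding an unset HADOOP_HOME, so the workaround commonly reported on Windows is to point HADOOP_HOME at a directory containing a matching winutils.exe. The path below is illustrative only, not something stated in this thread:

```shell
:: Sketch of the commonly reported Windows workaround. Assumes you have
:: obtained a winutils.exe built for your Hadoop version and placed it at
:: C:\hadoop\bin\winutils.exe (hypothetical path - adjust to your machine).
set HADOOP_HOME=C:\hadoop

:: Equivalently, the same location can be passed as a JVM system property
:: when launching the application:
::   -Dhadoop.home.dir=C:\hadoop
```

Either setting must be in effect before the SparkContext is constructed.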