Hi,
I am facing the following issue when I am connecting from spark-shell. Please
tell me how to avoid it.
15/01/29 17:21:27 ERROR Shell: Failed to locate the winutils binary in the
hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the
Hadoop binaries.
cf. https://issues.apache.org/jira/browse/SPARK-2356
On Wed, Oct 29, 2014 at 7:31 PM, Ron Ayoub wrote:
> Apparently Spark does require Hadoop even if you do not intend to use
> Hadoop. Is there a workaround for the below error I get when creating the
> SparkContext in Scala?
>
> I will note that I didn't have this problem yesterday when creating the
> Spark context in Java as part of the getting started App.
Date: Wed, 29 Oct 2014 11:38:23 -0700
Subject: Re: winutils
From: denny.g@gmail.com
To: ronalday...@live.com
CC: user@spark.apache.org
QQ - did you download the Spark 1.1 binaries that included the Hadoop one?
Does this happen if you're using the Spark 1.1 binaries that do not include
the Hadoop jars?
On Wed, Oct 29, 2014 at 11:31 AM, Ron Ayoub wrote:
> Apparently Spark does require Hadoop even if you do not intend to use
> Hadoop.
Apparently Spark does require Hadoop even if you do not intend to use Hadoop.
Is there a workaround for the below error I get when creating the SparkContext
in Scala?
I will note that I didn't have this problem yesterday when creating the Spark
context in Java as part of the getting started App.
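For reference, the usual workaround (not spelled out in the thread above, so
treat the path and names here as assumptions) is to download a winutils.exe
built for your Hadoop version and make it visible to Hadoop, either by setting
the HADOOP_HOME environment variable before launching spark-shell, or
programmatically via the hadoop.home.dir system property before the
SparkContext is created. A minimal Scala sketch, assuming winutils.exe has
been placed at C:\hadoop\bin\winutils.exe:

import org.apache.spark.{SparkConf, SparkContext}

object WinutilsWorkaround {
  def main(args: Array[String]): Unit = {
    // Assumed location: Hadoop looks for bin\winutils.exe under
    // hadoop.home.dir, so this expects C:\hadoop\bin\winutils.exe to exist.
    // Must be set before any Hadoop class is loaded.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val conf = new SparkConf().setAppName("winutils-test").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Sanity check that the context works without a Hadoop cluster.
    println(sc.parallelize(1 to 10).sum())

    sc.stop()
  }
}

Note the property has to be set before the first Hadoop call, because Hadoop's
Shell class resolves the winutils path in a static initializer; setting it
afterwards has no effect. For spark-shell, which creates the SparkContext for
you, setting HADOOP_HOME in the environment before launching is the equivalent.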