cf. https://issues.apache.org/jira/browse/SPARK-2356
On Wed, Oct 29, 2014 at 7:31 PM, Ron Ayoub wrote:
> Apparently Spark does require Hadoop even if you do not intend to use
> Hadoop. Is there a workaround for the below error I get when creating the
> SparkContext in Scala?
>
> I will note that
Date: Wed, 29 Oct 2014 11:38:23 -0700
Subject: Re: winutils
From: denny.g@gmail.com
To: ronalday...@live.com
CC: user@spark.apache.org
QQ - did you download the Spark 1.1 binaries that included the Hadoop one?
Does this happen if you're using the Spark 1.1 binaries that do not include
the Hadoop jars?
On Wed, Oct 29, 2014 at 11:31 AM, Ron Ayoub wrote:
> Apparently Spark does require Hadoop even if you do not intend to use
> Hadoop. Is there a workaround for the below error I get when creating the
> SparkContext in Scala?
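For reference, a minimal sketch of the usual workaround on Windows, per
SPARK-2356: point hadoop.home.dir at a directory whose bin\ subfolder
contains winutils.exe, before the SparkContext is created. The path
C:\hadoop and the app name below are assumptions for illustration, not part
of the original thread.

import org.apache.spark.{SparkConf, SparkContext}

object WinutilsWorkaround {
  def main(args: Array[String]): Unit = {
    // Assumption: winutils.exe has been placed in C:\hadoop\bin.
    // Setting hadoop.home.dir is equivalent to exporting HADOOP_HOME,
    // and must happen before the SparkContext is constructed.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val conf = new SparkConf().setAppName("winutils-test").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Quick smoke test that the context actually works.
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}

Alternatively, set the HADOOP_HOME environment variable to that same
directory. Spark does not need a running Hadoop cluster here; the dependency
is on the bundled Hadoop client libraries, which on Windows shell out to
winutils.exe for local filesystem operations.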