, 2015 12:14 PM
To: dev@spark.apache.org
Subject: Re: spark-shell 1.5 doesn't seem to work in local mode
Thanks guys.
I do have HADOOP_INSTALL set, but Spark 1.4.1 did not seem to mind.
Seems like there's a difference in behavior between 1.5.0 and 1.4.1 for some
reason.
To the best of my knowledge, I just downloaded each tgz and untarred them in
/opt
I adjusted my PATH to point to one or the other.
It sounds a lot like you have some local Hadoop config pointing to a
cluster, and you're picking that up when you run the shell. Look for
HADOOP_* env variables, clear them, and use --master local[*].
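A minimal sketch of that cleanup (the variable names below are common Hadoop env vars, not confirmed from this thread; check `env | grep HADOOP` for what is actually set):

```shell
# Clear common Hadoop-related env vars so spark-shell doesn't
# pick up a cluster config. Adjust the list to what env shows.
unset HADOOP_INSTALL HADOOP_HOME HADOOP_CONF_DIR HADOOP_PREFIX
# Verify nothing Hadoop-related is left in the environment:
env | grep '^HADOOP_' || echo "no HADOOP_* vars set"
# Then start the shell explicitly in local mode:
# spark-shell --master 'local[*]'
```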
On Sat, Sep 19, 2015 at 5:14 PM, Madhu wrote:
> I downloaded spark-1.5.0-bin-hadoop2.6.tgz recently and installed on CentOS.
Maybe you have a hdfs-site.xml lying around somewhere?
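One way to check for that (the search paths here are guesses; look wherever you untarred things):

```shell
# Hunt for stray Hadoop config files that spark-shell could be reading.
find /opt "$HOME" \( -name 'hdfs-site.xml' -o -name 'core-site.xml' \) 2>/dev/null
# Also see whether a config directory is being pointed at explicitly:
echo "HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-unset}"
```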
On Sat, Sep 19, 2015 at 9:14 AM, Madhu wrote:
> I downloaded spark-1.5.0-bin-hadoop2.6.tgz recently and installed on
> CentOS.
> All my local Spark code works fine locally.
>
> For some odd reason, spark-shell doesn't work in local mode.
>
I downloaded spark-1.5.0-bin-hadoop2.6.tgz recently and installed on CentOS.
All my local Spark code works fine locally.
For some odd reason, spark-shell doesn't work in local mode.
It looks like it wants to connect to HDFS, even if I use --master local or
specify local mode in the conf.
Even sc.
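One way to rule the Hadoop config in or out (a sketch using standard Spark options, not something confirmed in this thread: spark.hadoop.* properties are passed through to the Hadoop configuration, so this overrides any fs.defaultFS from a stray core-site.xml):

```shell
# Pin spark-shell to local mode and the local filesystem,
# bypassing whatever fs.defaultFS a picked-up config supplies.
spark-shell --master 'local[*]' --conf spark.hadoop.fs.defaultFS=file:///
```

If the shell starts cleanly with this, a lingering Hadoop config (env var or XML file) is almost certainly what 1.5.0 is picking up.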