> -----Original Message-----
> From: Marcelo Vanzin [mailto:van...@cloudera.com]
> Hi Ken,
>
> On Mon, Apr 21, 2014 at 1:39 PM, Williams, Ken wrote:
> > I haven't figured out how to let the hostname default to the host
> > mentioned in our /etc/hadoop/conf/hdfs-site.xml like the Hadoop
> > command-line tools do, but that's not so important.
Hi Ken,
On Mon, Apr 21, 2014 at 1:39 PM, Williams, Ken wrote:
> I haven't figured out how to let the hostname default to the host mentioned
> in our /etc/hadoop/conf/hdfs-site.xml like the Hadoop command-line tools do,
> but that's not so important.
Try adding "/etc/hadoop/conf" to SPARK_CLASSPATH.
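
For illustration, a minimal sketch of what that suggestion looks like from the
spark-shell, assuming /etc/hadoop/conf holds the cluster's core-site.xml and
hdfs-site.xml and has been added to SPARK_CLASSPATH before launching
bin/spark-shell (e.g. export SPARK_CLASSPATH=/etc/hadoop/conf); the paths
below are made up:

    // spark-shell session; sc is the SparkContext the shell provides.
    // With the Hadoop conf directory on the classpath, Hadoop's Configuration
    // picks up fs.defaultFS (fs.default.name on older releases), so the
    // namenode host can be left out of the URI:
    val lines = sc.textFile("hdfs:///user/ken/data.txt")  // or just "/user/ken/data.txt"
    println(lines.count())

This is also how the Hadoop command-line tools resolve the namenode host from
the config directory instead of requiring it in every path.
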
I haven't figured out how to let the hostname default to the host mentioned
in our /etc/hadoop/conf/hdfs-site.xml like the Hadoop command-line tools do,
but that's not so important.
-Ken
> -----Original Message-----
> From: Williams, Ken [mailto:ken.willi...@windlogics.com]
> Sent: Monday, April 21, 2014 2:04 PM
> To: Spark list
> Subject: Problem connecting to HDFS in Spark shell
>
>
I'm trying to get my feet wet with Spark. I've done some simple stuff in the
shell in standalone mode, and now I'm trying to connect to HDFS resources, but
I'm running into a problem.
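
For reference, the sort of thing "connecting to HDFS resources" looks like from
the spark-shell is a read along these lines; the host and path are invented,
since the thread does not show the actual code:

    // spark-shell session; sc is the SparkContext the shell provides.
    // Read a file from HDFS with the namenode spelled out explicitly:
    val lines = sc.textFile("hdfs://namenode.example.com:8020/user/ken/data.txt")
    println(lines.count())
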
I synced to git's master branch (c399baa - "SPARK-1456 Remove view bounds on
Ordered in favor of a context bou