I put up a pull request with documentation changes
https://github.com/apache/spark/pull/314
Tom
On Wednesday, April 2, 2014 8:47 AM, Tom Graves wrote:
Note I'm +1 with the docs changed to tell users to export SPARK_YARN_MODE=true
before using spark-shell on YARN.
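The doc change above amounts to something like the following. This is a hedged sketch of the launch steps, assuming a Spark build with YARN support and that HADOOP_CONF_DIR points at your cluster's config directory; the paths are illustrative:

```shell
# Point Spark at the YARN cluster configuration (path is an example).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Required before launching spark-shell against YARN, per the doc change.
export SPARK_YARN_MODE=true

# Launch the shell; the driver runs locally and talks to YARN.
./bin/spark-shell
```

Without SPARK_YARN_MODE set, the shell would start in its default (non-YARN) mode.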
I tested it on both hadoop
Yes, the driver is a client. The driver and master can run either on the same
machine or on different machines.
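The distinction can be made concrete with spark-submit's deploy modes: in client mode the driver runs on the machine that submits the job, while in cluster mode it runs inside the cluster. This is a sketch, assuming a standalone master at the hypothetical address spark://master-host:7077 and an example jar:

```shell
# Client mode: the driver process runs here, on the submitting machine,
# and connects to the master as a client.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  my-app.jar

# Cluster mode: the driver is launched on a worker inside the cluster,
# so it need not share a machine with the master or the submitter.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  my-app.jar
```

Either way, the master only schedules resources; the driver owns the SparkContext and coordinates the job.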
-----Original Message-----
From: Dan [mailto:zsh912...@gmail.com]
Sent: Thursday, April 03, 2014 9:56 PM
To: d...@spark.incubator.apache.org
Subject: Re: The difference between driver and master in Spark
Can I think of the driver as a client? The driver and master can be located on a
single machine or on different machines, right?
Thanks,
Dan
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/The-difference-between-driver-and-master-in-Spark-tp6158p6192.html