Hi Patrick,
I used the 1.0 branch, but it was not an official release; I just git-pulled
whatever was there and compiled.
Thanks,
Mikhail
Hey Mikhail,
I think (hope?) the -em and -dm options were never in an official
Spark release. They were just in the master branch at some point. Did
you use these during a previous Spark release, or were you just on
master?
- Patrick
On Wed, Jul 9, 2014 at 9:18 AM, Mikhail Strebkov wrote:
Thanks Andrew,
./bin/spark-shell --master spark://10.2.1.5:7077 --total-executor-cores 30
--executor-memory 20g --driver-memory 10g
works well; I just wanted to make sure that I'm not missing anything.
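(For illustration, not from the original thread: the same settings could
also live in conf/spark-defaults.conf so they don't have to be repeated on
every invocation; the values below are the ones from the command above.
spark.cores.max plays the role of --total-executor-cores in standalone
mode.)

spark.master             spark://10.2.1.5:7077
spark.executor.memory    20g
spark.driver.memory      10g
spark.cores.max          30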
>> "The proper way to specify this is through "spark.master" in your config
or the "--master" parameter to spark-submit."
By "this" I mean configuring which master the driver connects to (not which
port and address the standalone Master binds to).
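(To make the contrast concrete, a sketch assuming the standard standalone
scripts; the address is the one used earlier in this thread. "--master" and
"spark.master" tell the driver where to connect, while the Master daemon's
own bind address and port are set in conf/spark-env.sh:)

SPARK_MASTER_IP=10.2.1.5
SPARK_MASTER_PORT=7077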
2014-07-08 16:43 GMT-07:00 Andrew Or:
Hi Mikhail,
It looks like the documentation is a little outdated. Neither is true
anymore. In general, we try to shift away from short options ("-em", "-dm",
etc.) in favor of more explicit ones ("--executor-memory",
"--driver-memory"). These options, and "--cores", refer to the arguments
passed i
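(For reference, a sketch of the shift Andrew describes; the short forms
below were only ever in the unreleased master branch, per Patrick above, so
their exact syntax is an assumption:)

# old short options (never in an official release)
./bin/spark-shell -em 20g -dm 10g

# explicit long options that replaced them
./bin/spark-shell --executor-memory 20g --driver-memory 10g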