Update - it seems spark-shell does not support yarn-cluster mode (I guess
because it is an interactive shell).
The only supported modes are yarn-client and local.
Please let me know if my understanding is incorrect.
Thanks!
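For reference, this is roughly how the two cases behave on YARN (a sketch,
assuming Spark 2.x; the exact error text may differ between versions):

# client mode works - the driver (and the interactive REPL) runs locally
./bin/spark-shell --master yarn --deploy-mode client

# cluster mode is rejected, since the driver would run inside the cluster
# and there would be nothing for the interactive shell to attach to
./bin/spark-shell --master yarn --deploy-mode cluster
# => Error: Cluster deploy mode is not applicable to Spark shells.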
On Sun, Aug 6, 2017 at 10:07 AM, karan alang wrote:
Hello all - I had a basic question about the modes in which spark-shell can
be run.
When I run the following command, does Spark run in local mode, i.e. outside
of YARN and using the local cores, since the '--master' option is missing?
./bin/spark-shell --driver-memory 512m --executor-memory 512m
Simila
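One way to confirm what the command above defaults to is to check sc.master
inside the REPL. A minimal sketch, assuming spark.master is not set in
conf/spark-defaults.conf (in which case spark-shell falls back to local[*]):

./bin/spark-shell --driver-memory 512m --executor-memory 512m
...
scala> sc.master   // shows which master URL the shell is running against
res0: String = local[*]

local[*] means the shell is running in local mode on all available local
cores, not on YARN; to run against YARN you would pass --master yarn
explicitly.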