Hi Ashish,
For Spark on YARN, you actually only need the Spark files on one machine -
the submission client. This machine could even live outside of the cluster.
Then all you need to do is point YARN_CONF_DIR to the directory containing
your Hadoop configuration files (e.g. yarn-site.xml) on that machine.
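Concretely, the steps above might look something like this on the submission client. This is a minimal sketch: the Spark install path, the Hadoop config path, and the example jar name are assumptions, so adjust them to your environment.

```shell
# Assumed location of the Hadoop client configs (must contain yarn-site.xml,
# core-site.xml, etc., pointing at your cluster's ResourceManager).
export YARN_CONF_DIR=/etc/hadoop/conf

# Submit an application to YARN from the Spark install directory.
# The example jar path is illustrative; use your own application jar and class.
cd /opt/spark
./bin/spark-submit \
  --master yarn \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar 10
```

YARN then launches the executors on the cluster's NodeManagers for you, which is why nothing Spark-specific needs to be installed on the worker nodes themselves.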
Can someone please let me know what I need to configure to have Spark
run on YARN?
There is a lot of documentation, but none of it says how, and which files
need to be changed.
Let's say I have 4 nodes for Spark - SparkMaster, SparkSlave1, SparkSlave2,
SparkSlave3.
Now on which node do which files need to be changed?