YARN cluster mode for PySpark has been supported since Spark 1.4: https://issues.apache.org/jira/browse/SPARK-5162?jql=project%20%3D%20SPARK%20AND%20text%20~%20%22python%20cluster%22
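Once you are on Spark 1.4 or later, a submission along these lines should place the driver inside a YARN container rather than on the gateway machine. This is a minimal sketch: the script name `app.py` and the resource settings are placeholders, not values from this thread.

```shell
# Sketch: submit a PySpark application in YARN cluster mode, so the
# driver runs in the ApplicationMaster's YARN container instead of on
# the client. app.py and the resource sizes below are illustrative.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  app.py
```

On Spark 1.3, the same flags are accepted for Scala/Java applications, but spark-submit rejects cluster deploy mode for Python applications, which matches the behavior you are seeing.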
On Thu, Sep 10, 2015 at 6:54 AM, roy <rp...@njit.edu> wrote:
> Hi,
>
> Is there any way to make the Spark driver run inside YARN containers
> rather than on the gateway/client machine?
>
> At present, even with the config parameters --master yarn and
> --deploy-mode cluster, the driver runs on the gateway/client machine.
>
> We are on CDH 5.4.1 with YARN and Spark 1.3.
>
> Any help on this?
>
> Thanks
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/pyspark-driver-in-cluster-rather-than-gateway-client-tp24641.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> ---------------------------------------------------------------------