Hey Andrew,
Thanks for the response. Is this the issue you're referring to (the
duplicate linked there has an associated patch)?
https://issues.apache.org/jira/browse/SPARK-5162
Just to confirm that I understand this: with this patch, Python jobs can be
submitted to YARN in cluster mode, and a node from the cluster runs the
driver?

Hi Chris,
Short answer is no, not yet.
The longer answer is that PySpark currently supports only client deploy
mode, which means the driver runs on the same machine as your submission
client. As a corollary, the submission client must have all of Spark and
its dependencies installed locally. There is a patch in progress that adds
cluster mode support for PySpark on YARN.
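
For reference, this is roughly what the two deploy modes look like with
spark-submit (my_job.py here is just a placeholder script name):

    # Client mode -- the only mode PySpark supports today. The driver runs
    # inside this spark-submit process, so the submitting machine needs a
    # full Spark installation.
    spark-submit --master yarn --deploy-mode client my_job.py

    # Cluster mode -- what the patch above is about. The driver would run
    # inside the YARN application master on a cluster node instead, but
    # spark-submit currently rejects this for Python applications.
    spark-submit --master yarn --deploy-mode cluster my_job.py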