You can build your uber jar on an NFS-mounted file system that is accessible to all nodes in the cluster. Any node can then run spark-submit and launch the app, referring to the jar at its NFS path.
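For example, something along these lines should work (the mount point, master URL, and class name below are just placeholders for illustration):

  /opt/spark/bin/spark-submit \
    --class com.example.MyApp \
    --master spark://master-host:7077 \
    /mnt/nfs/jobs/my-app-assembly-1.0.jar

Since the jar path resolves the same way from every node, the identical command can be issued from any of them.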
Sounds doable.
Having thought about it, it is also feasible to place the Spark binaries on the NFS mount, so any host can run spark-submit without a local Spark installation.
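A rough sketch of that setup, with the mount point and Spark directory name made up for illustration:

  export SPARK_HOME=/mnt/nfs/spark-3.5.1-bin-hadoop3
  export PATH=$SPARK_HOME/bin:$PATH
  spark-submit --version    # quick check that the shared install is usable from this host

Each host would still need a compatible Java installed locally.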
Hello,
Is there a way to run a spark-submit job that points to the URL of a jar file (instead of pushing the jar from the local machine)?
The documentation at http://spark.apache.org/docs/latest/submitting-applications.html implies that this may be possible.
"application-jar: Path to a bundled jar inc