If you're not using spark-submit, then spark.submit.deployMode does nothing. If by "context creation API" you mean "new SparkContext()" or an equivalent, then you're explicitly creating the driver inside your application: the JVM that instantiates the context is the driver, so it can't be placed on the cluster.
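For concreteness, a minimal sketch (the master URL, jar path, and main class below are placeholders, not taken from your setup). Creating the context directly always makes the calling JVM the driver:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("spark://master:7077")          // placeholder master URL
      .setAppName("name")
      .set("spark.submit.deployMode", "cluster") // silently ignored here
    val sc = new SparkContext(conf)              // driver = this JVM

If you want cluster deploy mode from a server without shelling out to spark-submit yourself, SparkLauncher is the programmatic wrapper around it, and it does honor the deploy mode:

    import org.apache.spark.launcher.SparkLauncher

    val handle = new SparkLauncher()
      .setMaster("spark://master:7077")         // placeholder master URL
      .setDeployMode("cluster")                 // honored: launcher invokes spark-submit
      .setAppResource("/path/to/your-app.jar")  // placeholder jar path
      .setMainClass("com.example.YourApp")      // placeholder main class
      .startApplication()                       // returns a SparkAppHandle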
On Tue, Mar 26, 2019 at 1:56 PM Pat Ferrel <p...@occamsmachete.com> wrote:
>
> I have a server that starts a Spark job using the context creation API. It
> DOES NOT use spark-submit.
>
> I set spark.submit.deployMode = “cluster”
>
> In the GUI I see 2 workers with 2 executors. The link for the running
> application “name” goes back to my server, the machine that launched the job.
>
> This is spark.submit.deployMode = “client” according to the docs. I set the
> driver to run on the cluster but it runs on the client, ignoring
> spark.submit.deployMode.
>
> Is this expected? It isn't documented anywhere I can find.

--
Marcelo