I have a server that starts a Spark job programmatically via the context creation API. It does NOT use spark-submit.
I set spark.submit.deployMode = "cluster". In the web UI I see 2 workers with 2 executors, but the link for the running application "name" points back to my server, i.e. the machine that launched the job. According to the docs, that is spark.submit.deployMode = "client" behavior. So I asked for the driver to run on the cluster, yet it runs on the client, *ignoring spark.submit.deployMode*. Is this expected? I can't find it documented anywhere.
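For reference, a minimal sketch of how the context is created (the app name matches the one in the UI; the master URL is a placeholder for my standalone master):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("name")
  .setMaster("spark://master-host:7077")       // placeholder standalone master URL
  .set("spark.submit.deployMode", "cluster")   // requested, but seemingly ignored
// The driver appears to run in this JVM, on the launching machine:
val sc = new SparkContext(conf)
```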