Some more updates:
Now I tried setting spark.driver.host to the Spark master node and
spark.driver.port to 51800 (an available open port), but it fails with a bind
error. I was hoping it would start the driver on the supplied host:port, and
since it's a Unix node there should not be any issue.
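For reference, this is roughly the kind of submission I mean (host names, port, and application file are placeholders, not the actual values from my setup):

```shell
# Sketch of submitting with an explicitly pinned driver endpoint.
# "spark-master" and my_app.py are placeholder names.
spark-submit \
  --master spark://spark-master:7077 \
  --conf spark.driver.host=spark-master \
  --conf spark.driver.port=51800 \
  my_app.py
```

One likely cause of the bind error: in the default client deploy mode the driver process runs on the machine where spark-submit is invoked (the Windows desktop), so spark.driver.host must be an address that machine can bind locally. Pointing it at the master node's address makes the driver try to bind an address belonging to another host, which fails.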
Can you please help?
Hello,
I am able to submit a job to a Spark cluster from a Windows desktop, but the
executors are not able to run.
When I check the Spark UI (which is on Windows, since the driver runs there),
it shows me JAVA_HOME, CLASS_PATH, and other Windows-related environment
variables.
I tried setting spark.executor.e
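In case it helps the discussion: Spark's documented way to override environment variables on the executor side is the spark.executorEnv.[EnvironmentVariableName] family of properties. A sketch, with placeholder paths and host names:

```shell
# Sketch: forcing executors to use the cluster's Unix-side paths
# instead of the Windows driver's environment. The JAVA_HOME path
# and host name here are assumptions, not values from my cluster.
spark-submit \
  --master spark://spark-master:7077 \
  --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-8-openjdk \
  my_app.py
```

These values are passed to the executor processes when they launch, so they should take precedence over whatever the driver's (Windows) environment reports.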