Hi,
Remove
.setMaster("spark://spark-437-1-5963003:7077").set("spark.driver.host","11.104.29.106")
and start over.
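Something like this, as a minimal sketch (the app name is just a placeholder): leave the master and driver host out of the code and supply the master at submit time instead.

  import org.apache.spark.{SparkConf, SparkContext}

  // No setMaster or spark.driver.host here; pass the master on the
  // command line instead, e.g.
  //   spark-submit --master spark://spark-437-1-5963003:7077 ...
  val conf = new SparkConf().setAppName("MyApp")
  val sc = new SparkContext(conf)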
Can you also run the following command to verify that your Spark Standalone cluster is reachable:
run-example --master spark://spark-437-1-5963003:7077 SparkPi
Regards,
Jacek Laskowski
Hi,
I use Scala IDE for Eclipse. I usually run jobs against my local Spark
installation on my Mac, then export the jars, copy them to my company's
Spark cluster, and run spark-submit there.
This works fine.
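The submit step looks roughly like this (class name, jar name, and master URL are placeholders for my project and cluster):

  ./bin/spark-submit \
    --class com.example.MyJob \
    --master spark://<master-host>:7077 \
    my-job-assembly.jar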
But I want to run the jobs from Scala IDE directly against my company's
Spark cluster.
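What I am after is roughly something like this from inside the IDE (the master URL and jar path are placeholders for my environment):

  import org.apache.spark.{SparkConf, SparkContext}

  // Run the driver inside Scala IDE, point it at the standalone master,
  // and ship the application jar to the executors.
  val conf = new SparkConf()
    .setAppName("MyJob")
    .setMaster("spark://<master-host>:7077")
    .setJars(Seq("/path/to/my-job-assembly.jar"))
  val sc = new SparkContext(conf)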