Hi,

After running

$ ./bin/start-cluster.sh

the following line of code defaults the JobManager address to localhost:6123:

final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

which is similar to what I do in Spark when running locally:

val spark =
  SparkSession.builder.master("local[*]").appName("anapp").getOrCreate()

However, if I wish to run the servers on a different physical computer, then in Spark I can do it this way, using the Spark master URL from my IDE:

val conf = new SparkConf().setMaster("spark://<hostip>:<port>").setAppName("anapp")
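
For completeness, this is roughly how I wire that conf into the driver I run from the IDE. The host, port, and app name are placeholders, and the object name is just for illustration:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object AnApp {
  def main(args: Array[String]): Unit = {
    // The driver runs in my IDE; the master URL points at the machine
    // hosting the Spark standalone master. <hostip> and <port> are placeholders.
    val conf = new SparkConf()
      .setMaster("spark://<hostip>:<port>")
      .setAppName("anapp")

    val spark = SparkSession.builder.config(conf).getOrCreate()

    // Trivial action just to confirm the remote connection works.
    spark.range(10).show()

    spark.stop()
  }
}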

Can you please tell me the equivalent change to make in Flink so that I can run my servers and my IDE on different physical computers?
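
To make the question concrete, below is a sketch of the shape of solution I am after on the Flink side (shown with the Scala API to match my Spark snippets). The method name createRemoteEnvironment is my guess at what such an API might look like, so please correct me if the actual approach is different; the host, port, and jar path are placeholders:

import org.apache.flink.api.scala._

object RemoteJobSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical sketch: connect from the IDE to a JobManager running on
    // another machine. "<hostip>", 6123, and the jar path are placeholders;
    // I have not verified that this is the intended way to do it.
    val env = ExecutionEnvironment.createRemoteEnvironment(
      "<hostip>", 6123, "/path/to/my-job.jar")

    // A trivial job just to confirm the remote connection works.
    env.fromElements(1, 2, 3).map(_ * 2).print()
  }
}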
