Hi,

I have a private cluster with private IPs, 192.168.*.*, and a gateway node
with both private IP, 192.168.*.*, and public internet IP.

I set up the Spark master on the gateway node with SPARK_MASTER_IP set to
the private IP, and I start Spark workers on the private nodes. That works fine.

The problem is with spark-shell. I start it from the gateway node with
--master and --conf spark.driver.host both set to the private IP. The shell
starts fine, but when I try to run a job I get Connection refused errors
from the RDD operations.
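
For reference, the commands I run look roughly like this (192.168.1.10 is
a placeholder for the gateway's private IP; paths assume SPARK_HOME is set):

```shell
# On the gateway node: bind the standalone master to the private IP
export SPARK_MASTER_IP=192.168.1.10
"$SPARK_HOME"/sbin/start-master.sh

# On each private node: point the worker at the master's private IP
"$SPARK_HOME"/sbin/start-slave.sh spark://192.168.1.10:7077

# Back on the gateway node: the shell invocation that later fails
# with Connection refused when a job runs
"$SPARK_HOME"/bin/spark-shell \
  --master spark://192.168.1.10:7077 \
  --conf spark.driver.host=192.168.1.10
```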

I checked the Environment tab for the shell and noticed that
spark.fileserver.uri and spark.repl.class.uri both use the public IP of the
gateway. On the other hand, spark.driver.host uses the private IP as
expected.

Setting spark.fileserver.uri or spark.repl.class.uri with --conf does not
help. It seems these values are not read from the configuration but computed.

Thanks!
Rares
