Hi,

I have started a Spark cluster on EC2 using the Spark Standalone cluster
manager, but Spark identifies the worker nodes by their hostnames, which
are not publicly accessible.

So when I try to submit jobs from Eclipse, the submission fails. Is there
some way to make Spark use IP addresses instead of hostnames?
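
For context, this is roughly how I am creating the SparkContext from
Eclipse; the IPs below are placeholders, just to show what I am trying to
do (connect to the master by IP and advertise the driver by an address the
workers can reach):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of the driver setup from Eclipse; the IPs are placeholders.
    val conf = new SparkConf()
      .setAppName("MyJob")
      // connect to the standalone master by its public IP, not its hostname
      .setMaster("spark://203.0.113.10:7077")
      // advertise the driver by an IP the workers can actually reach
      .set("spark.driver.host", "198.51.100.5")
    val sc = new SparkContext(conf)

Even with this, the workers still seem to be reported by their internal
hostnames, which the driver running in Eclipse cannot resolve.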

I have used IP addresses in the slaves file.
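
For reference, conf/slaves just lists the workers by IP; I am not sure
whether I also need to force the bind address with something like
SPARK_LOCAL_IP in conf/spark-env.sh (the addresses below are placeholders):

    # conf/slaves -- one worker per line, listed by private IP
    10.0.0.11
    10.0.0.12

    # conf/spark-env.sh -- possibly needed to bind/advertise by IP?
    export SPARK_LOCAL_IP=10.0.0.11   # set per node
    export SPARK_MASTER_IP=10.0.0.10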

Thanks
Ankur
