Thanks Petar and Akhil for the suggestions.
I changed SPARK_MASTER_IP to the internal IP, deleted the
"export SPARK_PUBLIC_DNS=xx" line in spark-env.sh, and also edited
/etc/hosts as Akhil suggested, and now it is working! However, I don't
know which change actually made it work.
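In case it helps anyone else, the relevant lines of my spark-env.sh on the
master now look roughly like this (the IP below is just a placeholder for
our actual internal IP):

    # spark-env.sh on the master
    export SPARK_MASTER_IP=10.0.0.1    # master's internal IP (placeholder)
    # removed: export SPARK_PUBLIC_DNS=xx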
Hi Akhil,
Thanks for the explanation! I could ping the worker from the master using
either the hostname or the internal IP, but I am a little confused about
why setting SPARK_LOCAL_IP would help.
Thanks!
Anny
On Tue, Mar 31, 2015 at 10:36 AM, Akhil Das wrote:
When you say you added , were you able to ping any
of these from the machine?
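For example, something like this (hostname and IP below are placeholders):

    ping -c 3 worker-1      # the worker's hostname from /etc/hosts
    ping -c 3 10.0.0.2      # the worker's internal IP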
You could try setting SPARK_LOCAL_IP on all machines, but make sure you
are able to bind to the host/IP specified there.
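Roughly like this, in spark-env.sh on every node (the address is a
placeholder for that machine's own internal IP):

    export SPARK_LOCAL_IP=10.0.0.2   # this machine's internal IP (placeholder)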
Thanks
Best Regards
On Tue, Mar 31, 2015 at 10:49 PM, Anny Chen wrote:
> Hi Akhil,
Did you try setting the SPARK_MASTER_IP parameter in spark-env.sh?
On 31.3.2015. 19:19, Anny Chen wrote:
Hi Akhil,
I tried editing /etc/hosts on the master and on the workers, but it doesn't
seem to be working for me.
I tried adding and it didn't work. I then tried
adding and it didn't work either. I guess I should
also edit the spark-env.sh file?
Thanks!
Anny
On Mon, Mar 30, 2015 at 11:15 PM, Akhil Das wrote:
You can add an internal-IP-to-public-hostname mapping in your /etc/hosts
file; if your forwarding is set up properly, it won't be a problem
thereafter.
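Something along these lines (both values below are placeholders):

    # /etc/hosts: <internal-ip> <public-hostname>
    10.0.0.5    ec2-54-0-0-5.compute-1.amazonaws.com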
Thanks
Best Regards
On Tue, Mar 31, 2015 at 9:18 AM, anny9699 wrote:
> Hi,
>
> For security reasons, we added a server between my AWS Spark Cluster