Re: Can't run Spark java code from command line

2015-01-13 Thread Ye Xianjin
There is no binding issue here. Spark picks the right IP, 10.211.55.3, for you; the printed message is just informational. I have no idea, however, why spark-shell hangs or stops.

Sent from my iPhone
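For reference, the printed message being discussed is most likely Spark's local-address warning emitted at startup. The wording below is an approximate Spark 1.x example, not a log taken from this thread, and the hostname and interface name are placeholders:

    WARN Utils: Your hostname, myworkstation resolves to a loopback address: 127.0.0.1;
      using 10.211.55.3 instead (on interface eth0)
    WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address

As the second line suggests, the warning only reports which address Spark chose and how to override it; it does not by itself indicate a failure.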

Re: Can't run Spark java code from command line

2015-01-13 Thread Akhil Das
It's just a binding issue with the hostnames in your /etc/hosts file. You can set SPARK_LOCAL_IP and SPARK_MASTER_IP in your conf/spark-env.sh file and restart your cluster (in that case the spark://myworkstation:7077 master URL will change to the IP address that you provided, e.g. spark://10.211.55.3). Thanks
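A minimal sketch of what that change could look like in conf/spark-env.sh, using the 10.211.55.3 address mentioned in this thread as a stand-in for the workstation's real address:

    # conf/spark-env.sh
    export SPARK_LOCAL_IP=10.211.55.3    # address Spark binds to locally
    export SPARK_MASTER_IP=10.211.55.3   # address the standalone master binds to and advertises

After restarting the standalone daemons (for example with sbin/stop-all.sh followed by sbin/start-all.sh), the master URL to use in the application would then be spark://10.211.55.3:7077.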

Can't run Spark java code from command line

2015-01-13 Thread jeremy p
Hello all, I wrote some Java code that uses Spark, but for some reason I can't run it from the command line. I am running Spark on a single node (my workstation). The program stops running after this line is executed: SparkContext sparkContext = new SparkContext("spark://myworkstation:7077", "s...
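For context, a minimal self-contained Java program of the kind described might look like the sketch below. The original post is truncated, so everything here except the spark://myworkstation:7077 master URL is a hypothetical placeholder (class name, app name, and the trivial job), not the poster's actual code:

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    // Hypothetical sketch: connect to the standalone master and run a trivial
    // job, so a hang during context creation is easy to distinguish from a
    // hang during job execution.
    public class SparkCommandLineTest {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setMaster("spark://myworkstation:7077") // or spark://10.211.55.3:7077, per the replies above
                    .setAppName("command-line-test");        // placeholder app name
            JavaSparkContext sc = new JavaSparkContext(conf);

            long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
            System.out.println("count = " + count);

            sc.stop();
        }
    }

If a program like this stops at the JavaSparkContext line, the usual things to check are that the master URL and port match what the master web UI (http://<master>:8080 by default) reports, and that the driver's hostname resolves to an address the workers can reach.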