Hi, while testing an example I ran into a problem running Scala on a cluster.
I searched for it on Google but couldn't solve it, and posting about it on
the Spark mailing list didn't help me solve the problem either.

The problem is that Spark runs fine in local mode, but when I run a Scala
program from the Spark examples in distributed mode, it fails with the errors
shown in the attached picture (pic1).

The following works fine:
$> ./bin/run-example org.apache.spark.examples.SparkPi local

but the following command fails with the error shown in the attached picture:
$> ./bin/run-example org.apache.spark.examples.SparkPi spark://MyHostIP:7077 
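(For reference, the 0.9.1 examples take the master URL as their first
argument and build the SparkContext from it. Below is a simplified sketch of
what SparkPi does with that argument; it is paraphrased from memory, not the
exact example source:)

import scala.math.random
import org.apache.spark.SparkContext

object SparkPi {
  def main(args: Array[String]) {
    // args(0) is the master URL: "local" in the first command,
    // "spark://MyHostIP:7077" in the second.
    val spark = new SparkContext(args(0), "SparkPi",
      System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass))
    val n = 100000
    val count = spark.parallelize(1 to n).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}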

I have attached some screenshots describing the setup; please take a look.
I hope you can help.

pic1: shows launching a Spark example application on the Hadoop cluster.
pic2: shows running a simple application named SimpleApp.class, which is
defined in test.scala.
pic3: shows the test.scala code, which is located in the
/home/exobrain/install/spark-0.9.1 directory.
pic4: shows runner.sbt (the sbt build file), located in the same directory
as test.scala. (Rough sketches of both files appear below, after the
picture links.)

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n4933/pic1.png> 
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n4933/pic2.png> 
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n4933/pic3.png> 
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n4933/pic4.png> 
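In case the pictures are hard to read: test.scala follows the pattern of the
Spark 0.9.1 quick-start SimpleApp, roughly as sketched below. This is a
simplified sketch, not a copy of the screenshot; the input file, jar name,
and master URL are placeholders standing in for my actual setup:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    // Placeholder input file; the real path is visible in pic3.
    val logFile = "/home/exobrain/install/spark-0.9.1/README.md"
    val sc = new SparkContext("spark://MyHostIP:7077", "Simple App",
      "/home/exobrain/install/spark-0.9.1",
      List("target/scala-2.10/simple-project_2.10-1.0.jar")) // placeholder jar
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

And runner.sbt is along the lines of the quick-start build file; the project
name and Scala version below are assumptions:

// runner.sbt -- sketch following the 0.9.1 quick start;
// project name and Scala version are assumptions.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"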


