Try running the spark-shell in standalone mode
(MASTER=spark://yourmasterurl:7077 $SPARK_HOME/bin/spark-shell) and do a
small count (val d = sc.parallelize(1 to 1000).count()). If that fails,
then something is wrong with your cluster setup, as it's saying
Connection refused: node001/10.180.49.2
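
For reference, a minimal sketch of that sanity check, run inside a spark-shell started against the standalone master (the master URL above is a placeholder and needs to be replaced with your own):

// Started with: MASTER=spark://yourmasterurl:7077 $SPARK_HOME/bin/spark-shell
// sc is the SparkContext the shell creates for you.
val d = sc.parallelize(1 to 1000)  // distribute a small range across the executors
val n = d.count()                  // forces a job on the cluster; should return 1000
println(n)

If even this tiny job fails with "Connection refused", the problem is in the cluster setup (master/worker connectivity) rather than in your application code.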
Hi,
I get exactly the same error. It runs on my local machine but not on the
cluster. I am running the pi.py example.
Best,
Tassilo
The worker side has an error message like this:
14/10/30 18:29:00 INFO Worker: Asked to launch executor
app-20141030182900-0006/0 for testspark_v1
14/10/30 18:29:01 INFO ExecutorRunner: Launch command: "java" "-cp"
"::/root/spark-1.1.0/conf:/root/spark-1.1.0/assembly/target/scala-2.10/spark-assembly-1.