Hi, I am Sophia. I followed a blog from the Internet to configure and test Spark on YARN; the cluster is configured with Hadoop 2.0.0-CDH4. The Spark version is 0.9.1 and the Scala version is 2.11.0-RC4.
I ran the following to build the assembly:

  cd spark-0.9.1
  SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 SPARK_YARN=true sbt/sbt assembly

This does not work; it fails with "Invalid or corrupt sbt/sbt-launch-0.12.4.jar". What can I do about that?

Also, what should I do with the script below from the blog? Should I put it into a #!/bin/bash script file before launching the job?

  #!/bin/bash
  export YARN_CONF_DIR=/opt/hadoop-2.2.0/etc/hadoop
  export SPARK_JAR=./assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar \
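For reference, the trailing backslash makes me think the blog's script continues into the actual launch command. If I am reading the Spark 0.9.1 running-on-YARN guide correctly, the full invocation would look roughly like the sketch below; the example jar, class, and resource numbers are placeholders taken from the guide, not from my cluster:

  # submit the SparkPi example to YARN in yarn-standalone (cluster) mode
  SPARK_JAR=./assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar \
    ./bin/spark-class org.apache.spark.deploy.yarn.Client \
      --jar examples/target/scala-2.10/spark-examples-assembly-0.9.1.jar \
      --class org.apache.spark.examples.SparkPi \
      --args yarn-standalone \
      --num-workers 2 \
      --master-memory 2g \
      --worker-memory 1g \
      --worker-cores 1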
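Coming back to the sbt failure: my guess is that the launcher jar was truncated while sbt/sbt downloaded it the first time. I was planning to check its size, delete it, and let sbt/sbt fetch it again; this assumes the script re-downloads the launcher when the jar is missing, which is how I understand it:

  ls -l sbt/sbt-launch-0.12.4.jar    # a truncated download is usually only a few KB
  rm sbt/sbt-launch-0.12.4.jar       # remove the corrupt jar so sbt/sbt fetches it again
  SPARK_HADOOP_VERSION=2.0.0-cdh4.2.1 SPARK_YARN=true sbt/sbt assembly

Is that the right way to go about it?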