Hi,
Following the Spark build documentation, which gives this example:

# Cloudera CDH 4.2.0
mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.2.0 -DskipTests clean package

I compiled Spark 1.0.2 with this command:

mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.6.0 -DskipTests clean package

However, the build fails with two errors:
[INFO] Compiling 14 Scala sources to /Users/liuyufan/Develop/github/apache/spark-1.0.2/yarn/alpha/target/scala-2.10/classes...
[ERROR] /Users/liuyufan/Develop/github/apache/spark-1.0.2/yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala:36: object AMResponse is not a member of package org.apache.hadoop.yarn.api.records
[ERROR] import org.apache.hadoop.yarn.api.records.{AMResponse, ApplicationAttemptId}
[ERROR] ^
[ERROR] /Users/liuyufan/Develop/github/apache/spark-1.0.2/yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala:114: value getAMResponse is not a member of org.apache.hadoop.yarn.api.protocolrecords.AllocateResponse
[ERROR] val amResp = allocateExecutorResources(executorsToRequest).getAMResponse
[ERROR] ^
[ERROR] two errors found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [2.243s]
[INFO] Spark Project YARN Parent POM ..................... SUCCESS [4.139s]
[INFO] Spark Project YARN Alpha API ...................... FAILURE [8.906s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15.587s
[INFO] Finished at: Thu Aug 07 23:06:21 CST 2014
[INFO] Final Memory: 21M/145M
[INFO] ------------------------------------------------------------------------
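For what it's worth, the two errors look like an API mismatch: Spark's yarn-alpha sources expect an AllocateResponse that wraps an AMResponse, while the YARN classes pulled in by -Dhadoop.version=2.0.0-cdh4.6.0 apparently have no AMResponse class at all. Below is a minimal, self-contained Scala sketch of that mismatch. It uses stub traits only, not the real YARN API; any member names beyond the ones shown in the errors above are made up for illustration.

// Stub of the alpha-era API shape that Spark 1.0.2's yarn-alpha code compiles against.
object AlphaApi {
  trait AMResponse {
    def getAllocatedContainers: java.util.List[String]  // placeholder payload type
  }
  trait AllocateResponse {
    def getAMResponse: AMResponse  // the call flagged at YarnAllocationHandler.scala:114
  }
}

// Stub of the shape the CDH 4.6.0 jars seem to expose: no AMResponse class,
// so code written against AlphaApi no longer type-checks.
object NewerApi {
  trait AllocateResponse {
    def getAllocatedContainers: java.util.List[String]
  }
}

object Mismatch {
  // Compiles fine against the alpha-style stub:
  def handleAlpha(resp: AlphaApi.AllocateResponse): Unit =
    println(resp.getAMResponse.getAllocatedContainers)

  // The same call against NewerApi.AllocateResponse would fail exactly like
  // the build above: "value getAMResponse is not a member of ... AllocateResponse".
}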
Can anyone help solve this?