I'm not sure whether it is exposed in the SBT build, but you may need the
equivalent of the 'yarn-alpha' profile from the Maven build. This
older CDH release predates the newer YARN APIs.
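For comparison, this is roughly how the profile is selected on the Maven
side (a sketch based on the Spark 1.0 Maven build docs; substitute your
exact CDH version):

```shell
# Build against an older, "alpha"-API YARN by selecting the yarn-alpha
# profile explicitly and pinning the matching Hadoop/CDH version.
mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.4.0 -DskipTests clean package
```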

See also 
https://groups.google.com/forum/#!msg/spark-users/T1soH67C5M4/CmGYV8kfRkcJ

Or use a later CDH; in fact, 4.6+ already ships a Spark parcel for you.

On Wed, Jun 4, 2014 at 10:13 AM, ch huang <[email protected]> wrote:
> hi, maillist:
>         I tried to compile Spark but it failed. Here are my compile command
> and the compile output:
>
>
>
> # SPARK_HADOOP_VERSION=2.0.0-cdh4.4.0 SPARK_YARN=true sbt/sbt assembly
>
> [warn] 18 warnings found
> [info] Compiling 53 Scala sources and 1 Java source to
> /home/admserver/spark-1.0.0/sql/catalyst/target/scala-2.10/classes...
> [info] Compiling 68 Scala sources and 2 Java sources to
> /home/admserver/spark-1.0.0/streaming/target/scala-2.10/classes...
> [info] Compiling 62 Scala sources and 1 Java source to
> /home/admserver/spark-1.0.0/mllib/target/scala-2.10/classes...
> [info] Compiling 14 Scala sources to
> /home/admserver/spark-1.0.0/yarn/alpha/target/scala-2.10/classes...
> [error]
> /home/admserver/spark-1.0.0/yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala:36:
> object AMResponse is not a member of package
> org.apache.hadoop.yarn.api.records
> [error] import org.apache.hadoop.yarn.api.records.{AMResponse,
> ApplicationAttemptId}
> [error]        ^
> [error]
> /home/admserver/spark-1.0.0/yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala:110:
> value getAMResponse is not a member of
> org.apache.hadoop.yarn.api.protocolrecords.AllocateResponse
> [error]     val amResp =
> allocateExecutorResources(executorsToRequest).getAMResponse
> [error]                                                                ^
> [error] two errors found
>
>
> [error] (yarn-alpha/compile:compile) Compilation failed
> [error] Total time: 1815 s, completed Jun 4, 2014 5:07:56 PM
