Spark currently supports two build systems, sbt and Maven.  sbt will
download the correct version of Scala automatically, but with Maven you
need to supply it yourself and set SCALA_HOME.
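
A minimal sketch of the Maven-side workaround, assuming you have already
unpacked a Scala 2.10.4 binary package somewhere (the path below is just
an example):

```shell
# Point Maven at a standalone Scala installation before building.
# The path is an example -- use wherever you unpacked Scala 2.10.4.
export SCALA_HOME="$HOME/scala-2.10.4"
echo "Using SCALA_HOME=$SCALA_HOME"
# Then, from the Spark source root:
#   mvn -DskipTests clean package
```

With sbt none of this is needed, since sbt fetches the Scala version
declared in the build itself.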

It sounds like the instructions need to be updated -- perhaps create a JIRA?

best,
Colin


On Sat, May 31, 2014 at 7:06 PM, Soren Macbeth <so...@yieldbot.com> wrote:

> Hello,
>
> Following the instructions for building spark 1.0.0, I encountered the
> following error:
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-antrun-plugin:1.7:run (default) on project
> spark-core_2.10: An Ant BuildException has occured: Please set the
> SCALA_HOME (or SCALA_LIBRARY_PATH if scala is on the path) environment
> variables and retry.
> [ERROR] around Ant part ...<fail message="Please set the SCALA_HOME (or
> SCALA_LIBRARY_PATH if scala is on the path) environment variables and
> retry.">... @ 6:126 in
> /Users/soren/src/spark-1.0.0/core/target/antrun/build-main.xml
>
> Nowhere in the documentation does it mention that Scala must be installed
> and either of these env vars set, nor which version should be installed.
> Setting these env vars wasn't required for 0.9.1 with sbt.
>
> I was able to get past it by downloading the scala 2.10.4 binary package to
> a temp dir and setting SCALA_HOME to that dir.
>
> Ideally, it would be nice to not have to require people to have a
> standalone scala installation but at a minimum this requirement should be
> documented in the build instructions no?
>
> -Soren
>
