Just FYI: what @Marcelo said fixed the issue for me.

On Fri, Mar 6, 2015 at 7:11 AM, Sean Owen <so...@cloudera.com> wrote:

> -Pscala-2.11 and -Dscala-2.11 happen to do the same thing for this
> profile, since the profile is activated by a property of the same name.
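>
> For example, assuming the scala-2.11 profile is property-activated in
> the parent POM (which is how Spark's build appears to define it), either
> of these should select it:
>
>   mvn -Pscala-2.11 -DskipTests clean install
>   mvn -Dscala-2.11 -DskipTests clean install
>
> (IIRC the 1.3 docs also have you run dev/change-version-to-2.11.sh
> before a 2.11 build, which rewrites the _2.10 artifact suffixes in the
> POMs; the _2.10/_2.11 mix in your error suggests that step may have
> been skipped.)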
>
> Why are you running "install package" and not just "install"? It
> probably doesn't matter, since the install phase already runs package.
>
> This sounds like you are trying to build only core without building
> everything else, which you can't do in general unless you have already
> built and installed those snapshot artifacts locally.
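>
> As a sketch of what that looks like (2.11 property as above; "core" is
> the module directory name, assumed from your log):
>
>   # install every module's snapshot into the local repository first
>   mvn -Dscala-2.11 -DskipTests clean install
>   # after that, a single-module build can resolve its sibling snapshots
>   mvn -Dscala-2.11 -DskipTests -pl core install
>
> If Maven has already cached the failed snapshot lookup (as your error
> suggests), add -U to force it to re-check remote snapshots.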
>
> On Fri, Mar 6, 2015 at 12:46 AM, Night Wolf <nightwolf...@gmail.com> wrote:
> > Hey guys,
> >
> > Trying to build Spark 1.3 for Scala 2.11.
> >
> > I'm running with the following Maven command:
> >
> > -DskipTests -Dscala-2.11 clean install package
> >
> >
> > Exception:
> >
> > [ERROR] Failed to execute goal on project spark-core_2.10: Could not
> > resolve dependencies for project
> > org.apache.spark:spark-core_2.10:jar:1.3.0-SNAPSHOT: The following
> > artifacts could not be resolved:
> > org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT,
> > org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT:
> > Failure to find
> > org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT in
> > http://repository.apache.org/snapshots was cached in the local
> > repository, resolution will not be reattempted until the update
> > interval of apache.snapshots has elapsed or updates are forced -> [Help 1]
> >
> >
> > I see these warnings in the log before this error:
> >
> >
> > [INFO]
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Building Spark Project Core 1.3.0-SNAPSHOT
> > [INFO] ------------------------------------------------------------------------
> > [WARNING] The POM for
> > org.apache.spark:spark-network-common_2.11:jar:1.3.0-SNAPSHOT is
> > missing, no dependency information available
> > [WARNING] The POM for
> > org.apache.spark:spark-network-shuffle_2.11:jar:1.3.0-SNAPSHOT is
> > missing, no dependency information available
> >
> >
> > Any ideas?
>
