Thanks for your reply.

Is there a way to find the correct Hadoop profile name?
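In case it helps anyone else searching: the profile ids are defined in Spark's top-level pom.xml, so (assuming a standard Spark 2.0 source checkout) they can be listed directly. This is a rough sketch, not verified against every Spark version:

```shell
# Ask Maven for every profile defined in the top-level POM:
./build/mvn help:all-profiles -pl .

# Or pull the Hadoop profile ids straight out of pom.xml:
grep -o '<id>hadoop-[^<]*</id>' pom.xml

# Spark names its Hadoop profiles by minor version (e.g. hadoop-2.7) and
# takes the exact release via -Dhadoop.version, and Scala 2.11 is already
# the default in Spark 2.0, so the failing command would presumably become:
./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0 \
  -Phive -Phive-thriftserver -DskipTests clean package
```

(The PKIX error in the log looks like a separate issue: a proxy or firewall intercepting https://repo1.maven.org, as Sean said.)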

On Fri, Jul 29, 2016 at 7:06 AM, Sean Owen <so...@cloudera.com> wrote:

> You have at least two problems here: wrong Hadoop profile name, and
> some kind of firewall interrupting access to the Maven repo. It's not
> related to Spark.
>
> On Thu, Jul 28, 2016 at 4:04 PM, Ascot Moss <ascot.m...@gmail.com> wrote:
> > Hi,
> >
> > I tried to build spark,
> >
> > (try 1)
> > mvn -Pyarn -Phadoop-2.7.0 -Dscala-2.11 -Dhadoop.version=2.7.0 -Phive
> > -Phive-thriftserver -DskipTests clean package
> >
> > [INFO] Spark Project Parent POM ........................... FAILURE [  0.658 s]
> >
> > [INFO] Spark Project Tags ................................. SKIPPED
> >
> > [INFO] Spark Project Sketch ............................... SKIPPED
> >
> > [INFO] Spark Project Networking ........................... SKIPPED
> >
> > [INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
> >
> > [INFO] Spark Project Unsafe ............................... SKIPPED
> >
> > [INFO] Spark Project Launcher ............................. SKIPPED
> >
> > [INFO] Spark Project Core ................................. SKIPPED
> >
> > [INFO] Spark Project GraphX ............................... SKIPPED
> >
> > [INFO] Spark Project Streaming ............................ SKIPPED
> >
> > [INFO] Spark Project Catalyst ............................. SKIPPED
> >
> > [INFO] Spark Project SQL .................................. SKIPPED
> >
> > [INFO] Spark Project ML Local Library ..................... SKIPPED
> >
> > [INFO] Spark Project ML Library ........................... SKIPPED
> >
> > [INFO] Spark Project Tools ................................ SKIPPED
> >
> > [INFO] Spark Project Hive ................................. SKIPPED
> >
> > [INFO] Spark Project REPL ................................. SKIPPED
> >
> > [INFO] Spark Project YARN Shuffle Service ................. SKIPPED
> >
> > [INFO] Spark Project YARN ................................. SKIPPED
> >
> > [INFO] Spark Project Hive Thrift Server ................... SKIPPED
> >
> > [INFO] Spark Project Assembly ............................. SKIPPED
> >
> > [INFO] Spark Project External Flume Sink .................. SKIPPED
> >
> > [INFO] Spark Project External Flume ....................... SKIPPED
> >
> > [INFO] Spark Project External Flume Assembly .............. SKIPPED
> >
> > [INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
> >
> > [INFO] Spark Project Examples ............................. SKIPPED
> >
> > [INFO] Spark Project External Kafka Assembly .............. SKIPPED
> >
> > [INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
> >
> > [INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
> >
> > [INFO] Spark Project Java 8 Tests ......................... SKIPPED
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] BUILD FAILURE
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] Total time: 1.090 s
> >
> > [INFO] Finished at: 2016-07-29T07:01:57+08:00
> >
> > [INFO] Final Memory: 30M/605M
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [WARNING] The requested profile "hadoop-2.7.0" could not be activated
> > because it does not exist.
> >
> > [ERROR] Plugin org.apache.maven.plugins:maven-site-plugin:3.3 or one of its
> > dependencies could not be resolved: Failed to read artifact descriptor for
> > org.apache.maven.plugins:maven-site-plugin:jar:3.3: Could not transfer
> > artifact org.apache.maven.plugins:maven-site-plugin:pom:3.3 from/to central
> > (https://repo1.maven.org/maven2): sun.security.validator.ValidatorException:
> > PKIX path building failed:
> > sun.security.provider.certpath.SunCertPathBuilderException: unable to find
> > valid certification path to requested target -> [Help 1]
> >
> > [ERROR]
> >
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> >
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >
> > [ERROR]
> >
> > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> >
> > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
> >
> >
> > (try 2)
> >
> > ./build/mvn -DskipTests clean package
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] Reactor Summary:
> >
> > [INFO]
> >
> > [INFO] Spark Project Parent POM ........................... FAILURE [  0.653 s]
> >
> > [INFO] Spark Project Tags ................................. SKIPPED
> >
> > [INFO] Spark Project Sketch ............................... SKIPPED
> >
> > [INFO] Spark Project Networking ........................... SKIPPED
> >
> > [INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
> >
> > [INFO] Spark Project Unsafe ............................... SKIPPED
> >
> > [INFO] Spark Project Launcher ............................. SKIPPED
> >
> > [INFO] Spark Project Core ................................. SKIPPED
> >
> > [INFO] Spark Project GraphX ............................... SKIPPED
> >
> > [INFO] Spark Project Streaming ............................ SKIPPED
> >
> > [INFO] Spark Project Catalyst ............................. SKIPPED
> >
> > [INFO] Spark Project SQL .................................. SKIPPED
> >
> > [INFO] Spark Project ML Local Library ..................... SKIPPED
> >
> > [INFO] Spark Project ML Library ........................... SKIPPED
> >
> > [INFO] Spark Project Tools ................................ SKIPPED
> >
> > [INFO] Spark Project Hive ................................. SKIPPED
> >
> > [INFO] Spark Project REPL ................................. SKIPPED
> >
> > [INFO] Spark Project Assembly ............................. SKIPPED
> >
> > [INFO] Spark Project External Flume Sink .................. SKIPPED
> >
> > [INFO] Spark Project External Flume ....................... SKIPPED
> >
> > [INFO] Spark Project External Flume Assembly .............. SKIPPED
> >
> > [INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
> >
> > [INFO] Spark Project Examples ............................. SKIPPED
> >
> > [INFO] Spark Project External Kafka Assembly .............. SKIPPED
> >
> > [INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
> >
> > [INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
> >
> > [INFO] Spark Project Java 8 Tests ......................... SKIPPED
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] BUILD FAILURE
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [INFO] Total time: 1.067 s
> >
> > [INFO] Finished at: 2016-07-29T07:02:52+08:00
> >
> > [INFO] Final Memory: 29M/605M
> >
> > [INFO]
> > ------------------------------------------------------------------------
> >
> > [ERROR] Plugin org.apache.maven.plugins:maven-site-plugin:3.3 or one of its
> > dependencies could not be resolved: Failed to read artifact descriptor for
> > org.apache.maven.plugins:maven-site-plugin:jar:3.3: Could not transfer
> > artifact org.apache.maven.plugins:maven-site-plugin:pom:3.3 from/to central
> > (https://repo1.maven.org/maven2): sun.security.validator.ValidatorException:
> > PKIX path building failed:
> > sun.security.provider.certpath.SunCertPathBuilderException: unable to find
> > valid certification path to requested target -> [Help 1]
> >
> > [ERROR]
> >
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> >
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >
> > [ERROR]
> >
> > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> >
> > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
> >
> >
> > Any ideas? What is the correct build command for Spark 2.0 now?
> >
> >
> > Regards
> >
> >
> >
> >
>
