You are receiving this mail because a port that you maintain is failing to build on the FreeBSD package build server. Please investigate the failure and submit a PR to fix the build.
Maintainer:     de...@freebsd.org
Last committer: m...@freebsd.org
Ident:          $FreeBSD: head/devel/hadoop2/Makefile 412346 2016-04-01 14:00:51Z mat $
Log URL:        http://beefy3.nyi.freebsd.org/data/head-i386-default/p412792_s297729/logs/hadoop2-2.7.2.log
Build URL:      http://beefy3.nyi.freebsd.org/build.html?mastername=head-i386-default&build=p412792_s297729
Log:

====>> Building devel/hadoop2
build started at Sat Apr 9 12:45:39 UTC 2016
port directory: /usr/ports/devel/hadoop2
building for: FreeBSD head-i386-default-job-21 11.0-CURRENT FreeBSD 11.0-CURRENT r297729 i386
maintained by: de...@freebsd.org
Makefile ident: $FreeBSD: head/devel/hadoop2/Makefile 412346 2016-04-01 14:00:51Z mat $
Poudriere version: 3.1.12
Host OSVERSION: 1100102
Jail OSVERSION: 1100105

!!! Jail is newer than host. (Jail: 1100105, Host: 1100102) !!!
!!! This is not supported. !!!
!!! Host kernel must be same or newer than jail. !!!
!!! Expect build failures. !!!

---Begin Environment---
SHELL=/bin/csh
UNAME_p=i386
UNAME_m=i386
UNAME_v=FreeBSD 11.0-CURRENT r297729
UNAME_r=11.0-CURRENT
BLOCKSIZE=K
MAIL=/var/mail/root
STATUS=1
OPSYS=FreeBSD
ARCH=i386
LINUX_OSRELEASE=2.6.32
SAVED_TERM=
MASTERMNT=/usr/local/poudriere/data/.m/head-i386-default/ref
UID=0
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin:/root/bin
_JAVA_VERSION_LIST_REGEXP=1.6\|1.7\|1.8\|1.6+\|1.7+\|1.8+
POUDRIERE_BUILD_TYPE=bulk
PKGNAME=hadoop2-2.7.2
OSREL=11.0
_OSRELEASE=11.0-CURRENT
PYTHONBASE=/usr/local
OLDPWD=/
_SMP_CPUS=24
PWD=/usr/local/poudriere/data/.m/head-i386-default/ref/.p/pool
MASTERNAME=head-i386-default
SCRIPTPREFIX=/usr/local/share/poudriere
_JAVA_VENDOR_LIST_REGEXP=openjdk\|oracle\|sun
USER=root
HOME=/root
POUDRIERE_VERSION=3.1.12
SCRIPTPATH=/usr/local/share/poudriere/bulk.sh
CONFIGURE_MAX_CMD_LEN=262144
LIBEXECPREFIX=/usr/local/libexec/poudriere
LOCALBASE=/usr/local
PACKAGE_BUILDING=yes
_JAVA_OS_LIST_REGEXP=native\|linux
OSVERSION=1100105
---End Environment---

---Begin OPTIONS List---
===> The following configuration options are available for hadoop2-2.7.2:
     EXAMPLES=on: Build and/or install examples
===> Use 'make config' to modify these settings
---End OPTIONS List---

--CONFIGURE_ARGS--

--End CONFIGURE_ARGS--

--CONFIGURE_ENV--
XDG_DATA_HOME=/wrkdirs/usr/ports/devel/hadoop2/work
XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/hadoop2/work
HOME=/wrkdirs/usr/ports/devel/hadoop2/work
TMPDIR="/tmp"
SHELL=/bin/sh
CONFIG_SHELL=/bin/sh
--End CONFIGURE_ENV--

--MAKE_ENV--
JAVA_HOME=/usr/local/openjdk7
HADOOP_PROTOC_PATH=/usr/local/protobuf25/bin/protoc
XDG_DATA_HOME=/wrkdirs/usr/ports/devel/hadoop2/work
XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/hadoop2/work
HOME=/wrkdirs/usr/ports/devel/hadoop2/work
TMPDIR="/tmp"
NO_PIE=yes
WITHOUT_DEBUG_FILES=yes
WITHOUT_KERNEL_SYMBOLS=yes
SHELL=/bin/sh
NO_LINT=YES
PREFIX=/usr/local
LOCALBASE=/usr/local
LIBDIR="/usr/lib"
CC="cc"
CFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing"
CPP="cpp"
CPPFLAGS=""
LDFLAGS=" -fstack-protector"
LIBS=""
CXX="c++"
CXXFLAGS="-O2 -pipe -fstack-protector -fno-strict-aliasing "
MANPREFIX="/usr/local"
BSD_INSTALL_PROGRAM="install -s -m 555"
BSD_INSTALL_LIB="install -s -m 444"
BSD_INSTALL_SCRIPT="install -m 555"
BSD_INSTALL_DATA="install -m 0644"
BSD_INSTALL_MAN="install -m 444"
--End MAKE_ENV--

--PLIST_SUB--
PORTVERSION="2.7.2"
HADOOP_LOGDIR="/var/log/hadoop"
HADOOP_RUNDIR="/var/run/hadoop"
HDFS_USER="hdfs"
MAPRED_USER="mapred"
HADOOP_GROUP="hadoop"
JAVASHAREDIR="share/java"
JAVAJARDIR="share/java/classes"
OSREL=11.0
PREFIX=%D
LOCALBASE=/usr/local
RESETPREFIX=/usr/local
PORTDOCS=""
PORTEXAMPLES=""
LIB32DIR=lib
DOCSDIR="share/doc/hadoop"
EXAMPLESDIR="share/examples/hadoop"
DATADIR="share/hadoop"
WWWDIR="www/hadoop"
ETCDIR="etc/hadoop"
--End PLIST_SUB--

--SUB_LIST--
HDFS_USER="hdfs"
MAPRED_USER="mapred"
HADOOP_GROUP="hadoop"
JAVA_HOME="/usr/local/openjdk7"
HADOOP_LOGDIR="/var/log/hadoop"
HADOOP_RUNDIR="/var/run/hadoop"
JAVASHAREDIR="/usr/local/share/java"
JAVAJARDIR="/usr/local/share/java/classes"
JAVALIBDIR="/usr/local/share/java/classes"
JAVA_VERSION="1.7"
PREFIX=/usr/local
LOCALBASE=/usr/local
DATADIR=/usr/local/share/hadoop
DOCSDIR=/usr/local/share/doc/hadoop
EXAMPLESDIR=/usr/local/share/examples/hadoop
WWWDIR=/usr/local/www/hadoop
ETCDIR=/usr/local/etc/hadoop
--End SUB_LIST--

---Begin make.conf---
MACHINE=i386
MACHINE_ARCH=i386
ARCH=${MACHINE_ARCH}
USE_PACKAGE_DEPENDS=yes
BATCH=yes
WRKDIRPREFIX=/wrkdirs
PORTSDIR=/usr/ports
PACKAGES=/packages
DISTDIR=/distfiles
#### /usr/local/etc/poudriere.d/make.conf ####
DISABLE_MAKE_JOBS=poudriere
---End make.conf---

=======================<phase: check-sanity   >============================
===> License APACHE20 accepted by the user
===========================================================================
=======================<phase: pkg-depends    >============================
===> hadoop2-2.7.2 depends on file: /usr/local/sbin/pkg - not found
===> Installing existing package /packages/All/pkg-1.7.2.txz
[head-i386-default-job-21] Installing pkg-1.7.2...
[head-i386-default-job-21] Extracting pkg-1.7.2: .......... done
===> hadoop2-2.7.2 depends on file: /usr/local/sbin/pkg - found
===> Returning to build of hadoop2-2.7.2
===========================================================================
=======================<phase: fetch-depends  >============================
===========================================================================
=======================<phase: fetch          >============================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by hadoop2-2.7.2 for building
===========================================================================
=======================<phase: checksum       >============================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by hadoop2-2.7.2 for building
=> SHA256 Checksum OK for hadoop/hadoop-2.7.2-src.tar.gz.
=> SHA256 Checksum OK for hadoop/FreeBSD-hadoop2-2.7.2-maven-repository.tar.gz.
=> SHA256 Checksum OK for hadoop/apache-tomcat-6.0.44.tar.gz.
=> SHA256 Checksum OK for hadoop/jetty-6.1.14.zip.
===========================================================================
=======================<phase: extract-depends>============================
===========================================================================
=======================<phase: extract        >============================
===> License APACHE20 accepted by the user
===> Fetching all distfiles required by hadoop2-2.7.2 for building
===> Extracting for hadoop2-2.7.2
=> SHA256 Checksum OK for hadoop/hadoop-2.7.2-src.tar.gz.
=> SHA256 Checksum OK for hadoop/FreeBSD-hadoop2-2.7.2-maven-repository.tar.gz.
=> SHA256 Checksum OK for hadoop/apache-tomcat-6.0.44.tar.gz.
=> SHA256 Checksum OK for hadoop/jetty-6.1.14.zip.
===========================================================================
=======================<phase: patch-depends  >============================
===========================================================================
=======================<phase: patch          >============================
===> Patching for hadoop2-2.7.2
===> Applying FreeBSD patches for hadoop2-2.7.2
/usr/bin/sed -i.bak -e "s#/bin/bash#/usr/local/bin/bash#" /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRJobConfig.java /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/Shell.java /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh
===========================================================================
=======================<phase: build-depends  >============================
===> hadoop2-2.7.2 depends on file: /usr/local/share/java/maven3/bin/mvn - not found
===> Installing existing package /packages/All/maven3-3.0.5.txz
[head-i386-default-job-21] Installing maven3-3.0.5...
[head-i386-default-job-21] `-- Installing maven-wrapper-1_2...
[head-i386-default-job-21] `-- Extracting maven-wrapper-1_2: . done
[head-i386-default-job-21] `-- Installing openjdk8-8.77.3...
[head-i386-default-job-21] |   `-- Installing giflib-5.1.3...
[head-i386-default-job-21] |   `-- Extracting giflib-5.1.3: .......... done
[head-i386-default-job-21] |   `-- Installing libXt-1.1.5,1...
[head-i386-default-job-21] |   | `-- Installing xproto-7.0.28...
[head-i386-default-job-21] |   | `-- Extracting xproto-7.0.28: .......... done
[head-i386-default-job-21] |   | `-- Installing libSM-1.2.2_3,1...
[head-i386-default-job-21] |   |   `-- Installing libICE-1.0.9_1,1...
[head-i386-default-job-21] |   |   `-- Extracting libICE-1.0.9_1,1: .......... done
[head-i386-default-job-21] |   | `-- Extracting libSM-1.2.2_3,1: .......... done
[head-i386-default-job-21] |   | `-- Installing libX11-1.6.3,1...
[head-i386-default-job-21] |   |   `-- Installing kbproto-1.0.7...
[head-i386-default-job-21] |   |   `-- Extracting kbproto-1.0.7: .......... done
[head-i386-default-job-21] |   |   `-- Installing libXdmcp-1.1.2...
[head-i386-default-job-21] |   |   `-- Extracting libXdmcp-1.1.2: ......... done
[head-i386-default-job-21] |   |   `-- Installing libxcb-1.11.1...
<snip>
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/org/apache/hadoop/crypto/key/kms/server/class-use/KMS.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/org/apache/hadoop/crypto/key/kms/server/class-use/KMS.KMSOp.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/org/apache/hadoop/crypto/key/kms/server/class-use/KMSJSONReader.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/org/apache/hadoop/crypto/key/kms/server/class-use/KMSJMXServlet.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/org/apache/hadoop/crypto/key/kms/server/package-use.html...
Building index for all the packages and classes...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/overview-tree.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/index-all.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/deprecated-list.html...
Building index for all classes...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/allclasses-frame.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/allclasses-noframe.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/index.html...
Generating /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/help-doc.html...
[INFO] Building jar: /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/hadoop-kms/target/hadoop-kms-2.7.2-javadoc.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Common Project 2.7.2
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-common-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-common-project/target/test-dir
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-common-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-common-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-common-project ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-common-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-common-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 2.7.2
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/target/test-dir
    [mkdir] Created dir: /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/target/test/data
[INFO] Executed tasks
[INFO]
[INFO] --- hadoop-maven-plugins:2.7.2:protoc (compile-protoc) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-hdfs ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 678 source files to /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/target/classes
The system is out of resources.
Consult the following stack trace for details.
java.lang.OutOfMemoryError: Java heap space
    at com.sun.tools.javac.util.SharedNameTable.fromUtf(SharedNameTable.java:138)
    at com.sun.tools.javac.util.Names.fromUtf(Names.java:296)
    at com.sun.tools.javac.util.ByteBuffer.toName(ByteBuffer.java:165)
    at com.sun.tools.javac.jvm.ClassWriter.typeSig(ClassWriter.java:425)
    at com.sun.tools.javac.jvm.ClassWriter.writePool(ClassWriter.java:511)
    at com.sun.tools.javac.jvm.ClassWriter.writeClassFile(ClassWriter.java:1579)
    at com.sun.tools.javac.jvm.ClassWriter.writeClass(ClassWriter.java:1454)
    at com.sun.tools.javac.main.JavaCompiler.genCode(JavaCompiler.java:713)
    at com.sun.tools.javac.main.JavaCompiler.generate(JavaCompiler.java:1451)
    at com.sun.tools.javac.main.JavaCompiler.generate(JavaCompiler.java:1419)
    at com.sun.tools.javac.main.JavaCompiler.compile2(JavaCompiler.java:870)
    at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:829)
    at com.sun.tools.javac.main.Main.compile(Main.java:439)
    at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:132)
    at org.codehaus.plexus.compiler.javac.JavaxToolsCompiler.compileInProcess(JavaxToolsCompiler.java:126)
    at org.codehaus.plexus.compiler.javac.JavacCompiler.performCompile(JavacCompiler.java:169)
    at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:785)
    at org.apache.maven.plugin.compiler.CompilerMojo.execute(CompilerMojo.java:129)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:[29,6] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[53,24] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[55,18] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[57,17] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[59,15] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[INFO] 5 warnings
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.593s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.469s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [4.460s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.350s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.839s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [4.429s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [3.649s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [8.549s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [3.998s]
[INFO] Apache Hadoop Common .............................. SUCCESS [3:08.948s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [5.753s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [14.099s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.049s]
[INFO] Apache Hadoop HDFS ................................ FAILURE [3:48.080s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] hadoop-yarn ....................................... SKIPPED
[INFO] hadoop-yarn-api ................................... SKIPPED
[INFO] hadoop-yarn-common ................................ SKIPPED
[INFO] hadoop-yarn-server ................................ SKIPPED
[INFO] hadoop-yarn-server-common ......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
[INFO] hadoop-yarn-server-tests .......................... SKIPPED
[INFO] hadoop-yarn-client ................................ SKIPPED
[INFO] hadoop-yarn-server-sharedcachemanager ............. SKIPPED
[INFO] hadoop-yarn-applications .......................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SKIPPED
[INFO] hadoop-yarn-site .................................. SKIPPED
[INFO] hadoop-yarn-registry .............................. SKIPPED
[INFO] hadoop-yarn-project ............................... SKIPPED
[INFO] hadoop-mapreduce-client ........................... SKIPPED
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED
[INFO] hadoop-mapreduce-client-common .................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Ant Tasks ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support ......... SKIPPED
[INFO] Apache Hadoop Azure support ....................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7:50.624s
[INFO] Finished at: Sat Apr 09 12:54:08 GMT 2016
[INFO] Final Memory: 61M/255M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-hdfs: Compilation failure: Compilation failure:
[ERROR] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:[29,6] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[ERROR] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[53,24] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[ERROR] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[55,18] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[ERROR] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[57,17] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[ERROR] /wrkdirs/usr/ports/devel/hadoop2/work/hadoop-2.7.2-src/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java:[59,15] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
*** Error code 1

Stop.
make: stopped in /usr/ports/devel/hadoop2
_______________________________________________
freebsd-pkg-fallout@freebsd.org mailing list
https://lists.freebsd.org/mailman/listinfo/freebsd-pkg-fallout
To unsubscribe, send any mail to "freebsd-pkg-fallout-unsubscr...@freebsd.org"
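
A note on the failure itself: the compile died with java.lang.OutOfMemoryError: Java heap space while javac ran in-process inside the Maven JVM (see the "Final Memory: 61M/255M" line, i.e. roughly a 256 MB heap ceiling on this i386 builder), and the sun.misc.Unsafe lines listed under [ERROR] appear to be the same warnings reported above rather than independent errors. One possible mitigation would be to give Maven a larger heap via MAVEN_OPTS, which the standard mvn launcher reads. The fragment below is only a sketch against the port's Makefile: the MAVEN_OPTS addition and the 1024m figure are illustrative assumptions, not taken from the port, and it assumes MAKE_ENV reaches the mvn invocation (the JAVA_HOME and HADOOP_PROTOC_PATH entries in the --MAKE_ENV-- block above are passed the same way).

# Hypothetical fragment for devel/hadoop2/Makefile: raise the Maven JVM heap
# so the in-process javac pass over hadoop-hdfs (678 source files) does not
# exhaust the small default heap on i386.  MAVEN_OPTS is the standard
# environment knob read by the mvn launcher; 1024m is a guess to be tuned.
MAKE_ENV+=	MAVEN_OPTS=-Xmx1024m

An alternative would be configuring maven-compiler-plugin to fork javac with its own maxmem instead of growing the outer Maven JVM; either way, the change should be verified with a test build (for example via poudriere testport) before submitting the PR.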