Hi, I am trying to set up the latest Hadoop trunk branch for development. To build the distribution I run *mvn clean install -DskipTests*, followed by *mvn package -Pdist,native,docs -DskipTests -Dtar*.
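For reference, this is the exact sequence I run from the root of my checkout (the working directory is my assumption; the commands themselves are as above):

    cd /vagrant/hadoop-trunk        # my checkout location, matching the paths further down
    mvn clean install -DskipTests
    mvn package -Pdist,native,docs -DskipTests -Dtar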
The Maven build passes, but I face the following error when trying to run *hdfs namenode -format*:

> Error: A JNI error has occurred, please check your installation and try again
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/protocol/ClientProtocol
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     at java.lang.Class.getDeclaredMethods0(Native Method)
>     at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
>     at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
>     at java.lang.Class.getMethod0(Class.java:3018)
>     at java.lang.Class.getMethod(Class.java:1784)
>     at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
>     at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.protocol.ClientProtocol
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     ... 19 more

I am using an Ubuntu Xenial Vagrant system. The following is the classpath generated when the *--debug* option is used (one entry per line for readability):

> CLASSPATH:
> /vagrant/hadoop-trunk/hadoop-common-project/hadoop-common/target/hadoop-common-3.2.0-SNAPSHOT/etc/hadoop
> /vagrant/hadoop-trunk/hadoop-common-project/hadoop-common/target/hadoop-common-3.2.0-SNAPSHOT/share/hadoop/common/lib/*
> /vagrant/hadoop-trunk/hadoop-common-project/hadoop-common/target/hadoop-common-3.2.0-SNAPSHOT/share/hadoop/common/*
> /vagrant/hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.2.0-SNAPSHOT/share/hadoop/hdfs
> /vagrant/hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.2.0-SNAPSHOT/share/hadoop/hdfs/lib/*
> /vagrant/hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.2.0-SNAPSHOT/share/hadoop/hdfs/*

and the HADOOP_OPTS:

> HADOOP_OPTS: -Djava.net.preferIPv4Stack=true
> -Dhdfs.audit.logger=INFO,NullAppender
> -Dhadoop.security.logger=INFO,RFAS
> -Dyarn.log.dir=/vagrant/hadoop-trunk/hadoop-dist/target/hadoop-3.2.0-SNAPSHOT/logs
> -Dyarn.log.file=hadoop.log
> -Dyarn.home.dir=/vagrant/hadoop-trunk/
> -Dyarn.root.logger=INFO,console
> -Djava.library.path=/vagrant/hadoop-trunk/hadoop-dist/target/hadoop-3.2.0-SNAPSHOT/lib/native
> -Dhadoop.log.dir=/vagrant/hadoop-trunk/hadoop-dist/target/hadoop-3.2.0-SNAPSHOT/logs
> -Dhadoop.log.file=hadoop.log
> -Dhadoop.home.dir=/vagrant/hadoop-trunk/hadoop-dist/target/hadoop-3.2.0-SNAPSHOT
> -Dhadoop.id.str=vagrant
> -Dhadoop.root.logger=INFO,console
> -Dhadoop.policy.file=hadoop-policy.xml

How can I go about fixing this error?

Thanks,
M.V.S.Chaitanya
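P.S. A check I am planning to run to see whether the missing class is actually packaged in any jar on that classpath (just a sketch; I am not sure which jar should contain ClientProtocol, so it scans both hdfs directories taken from the CLASSPATH above and uses the standard jar tool from the JDK):

    # Look for the missing class in every hdfs jar on the generated classpath
    for jar in \
        /vagrant/hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.2.0-SNAPSHOT/share/hadoop/hdfs/*.jar \
        /vagrant/hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.2.0-SNAPSHOT/share/hadoop/hdfs/lib/*.jar; do
      if jar tf "$jar" 2>/dev/null | grep -q 'org/apache/hadoop/hdfs/protocol/ClientProtocol.class'; then
        echo "found in $jar"
      fi
    done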