Great idea, Jie! That is a very common problem. I have documented it under
"A test fails with a NullPointerException in MiniDFSCluster"


On Wed, Nov 13, 2013 at 8:50 PM, Jie Jin <[email protected]> wrote:

> Hi, Brock
>
> When you update these pages, could you add the information that we need to
> set
> umask to 0022 to pass the unit test?
> In most Linux distributions, the default umask is 0002. If we run mvn test,
> we will hit the following error.
> It would be helpful to beginners if you mention the umask issue in the FAQ.
>
> Tests run: 10, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 3.058 sec
> <<< FAILURE! - in org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils
>
> determineSchemaCanReadSchemaFromHDFS(org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils)
> Time elapsed: 2.58 sec  <<< ERROR!
> java.lang.NullPointerException: null
>     at
>
> org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:426)
>     at
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:284)
>     at
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:124)
>     at
>
> org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils.determineSchemaCanReadSchemaFromHDFS(TestAvroSerdeUtils.java:187)
>
>
>
> Best Regards
> 金杰 (Jie Jin)
>
>
> On Thu, Nov 14, 2013 at 4:24 AM, Brock Noland <[email protected]> wrote:
>
> > Hi,
> >
> > Thanks for the report, I will update that page. We now use Maven, and
> > instructions are in the FAQ here:
> >
> > https://cwiki.apache.org/confluence/display/Hive/HiveDeveloperFAQ
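To make the transition concrete, here is a hedged sketch of the Maven equivalents of the old ant workflow. The exact goals and options below are assumptions based on common Maven usage; the FAQ above is the authoritative reference:

```shell
# Hive now builds with Maven rather than ant, so the old
# "ant clean package eclipse-files" no longer applies.
mvn clean install -DskipTests    # build Hive without running the test suite

# Eclipse project files are typically generated with the
# maven-eclipse-plugin (goal name assumed; check the FAQ for
# any Hive-specific profiles or flags):
mvn eclipse:eclipse
```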
> >
> >
> > On Wed, Nov 13, 2013 at 2:22 PM, Paul Rubio <[email protected]> wrote:
> >
> > > From the document at "How to Contribute to Apache Hive" <http://How to
> > > Contribute to Apache Hive>, I'm having trouble locating any ant
> specific
> > > build.xml files.
> > >
> > > After pulling from Hive trunk via:
> > > svn checkout http://svn.apache.org/repos/asf/hive/trunk hive-trunk
> > >
> > > I then attempted to execute the first line from GettingStarted+EclipseSetup
> > > <https://cwiki.apache.org/confluence/display/Hive/GettingStarted+EclipseSetup>:
> > > -->  ant clean package eclipse-files
> > > -->  Buildfile: build.xml does not exist!
> > > -->  Build failed
> > >
> > > However, there is no build.xml file in the top-level directory. The only
> > > build files I can find are located in test directories:
> > >
> > > ./hcatalog/src/test/e2e/harness/build.xml
> > > ./hcatalog/src/test/e2e/hcatalog/build.xml
> > > ./hcatalog/src/test/e2e/hcatalog/tools/generate/java/build.xml
> > > ./hcatalog/src/test/e2e/hcatalog/udfs/java/build.xml
> > > ./hcatalog/src/test/e2e/templeton/build.xml
> > >
> > > Is ant used to build Hive?  What's the current procedure for building
> > Hive?
> > >
> > > Any help would be appreciated.
> > > Thanks,
> > > Paul
> > >
> >
> >
> >
> > --
> > Apache MRUnit - Unit testing MapReduce - http://mrunit.apache.org
> >
>



-- 
Apache MRUnit - Unit testing MapReduce - http://mrunit.apache.org
