I love that idea.  I'll go that way.

Daniel

On 1/13/16 1:55 PM, Steve Loughran wrote:
Dan

- Could you have some tests which write to file:// in hadoop-common? They'd run at
precommit, and any HDFS ones would be for verifying HDFS compatibility (if you
really need that final check). If you do it right, you could put the common test
code in hadoop-common-test, which hadoop-hdfs-test could reuse.
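
[Editor's note: a minimal sketch of the kind of file://-backed test Steve
describes, runnable from hadoop-common with no HDFS dependency. The class,
path, and metric names are hypothetical and it assumes JUnit 4 plus the Hadoop
FileSystem API; a real test would drive the sink itself rather than writing
bytes directly.]

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Assert;
import org.junit.Test;

public class TestFileSinkOnLocalFs {

  @Test
  public void testWriteThroughFileScheme() throws Exception {
    Configuration conf = new Configuration();
    // file:// resolves to the local file system, so no cluster is needed
    // and the test can run in hadoop-common at precommit time.
    FileSystem fs = FileSystem.get(URI.create("file:///"), conf);
    Path out = new Path(System.getProperty("java.io.tmpdir"),
        "metrics-sink-test.log");

    // Stand-in for the sink writing one metrics record.
    try (FSDataOutputStream os = fs.create(out, true)) {
      os.writeBytes("test.metric=1\n");
    }

    // Read it back through the same FileSystem API and verify the contents.
    byte[] buf = new byte[(int) fs.getFileStatus(out).getLen()];
    try (FSDataInputStream is = fs.open(out)) {
      is.readFully(buf);
    }
    Assert.assertEquals("test.metric=1\n", new String(buf, "UTF-8"));

    fs.delete(out, false);
  }
}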

On 7 Jan 2016, at 08:44, Allen Wittenauer <a...@altiscale.com> wrote:


On Jan 6, 2016, at 12:04 PM, Daniel Templeton <dan...@cloudera.com> wrote:

I'm working on an HDFS sink for metrics2 to allow daemons to log metrics into 
HDFS.  In order to test the new sink, I need access to the MiniDFSCluster, 
which I don't have from the common package, where the code for all the other 
metrics2 sinks lives.

Any objection to putting the unit test code in the HDFS project in the 
o.a.h.metrics2.sink package?
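
[Editor's note: for context, a minimal sketch of the kind of test that needs
MiniDFSCluster and therefore would have to live under the HDFS project. The
class and path names are hypothetical, and the direct FileSystem write is just
a stand-in for the sink writing a metrics record; it assumes JUnit 4 and the
standard MiniDFSCluster.Builder API.]

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.junit.Assert;
import org.junit.Test;

public class TestSinkOnMiniDfs {

  @Test
  public void testWriteToHdfs() throws Exception {
    Configuration conf = new Configuration();
    // Spin up a single-datanode, in-process HDFS cluster.
    MiniDFSCluster cluster =
        new MiniDFSCluster.Builder(conf).numDataNodes(1).build();
    try {
      cluster.waitActive();
      FileSystem fs = cluster.getFileSystem();
      Path logPath = new Path("/metrics/sink-test.log");

      // Stand-in for the sink writing a metrics record into HDFS.
      try (FSDataOutputStream os = fs.create(logPath)) {
        os.writeBytes("test.metric=1\n");
      }

      Assert.assertTrue(fs.exists(logPath));
    } finally {
      cluster.shutdown();
    }
  }
}
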
        As long as everyone is OK with precommit effectively being useless for
this metrics sink, since the unit tests are run on a per-modified-module basis…
