[jira] [Created] (HADOOP-12840) UGI to log stack traces at debug when failing to find groups for a user

2016-02-25 Thread Steve Loughran (JIRA)
Steve Loughran created HADOOP-12840:
---

 Summary: UGI to log stack traces at debug when failing to find 
groups for a user
 Key: HADOOP-12840
 URL: https://issues.apache.org/jira/browse/HADOOP-12840
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: security
Affects Versions: 2.8.0
Reporter: Steve Loughran
Priority: Minor


If {{UGI.getGroupNames()}} catches an IOE raised by 
{{groups.getGroups(getShortUserName())}}, it simply logs at debug "No groups 
available for user". The text of the caught exception and its stack trace are 
not printed.

One IOException raised is the explicit "user not in groups" exception, but 
there could be other causes too; if one of those occurs, the real problem 
will be missed entirely.
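A minimal sketch of the requested change (this is not the actual UGI code; the class, the simulated lookup, and the logger wiring are all hypothetical stand-ins): pass the caught exception to the logger so its message and stack trace reach the debug log instead of being dropped.

```java
import java.io.IOException;
import java.util.Collections;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical stand-in for UGI.getGroupNames(); illustrates the fix only.
public class GroupLookupSketch {
    private static final Logger LOG = Logger.getLogger("GroupLookupSketch");

    static List<String> getGroupNames(String user) {
        try {
            return lookupGroups(user);
        } catch (IOException e) {
            // Passing the exception to the logger makes its text and stack
            // trace visible at debug level (with slf4j this would be
            // LOG.debug("No groups available for user " + user, e)).
            LOG.log(Level.FINE, "No groups available for user " + user, e);
            return Collections.emptyList();
        }
    }

    // Simulated lookup that always fails, standing in for
    // groups.getGroups(getShortUserName()).
    static List<String> lookupGroups(String user) throws IOException {
        throw new IOException("simulated: user " + user + " not in groups");
    }

    public static void main(String[] args) {
        System.out.println(getGroupNames("alice")); // prints []
    }
}
```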



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


node.js and more as dependencies

2016-02-25 Thread Allen Wittenauer

Hey folks.

Have any of you looked at YARN-3368?  Is adding node.js+a bunch of 
other stuff as dependencies just for the UI a good idea?  Doesn’t that seem 
significantly heavyweight?  How hard is this going to be operationally to 
manage? 

Build failed in Jenkins: Hadoop-Common-trunk #2429

2016-02-25 Thread Apache Jenkins Server
See 

Changes:

[zhz] HDFS-9804. Allow long-running Balancer to login with keytab. Contributed

[zhz] HDFS-9734. Refactoring of checksum failure report related codes.

--
[...truncated 3819 lines...]
Generating 

Building index for all the packages and classes...
Generating 

Generating 

Generating 

Building index for all classes...
Generating 

Generating 

Generating 

Generating 

Generating 

[INFO] Building jar: 

[INFO] 
[INFO] 
[INFO] Building Apache Hadoop MiniKDC 3.0.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-minikdc ---
[INFO] Deleting 

[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-minikdc 
---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ 
hadoop-minikdc ---
[INFO] There are 9 errors reported by Checkstyle 6.6 with 
checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-minikdc ---
[INFO] Executing tasks

main:
[mkdir] Created dir: 

[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-minikdc 
---
[INFO] Compiling 2 source files to 

[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ 
hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
hadoop-minikdc ---
[INFO] Compiling 2 source files to 

[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-minikdc ---
[INFO] Surefire report directory: 


---
 T E S T S
---

---
 T E S T S
---
Running org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.943 sec - in 
org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Running org.apache.hadoop.minikdc.TestMiniKdc
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.074 sec - in 
org.apache.hadoop.minikdc.TestMiniKdc

Results :

Tests run: 6, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-minikdc ---
[INFO] Building jar: 


Re: node.js and more as dependencies

2016-02-25 Thread Wangda Tan
Hi Allen,

YARN-3368 is using Ember.JS and Ember.JS depends on npm (Node.JS Package
Manager) to manage packages.

One thing to clarify: the npm dependency is only required at build time (the
JS build stitches source files together and renames variables). After the JS
build completes, there is no dependency on Node.js any more. A server such as
the RM only needs to run an HTTP server to host the JS files, and the browser
takes care of page rendering, just like the HDFS/Spark/Mesos UIs.

A couple of other Apache projects, such as Tez and Ambari, are already using
Ember.JS. Ember.JS helps front-end developers more easily manage models,
pages, events and packages.

Thanks,
Wangda

On Thu, Feb 25, 2016 at 9:16 AM, Allen Wittenauer  wrote:

>
> Hey folks.
>
> Have any of you looked at YARN-3368?  Is adding node.js+a bunch of
> other stuff as dependencies just for the UI a good idea?  Doesn’t that seem
> significantly heavyweight?  How hard is this going to be operationally to
> manage?


[jira] [Created] (HADOOP-12841) Update s3-related properties in core-default.xml

2016-02-25 Thread Wei-Chiu Chuang (JIRA)
Wei-Chiu Chuang created HADOOP-12841:


 Summary: Update s3-related properties in core-default.xml
 Key: HADOOP-12841
 URL: https://issues.apache.org/jira/browse/HADOOP-12841
 Project: Hadoop Common
  Issue Type: Improvement
  Components: fs/s3
Affects Versions: 2.7.0
Reporter: Wei-Chiu Chuang
Assignee: Wei-Chiu Chuang
Priority: Minor


HADOOP-11670 deprecated {{fs.s3a.awsAccessKeyId}}/{{fs.s3a.awsSecretAccessKey}} 
in favor of {{fs.s3a.access.key}}/{{fs.s3a.secret.key}} in the code, but did 
not update core-default.xml. Also, a few other S3-related properties are 
missing.
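The update described above would presumably add entries along these lines to core-default.xml; only the two property names come from the report, and the description text here is an illustrative placeholder, not the wording that was committed:

```xml
<!-- Illustrative sketch of missing core-default.xml entries; the
     description text below is a placeholder, not the actual docs. -->
<property>
  <name>fs.s3a.access.key</name>
  <description>AWS access key ID used by the s3a connector.</description>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <description>AWS secret key used by the s3a connector.</description>
</property>
```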







[jira] [Created] (HADOOP-12842) LocalFileSystem checksum file creation fails when source filename contains a colon

2016-02-25 Thread Plamen Jeliazkov (JIRA)
Plamen Jeliazkov created HADOOP-12842:
-

 Summary: LocalFileSystem checksum file creation fails when source 
filename contains a colon
 Key: HADOOP-12842
 URL: https://issues.apache.org/jira/browse/HADOOP-12842
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.4
Reporter: Plamen Jeliazkov
Assignee: Plamen Jeliazkov
Priority: Minor


In most FileSystems, including HDFS, you can create a file whose name contains 
a colon character. If you try to use the LocalFileSystem implementation (which 
extends ChecksumFileSystem) to create a file with a colon in its name, you get 
a URISyntaxException during creation of the checksum file. The cause is the 
use of {code}new Path(path, checksumFile){code}, where checksumFile is treated 
as a relative path during URI parsing because it starts with a "." and 
contains a ":" in the path.

Running the following test inside TestLocalFileSystem causes the failure:
{code}
@Test
public void testColonFilePath() throws Exception {
  FileSystem fs = fileSys;
  Path file = new Path(TEST_ROOT_DIR + Path.SEPARATOR + "fileWith:InIt");
  fs.delete(file, true);
  FSDataOutputStream out = fs.create(file);
  try {
    out.write("text1".getBytes());
  } finally {
    out.close();
  }
}
{code}
With the following stack trace:
{code}
java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path 
in absolute URI: .fileWith:InIt.crc
    at java.net.URI.checkPath(URI.java:1804)
    at java.net.URI.<init>(URI.java:752)
    at org.apache.hadoop.fs.Path.initialize(Path.java:201)
    at org.apache.hadoop.fs.Path.<init>(Path.java:170)
    at org.apache.hadoop.fs.Path.<init>(Path.java:92)
    at org.apache.hadoop.fs.ChecksumFileSystem.getChecksumFile(ChecksumFileSystem.java:88)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:397)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:435)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
    at org.apache.hadoop.fs.TestLocalFileSystem.testColonFilePath(TestLocalFileSystem.java:625)
{code}
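The underlying JDK behavior can be reproduced without Hadoop at all. The sketch below (hypothetical scheme and path strings, chosen only for illustration) shows the rule the checksum path trips over: a java.net.URI constructed with a scheme may not carry a relative path.

```java
import java.net.URI;
import java.net.URISyntaxException;

// Standalone illustration of the JDK rule behind the reported failure:
// once a URI has a scheme, its path must be absolute (start with '/').
public class RelativePathDemo {
    public static void main(String[] args) {
        try {
            // Mirrors what Path.initialize() ends up doing when the text
            // before the first colon is taken as a scheme and the rest as
            // a relative path.
            new URI("scheme", null, "InIt.crc", null, null);
            System.out.println("accepted");
        } catch (URISyntaxException e) {
            System.out.println(e.getReason());
        }
    }
}
```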





Jenkins build is back to normal : Hadoop-Common-trunk #2430

2016-02-25 Thread Apache Jenkins Server
See 



Build failed in Jenkins: Hadoop-Common-trunk #2431

2016-02-25 Thread Apache Jenkins Server
See 

Changes:

[xyao] HADOOP-12824. Collect network and disk usage on the node running

--
[...truncated 3821 lines...]
Generating 

Generating 

Generating 

Building index for all classes...
Generating 

Generating 

Generating 

Generating 

Generating 

[INFO] Building jar: 

[INFO] 
[INFO] 
[INFO] Building Apache Hadoop MiniKDC 3.0.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-minikdc ---
[INFO] Deleting 

[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-minikdc 
---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ 
hadoop-minikdc ---
[INFO] There are 9 errors reported by Checkstyle 6.6 with 
checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-minikdc ---
[INFO] Executing tasks

main:
[mkdir] Created dir: 

[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-minikdc 
---
[INFO] Compiling 2 source files to 

[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ 
hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
hadoop-minikdc ---
[INFO] Compiling 2 source files to 

[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-minikdc ---
[INFO] Surefire report directory: 


---
 T E S T S
---

---
 T E S T S
---
Running org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.14 sec - in 
org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Running org.apache.hadoop.minikdc.TestMiniKdc
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.371 sec - in 
org.apache.hadoop.minikdc.TestMiniKdc

Results :

Tests run: 6, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-minikdc ---
[INFO] Building jar: 

[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ 
hadoop-minikdc ---
[INFO] Building jar: 


Build failed in Jenkins: Hadoop-common-trunk-Java8 #1136

2016-02-25 Thread Apache Jenkins Server
See 

Changes:

[xyao] HADOOP-12824. Collect network and disk usage on the node running

--
[...truncated 5505 lines...]
Running org.apache.hadoop.util.TestVersionUtil
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.252 sec - in 
org.apache.hadoop.util.TestVersionUtil
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestProtoUtil
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.324 sec - in 
org.apache.hadoop.util.TestProtoUtil
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestLightWeightGSet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.231 sec - in 
org.apache.hadoop.util.TestLightWeightGSet
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestGSet
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.612 sec - in 
org.apache.hadoop.util.TestGSet
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestStringInterner
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.145 sec - in 
org.apache.hadoop.util.TestStringInterner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestZKUtil
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.152 sec - in 
org.apache.hadoop.util.TestZKUtil
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestStringUtils
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.334 sec - in 
org.apache.hadoop.util.TestStringUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestFindClass
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.847 sec - in 
org.apache.hadoop.util.TestFindClass
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestGenericOptionsParser
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.032 sec - in 
org.apache.hadoop.util.TestGenericOptionsParser
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestRunJar
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.623 sec - in 
org.apache.hadoop.util.TestRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestSysInfoLinux
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.318 sec - in 
org.apache.hadoop.util.TestSysInfoLinux
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.209 sec - in 
org.apache.hadoop.util.TestDirectBufferPool
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestFileBasedIPList
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.222 sec - in 
org.apache.hadoop.util.TestFileBasedIPList
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestIndexedSort
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.823 sec - in 
org.apache.hadoop.util.TestIndexedSort
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestIdentityHashStore
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.22 sec - in 
org.apache.hadoop.util.TestIdentityHashStore
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestMachineList
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.648 sec - in 
org.apache.hadoop.util.TestMachineList
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.TestWinUtils
Tests run: 11, Failures: 0, Errors: 0, Skipped: 11, Time elapsed: 0.266 sec - 
in org.apache.hadoop.util.TestWinUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.util.hash.TestHash
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.45 sec - in 
org.apache.hadoop

Jenkins build is back to normal : Hadoop-common-trunk-Java8 #1137

2016-02-25 Thread Apache Jenkins Server
See 



RE: [crypto][chimera] Next steps

2016-02-25 Thread Chen, Haifeng
Coming back to clear out the codebase and IP concerns.

[Benedikt] I still have concerns about the IP, since this seems to be an Intel 
codebase.
I have checked this. Chimera was developed under the Apache 2 License from the 
ground up. I agree with Jochen that the license matters.
Internally, this was approved as part of larger open source efforts on Apache 
Hadoop and related projects at Intel. We had an IP plan considered as part of 
the open source process.

As to the codebase: the com.intel-prefixed package name was a technical 
decision when com.intel.chimera was chosen as the package prefix. We 
originally planned to use the org.apache.chimera prefix, but we found that we 
could not publish org.apache-grouped artifacts to the Maven central 
repository, which requires some form of ownership of the org.apache domain.

To resolve the codebase problem, once all things are ready on the Commons 
side, we will rename in a branch, and the branched code can be copied into 
the Commons GitHub as final.

Thanks,
Haifeng


-Original Message-
From: Chen, Haifeng [mailto:haifeng.c...@intel.com] 
Sent: Wednesday, February 24, 2016 12:40 PM
To: Commons Developers List 
Cc: common-dev@hadoop.apache.org
Subject: RE: [crypto][chimera] Next steps

>> The same should be there with Chimera/Apache Crypto.
Yes, the current implementation falls back to the JCE Cipher if native 
support is not available.

[Uma] We will fix up any IP issues soon. All the code files carry the Apache 
License header.
The current repo and package structure carry the Intel name. I will check with 
Haifeng on resolution of this.
I will work on this ASAP to clear this out.

Thanks,
Haifeng

-Original Message-
From: Gangumalla, Uma [mailto:uma.ganguma...@intel.com]
Sent: Wednesday, February 24, 2016 5:13 AM
To: Commons Developers List 
Cc: common-dev@hadoop.apache.org
Subject: Re: [crypto][chimera] Next steps

Thanks, all, for the valuable feedback and discussion.
Here are my replies to some of the questions.
[Mark wrote]
It depends. I care less about the quality of the code than I do about the 
community that comes with it / forms around it. A strong community can fix code 
issues. Great code can't save a weak community.
[Uma] Nice point. Fully agreed.


[Jochen wrote]
Therefore, I suggest that you provide at least fallback implementations in pure 
Java, which are being used, if the JNI based stuff is not available (for 
whatever reason).
[Uma] Thank you for the suggestion, Jochen. If I understand your point right, 
yes, it is already there in Hadoop.
Here is the JIRA, HADOOP-10735: Fall back AesCtrCryptoCodec implementation from 
OpenSSL to JCE if non native support.

The same should be there with Chimera/Apache Crypto.
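The fallback behavior discussed here (prefer the native OpenSSL path, fall back to the pure-Java JCE provider when it is unavailable) can be sketched roughly as follows. This is not Chimera's or Hadoop's actual API; the class, the method names, and the library probe are all hypothetical.

```java
import javax.crypto.Cipher;

// Hedged sketch of the HADOOP-10735-style fallback pattern: probe for a
// native (OpenSSL-backed) implementation, and fall back to the pure-Java
// JCE provider so the JNI path is never a hard requirement.
public class CipherFallbackDemo {
    static String chooseProvider() {
        if (nativeOpensslAvailable()) {
            return "native-openssl"; // hypothetical native-backed path
        }
        return "jce"; // always-available pure-Java fallback
    }

    static boolean nativeOpensslAvailable() {
        try {
            // Stand-in for a real JNI availability probe; this library
            // name is deliberately bogus, so the probe fails here.
            System.loadLibrary("crypto-native-probe-does-not-exist");
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("using " + chooseProvider());
        // The JCE fallback itself: AES/CTR ships with every standard JRE.
        Cipher c = Cipher.getInstance("AES/CTR/NoPadding");
        System.out.println("JCE cipher: " + c.getAlgorithm());
    }
}
```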


[Benedikt]
I still have concerns about the IP, since this seems to be an Intel codebase. I 
do not have the necessary experience to say what would be the right way here. 
My gut feeling tells me, that we should go through the incubator. WDYT?
And [Jochen wrote]
"An Intel codebase" is not a problem as such. Question is: "Available under 
what license?"

[Uma] We will fix up any IP issues soon. All the code files carry the Apache 
License header.
The current repo and package structure carry the Intel name. I will check with 
Haifeng on resolution of this.


[Jochen wrote]
So, have the Chimera project attempt to resolve them quickly. If
possible: Fine. If not: We still have the Incubator as a possibility.
[Uma] Agreed. We will resolve these points soon.


Regards,
Uma

 
On 2/23/16, 1:18 AM, "Mark Thomas"  wrote:

>On 23/02/2016 09:12, sebb wrote:
>> On 23 February 2016 at 07:34, Benedikt Ritter 
>>wrote:
>>> I'm confused. None of the other PMC members has expressed whether he 
>>>or she wants to see Chimera/crypto joining Apache Commons, yet 
>>>we're already discussing how JNI bindings should be handled.
>>>
>>> I'd like to see:
>>> 1) a clear statement whether Chimera/crypto should become part of 
>>>Apache  Commons. Do we need a vote for that?
>> 
>> Yes, of course.
>> 
>> However that decision clearly depends on at least some of the design 
>> aspects of the code.
>> If it were written entirely in C or Fortran, it would not be a 
>> suitable candidate.
>> 
>>> 2) Discuss a plan on how to do that (I've described a possible plan
>>>[1])
>>> 3) After that is clear: discuss design details regarding the component.
>> 
>> Some design details impact on the decision.
>> 
>> Indeed even for pure Java code the code quality has a bearing on 
>> whether Commons would/could want to take it.
>> Would we want a large code base with no unit-tests, no build 
>> mechanism, and no comments?
>
>It depends. I care less about the quality of the code than I do about 
>the community that comes with it / forms around it. A strong community 
>can fix code issues. Great code can't save a weak community.
>
>How about creating a new sandbox component, let folks start work and 
>see how the community develops?
>
>

Build failed in Jenkins: Hadoop-Common-trunk #2432

2016-02-25 Thread Apache Jenkins Server
See 

Changes:

[kasha] HDFS-9858. RollingFileSystemSink can throw an NPE on non-secure

[rkanter] YARN-4579. Allow DefaultContainerExecutor container log directory

--
[...truncated 3755 lines...]
Generating 

Building index for all the packages and classes...
Generating 

Generating 

Generating 

Building index for all classes...
Generating 

Generating 

Generating 

Generating 

Generating 

[INFO] Building jar: 

[INFO] 
[INFO] 
[INFO] Building Apache Hadoop MiniKDC 3.0.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-minikdc ---
[INFO] Deleting 

[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-minikdc 
---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ 
hadoop-minikdc ---
[INFO] There are 9 errors reported by Checkstyle 6.6 with 
checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-minikdc ---
[INFO] Executing tasks

main:
[mkdir] Created dir: 

[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-minikdc 
---
[INFO] Compiling 2 source files to 

[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ 
hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
hadoop-minikdc ---
[INFO] Compiling 2 source files to 

[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-minikdc ---
[INFO] Surefire report directory: 


---
 T E S T S
---

---
 T E S T S
---
Running org.apache.hadoop.minikdc.TestMiniKdc
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.556 sec - in 
org.apache.hadoop.minikdc.TestMiniKdc
Running org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.104 sec - in 
org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain

Results :

Tests run: 6, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-minikdc ---
[INFO] Building jar: 


[jira] [Created] (HADOOP-12843) Fix findbugs warnings in hadoop-common (branch-2)

2016-02-25 Thread Akira AJISAKA (JIRA)
Akira AJISAKA created HADOOP-12843:
--

 Summary: Fix findbugs warnings in hadoop-common (branch-2)
 Key: HADOOP-12843
 URL: https://issues.apache.org/jira/browse/HADOOP-12843
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Akira AJISAKA


There are 5 findbugs warnings in branch-2.


