On Mon, Mar 27, 2017 at 7:03 PM, Josh McKenzie <jmcken...@apache.org> wrote:
> How do we plan on verifying #4? Also, root-cause to tie back new code
> that introduces flaky tests (i.e. passes on commit, fails 5% of the time
> thereafter) is a non-trivial pursuit (thinking #2 here), and a pretty
> common problem in this environment.
>
> On Mon, Mar 27, 2017 at 6:51 PM, Nate McCall <zznat...@gmail.com> wrote:
>
> > I don't want to lose track of the original idea from François, so
> > let's do this formally in preparation for a vote. Having this all in
> > place will make the transition to new testing infrastructure more
> > goal-oriented and keep us more focused moving forward.
> >
> > Does anybody have specific feedback/discussion points on the following
> > (awesome, IMO) proposal:
> >
> > Principles:
> >
> > 1. Tests always pass. This is the starting point. If we don't care
> > about test failures, then we should stop writing tests. A recurring
> > failing test carries no signal and is better deleted.
> > 2. The code is tested.
> >
> > Assuming we can align on these principles, here is a proposal for
> > their implementation.
> >
> > Rules:
> >
> > 1. Each new release passes all tests (no flakiness).
> > 2. If a patch has a failing test (a test touching the same code path),
> > the code or test should be fixed prior to being accepted.
> > 3. Bug fixes should have one test that fails prior to the fix and
> > passes after the fix.
> > 4. New code should have at least 90% test coverage.
> >

True, #4 is hard to verify in the current state. This was mentioned in a
separate thread: if the code were split into submodules, the coverage
tools would have less work to do, because they typically only count
coverage for a module and the tests inside that module. At that point it
should be easy to write a plugin on top of something like this:
http://alvinalexander.com/blog/post/java/sample-cobertura-ant-build-script

This is also an option:
https://about.sonarqube.com/news/2016/05/02/continuous-analysis-for-oss-projects.html
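
To make the #4 idea concrete, here is a rough, untested sketch of what a
per-module coverage gate could look like with the standard Cobertura Ant
tasks, along the lines of that blog post. The target names, the "compile"
and "test" dependencies, the property names, and the 90% threshold are all
placeholders, not anything that exists in our build today:

  <!-- Sketch only: assumes cobertura.classpath, ${classes.dir},
       ${instrumented.dir}, ${src.dir} and ${coverage.report.dir}
       are defined elsewhere in the module's build file. -->
  <taskdef classpathref="cobertura.classpath" resource="tasks.properties"/>

  <!-- Instrument the module's compiled classes before its tests run. -->
  <target name="coverage-instrument" depends="compile">
    <cobertura-instrument todir="${instrumented.dir}">
      <fileset dir="${classes.dir}" includes="**/*.class"/>
    </cobertura-instrument>
  </target>

  <!-- After the tests run, report and fail the build if the module
       drops below 90% total line coverage. -->
  <target name="coverage-check" depends="test">
    <cobertura-report format="html" destdir="${coverage.report.dir}"
                      srcdir="${src.dir}"/>
    <cobertura-check totallinerate="90" haltonfailure="true"/>
  </target>

The appeal of submodules is that a check like this stays small and only
measures the module's own tests against its own classes; SonarQube would
give us roughly the same gate without us maintaining the report plumbing
ourselves.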