To add to this:

*Went well*
Tyler Hobbs has reduced failing dtests on trunk by ~90%. By next month,
test results should be at a 100% pass rate.

*Went poorly*
We've failed to make progress on running the full test suite across all
contributor branches. By the end of this month, I expect we will at least
have limited functionality in this area.

On Wed, Apr 1, 2015 at 3:57 PM, Ariel Weisberg <ariel.weisb...@datastax.com>
wrote:

> Hi all,
>
> It’s time for the first retrospective. For those not familiar, this is the
> part of the development process where we discuss what is and isn’t working
> when it comes to making reliable releases. We go over the things that
> worked, the things that didn’t work, and what changes we are going to make.
>
> This is not a forum for discussing individual bugs (or bugs fixed before
> release due to successful process), although you can cite one and we can
> discuss what we could have done differently to catch it. Even if a bug
> wasn’t released, if it was caught the wrong way (blind luck) and you think
> our process wouldn’t have caught it, you can bring that up as well.
>
> I don’t expect this retrospective to be the most productive, because we
> already know we are far behind in several areas (passing utests and dtests,
> running utests and dtests on each commit, running a larger black box
> system test) and many issues will circle back around to being addressed by
> one of those three.
>
> If you’re a developer, you can review all the things you have committed (or
> reviewed) in the past month and ask yourself if they met the criteria of done
> that we agreed on, including adding tests for existing untested code
> (usually the thing missed). Better to do it now than after discovering your
> definition of done was flawed because it released a preventable bug.
>
> For this one retrospective you can reach back further, to something already
> released that you feel passionate about, and if you can point to a utest or
> dtest that should have caught it that is still missing, we can add it to
> the list of things to test. That would go under CASSANDRA-9012 (Triage
> missing test coverage) <
> https://issues.apache.org/jira/browse/CASSANDRA-9012>.
>
> There is a root JIRA <https://issues.apache.org/jira/browse/CASSANDRA-9042>
> for making trunk always releasable. A lot falls under CASSANDRA-9007 (Run
> stress nightly against trunk in a way that validates) <
> https://issues.apache.org/jira/browse/CASSANDRA-9007>, which is the root
> for a new kitchen sink style test that validates the entire feature set
> together in a black box fashion. Philip Thompson has a basic job running, so
> we are close to (or at) the tipping point where the doneness criteria for
> every ticket need to include making sure this job covers the thing you
> added/changed. If you aren’t going to add the coverage, you need to justify
> (to yourself and your reviewer) breaking it out into something separate and
> file a JIRA indicating the coverage was missing (if one doesn’t already
> exist). Make sure to link it to 9007 so we can see what has already been
> reported.
>
> The reason I say we might not be at the tipping point is that while we
> have the job, we haven’t ironed out how stress (or something new) will act
> as a container for validating multiple features, especially in an
> environment where things like cluster/node failures and topology changes
> occur.
>
> Retrospectives aren’t supposed to include the preceding paragraphs; we
> should funnel discussion about them into a separate email thread.
>
> On to the retrospective. This is more for me to solicit information
> from you than for me to push information to you.
>
> Went well
> - Positive response to the definition of done
> - Lots of manpower from QA and progress on test infrastructure
>
> Went poorly
> - Some wanting to add validation to a kitchen sink style test, but not
> being able to yet
> - Not having a way to know if we are effectively implementing the
> definition of done without waiting for bugs as feedback
>
> Changes
> - Coordinate with Philip Thompson to see how we can get to having
> developers able to add validation to the kitchen sink style test
>
> Regards,
> Ariel
