Hi,

I took these directly from the Etherpad we used to capture meeting
minutes of the OPNFV Dovetail team today:

https://etherpad.opnfv.org/p/collabrationofdovetail

# Dovetail meetup in OpenStack Barcelona

TIME: Tuesday, October 25, 1:00 PM - up to 3:00 PM local time
PLACE: lunch area, we will try to find a table and send location
description out via email, irc (opnfv-meeting) and this page. Please
check shortly before 1:00PM.

Place for the meeting: P2(213) - lunch buffet hall in P2 - we will be at
the table at the far left, near the window as you come in. See you at 1 PM.

Attendance:
* Wenjing Chu
* Chris Price
* Hongbo Tian
* Dave Neary
* Tim Irnich
* Leo Wang
* Serena (Xiaowei) Feng
* Leif Madsen
* Luke Hinds
* +1 (did not get name)

Meeting notes:
  * Test case template review
    * Chris's proposal was based on IEEE template
    * Wants a discussion on which pieces of the template are useful, and
which are not
    * Needs longer conversation - what do we expect people to be doing?
    * Need a good example + pointer to requirements to reinforce best
practices
    * What are we trying to accomplish?
  * Documentation of test cases
    * Don't want Dovetail team to take all the load - want testing teams
to propose candidate test cases for inclusion in the Dovetail suite,
which we can then massage into shape for inclusion
    * Creating a framework for distributing the work - the docs team
also needs to improve how the user manual is put together; Dovetail can
follow the docs team's lead in the next iteration
    * Chris will work on language and specification for Dovetail test cases
    * Need an easy way to lint test case proposals so people know where
they stand before submitting. Chris says he wants to be able to read
the test case and do the test at the same time (so test cases can read
like a manual test script, in addition to an automated test)
  * Test coverage
    * Need to work with the testing working group to make it easier for
people to run tests locally, and to better document how to add test
cases and how to run test suites, for community adoption and use
    * As part of that, we need to work with the testing team, who will
propose candidate test cases for Dovetail inclusion
  * Other test case requirements
    * Security review? Require test case submitters to include a
security impact section at time of proposal
    * Also need to ensure that we are not requiring insecure interfaces
or features as part of OPNFV certification
  * Release of dovetail
    * Frank proposed to release the C certification at the same time as
D, Wenjing proposes aligning with Colorado 3.0 (3 months after Colorado)
    * Chris says we need to know what we need to release first, then
once we have that we can release
      * Suggests that there is a lot of work for a testing project
after releasing the Dovetail requirements; 3.0 is a good aspirational
target, but we cannot commit to a first-release date until we know
what we need for the first release
    * Wenjing asks for a commitment to a calendar; Dave reminds that
the C&C Committee expects a test suite for the Plugfest in November;
Chris says the calendar should not matter - we need to understand what
will be in Dovetail first.
    * Chris wants to have a loose coupling of the Dovetail certification
process from the "plugfest test suite" tool and test cases - test cases
can be added to the test tool without being part of the certification spec
  * Terminology guide for Dovetail
    * We are using terms like "scenario" inconsistently with the OPNFV
usage - we should create a Dovetail lexicon with Dovetail terms,
concepts and definitions - need to put together a wiki page, and ensure
all of the test tools and wiki pages are consistent in usage
  * Test tool
    * Leif Madsen is working on getting the test tool working
stand-alone, and will work with Leo to document how to use the tool, and
help fill gaps for a first time user.
  * Patch review
    * https://gerrit.opnfv.org/gerrit/#/q/project:dovetail
    * We have a bandwidth issue for patch review right now - also, need
to be wary of the perception issue of patches and +2 reviews - in the
VIM operations test case, for example,
https://gerrit.opnfv.org/gerrit/#/c/22855/ there was a -1 from a
committer, indicating the need for changes and discussion
  * Upstream and requirements
  * Multiple SDN controllers
    * Taking an example: SFC - what would need to happen before adding
SFC to Dovetail?
      * Need adoption criteria - we don't want to adopt a new feature
the day it gets committed, target audience is product vendors.
      * Want to include features because the industry has a need that we
can fulfill
      * OPNFV is in some sense between upstreams and vendors at one
side, and customer demands on the other side
      * Multiple Northbound interfaces (networking-sfc, Tacker + ODL
SFC), multiple controller implementations (ODL, ONOS, OpenContrail), and
multiple Southbound options (OVS + NSH, VPP)
      * Wenjing: Not our role to push a single implementation or
interface. If there are competing implementations, the community needs
to work towards alignment, or if Dovetail wants to add the feature, then
have tests which can test either interface. Test tool implementation for
the test should be able to switch for different implementations
      * Dave: Do we end up with 6 tests for SFC for networking-sfc,
Tacker + ODL/ONOS/OpenContrail + OVS/VPP? Earlier discussion suggested we
could use Tacker as a single abstraction/interface point for multiple
vSwitch, SDN controller options (Wenjing: does everyone agree on Tacker?
;-) )
      * Dave: If NSH capable VPP is released upstream, but the NSH patch
does not make it into OVS, is it appropriate to add a test that requires
NSH to Dovetail?
      * Wenjing: Probably not, because OVS is more widely adopted at
this point (see adoption criteria earlier)
      * Are there any cases of out-of-tree patches that have been widely
adopted but never made it upstream? Maybe RT_PREEMPT?
      * Upstream requirement had broad support in C&C, Dovetail, TSC -
so we should be comfortable with this as a requirement for compliance,
esp. since compliance is usually not cutting edge.
  * Dovetail test tool
    * Aiming for 2 config files, one for "these are the total tests
run", and one for "Dovetail certification tests" - we need to lower the
acceptance bar for tests added to the former
    * Need timely review of patches for tests in the "full test list"
config file
    * Request that the test tool output be human readable and provide
more complete output from the test runs than just "Pass/Fail"
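
As a rough illustration of the two-config-file split and the loose
coupling Chris described earlier, here is a minimal Python sketch. The
test names, file contents and validate() helper are all hypothetical
illustrations, not the actual Dovetail config format:

```python
# Hypothetical sketch of the proposed two-config-file split.
# Test names and structure are illustrative, not the real Dovetail
# config format.

# Full test list: every test the tool can run (low acceptance bar).
full_test_list = [
    "vim.operations.basic",
    "sdnvpn.subnet.connectivity",
    "sfc.chain.two_vnfs",
]

# Certification tests: the subset that forms the certification spec.
certification_tests = [
    "vim.operations.basic",
    "sdnvpn.subnet.connectivity",
]

def validate(full, cert):
    """Certification tests must be a subset of the full test list, so
    test cases can be added to the tool without automatically becoming
    part of the certification spec."""
    missing = set(cert) - set(full)
    if missing:
        raise ValueError("certification tests not in full list: %s"
                         % sorted(missing))
    return True

print(validate(full_test_list, certification_tests))  # True
```

With this split, a test like the hypothetical "sfc.chain.two_vnfs" can
live in the full test list while the community discusses whether it
belongs in the certification spec.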
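
The test-case linting idea discussed above could be as simple as
checking a proposal for required template sections. The section names
in this sketch are assumptions for illustration; the actual template
(Chris's IEEE-based proposal) may differ:

```python
# Sketch of a test-case proposal linter: reports required template
# sections missing from a proposal, so submitters know where they
# stand before submitting. Section names are illustrative assumptions.

REQUIRED_SECTIONS = [
    "Test Case Name",
    "Objective",
    "Preconditions",
    "Test Steps",
    "Expected Results",
    "Security Impact",
]

def lint_proposal(text):
    """Return the list of required sections missing from a proposal."""
    return [s for s in REQUIRED_SECTIONS if s not in text]

proposal = """Test Case Name: VIM operations
Objective: verify basic VIM operations
Test Steps: ...
Expected Results: ...
"""
print(lint_proposal(proposal))  # ['Preconditions', 'Security Impact']
```

A check like this also makes the proposed security impact section a
hard requirement at proposal time rather than a review afterthought.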


/* Following topics not discussed */


Dovetail output: API, test cases, components, features and tools
Abstract: give an abstract introduction of the project output for
Dovetail certification
Details: link to the project output for Dovetail certification
(recommend the new etherpad for the link)

link to Dovetail: https://wiki.opnfv.org/projects/dovetail


-- 
Dave Neary - NFV/SDN Community Strategy
Open Source and Standards, Red Hat - http://community.redhat.com
Ph: +1-978-399-2182 / Cell: +1-978-799-3338
_______________________________________________
opnfv-tech-discuss mailing list
opnfv-tech-discuss@lists.opnfv.org
https://lists.opnfv.org/mailman/listinfo/opnfv-tech-discuss
