Hi,

btw, right now we have 33 outdated astute.yaml fixtures for the noop rspec tests [0] - and that number is based on checking a single parameter, the network_metadata['vips'] hash, so the actual number of outdated fixtures could be even bigger. So I've registered a new BP [1] to create a script that will generate up-to-date fixtures on demand and/or periodically. A rough idea of the kind of check behind [0] is sketched below.
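Something along these lines, just to show what I mean - a minimal sketch assuming PyYAML and the in-tree fixtures path; the real script from the BP may end up looking completely different:

    import glob
    import sys

    import yaml  # PyYAML, assumed to be available

    # Flag astute.yaml fixtures that have no network_metadata['vips'] section -
    # the same single-parameter check the numbers in [0] are based on.
    # The fixtures path below is an assumption, not a final decision.
    outdated = []
    for path in sorted(glob.glob('tests/noop/astute.yaml/*.yaml')):
        with open(path) as f:
            data = yaml.safe_load(f) or {}
        if 'vips' not in (data.get('network_metadata') or {}):
            outdated.append(path)

    print('%d outdated fixture(s)' % len(outdated))
    for path in outdated:
        print('  %s' % path)
    sys.exit(1 if outdated else 0)

The actual generator would of course have to rebuild the fixtures, not just detect the stale ones.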
Regards,
Alex

[0] http://paste.openstack.org/show/482508/
[1] https://blueprints.launchpad.net/fuel/+spec/deployment-dryrun-fixtures-generator

On Mon, Dec 7, 2015 at 11:51 AM, Bogdan Dobrelya <bdobre...@mirantis.com> wrote:
> On 02.12.2015 17:03, Bogdan Dobrelya wrote:
> > On 01.12.2015 11:28, Aleksandr Didenko wrote:
> >> Hi,
> >>
> >>> pregenerated catalogs for the Noop tests to become the very first
> >>> committed state in the data regression process has to be put in the
> >>> *separate repo*
> >>
> >> +1 to that, we can put this new repo into .fixtures.yml
> >>
> >>> note, we could as well move the tests/noop/astute.yaml/ there
> >>
> >> +1 here too, astute.yaml files are basically configuration fixtures, we
> >> can put them into .fixtures.yml as well
>
> Folks, the patch to create the fuel-noop-fixtures repo [0] is in trouble.
> I'm not sure I've answered Andreas's questions correctly:
>
> - Would it be OK to keep the Noop tests fixtures for fuel-library as a
> separate Fuel-related repo but *not* as a part of the Fuel project?
>
> - Should we require the contributor license agreement for fixtures
> which are only used by tests?
>
> [0] https://review.openstack.org/252992
>
> > I found a better - and easier for patch authors - way to use the data
> > regression checks. The originally suggested workflow was:
> >
> > 1.
> > "The check should be done for every modular component (aka deployment
> > task). Data generated in the noop catalog run for all classes and
> > defines of a given deployment task should be verified against its
> > "acknowledged" (committed) state."
> >
> > This part remains the same, with the only comment that the astute.yaml
> > fixtures of the deployment cases should be fetched from the
> > fuel-noop-fixtures repo. And the committed state of the generated
> > catalogs should be stored there as well.
> >
> > 2.
> > "And fail the test gate, if changes have been found, like a new parameter
> > with a defined value, a removed parameter, a changed parameter's value."
> >
> > This should be changed as follows:
> > - the data checks gate should be just a non-voting helper for reviewers
> > and patch authors. Its only task would be to show the introduced data
> > changes in a pretty and fast view to help accept/update/reject a patch
> > on review.
> > - the data checks gate job should fetch the committed data state from
> > the fuel-noop-fixtures repo and run the regression check with the patch
> > under review checked out in the fuel-library repo.
> > - the Noop tests gate should be changed to fetch the astute.yaml
> > fixtures from the fuel-noop-fixtures repo in order to run the noop tests
> > as usual.
> >
> > 3.
> > "In order to remove a regression, a patch author will have to add (and
> > reviewers should acknowledge) detected changes in the committed state of
> > the deployment data. This may be done manually, with a tool like [3] or
> > by a pre-commit hook, or even at the CI side!"
> >
> > Instead, patch authors should not have to do anything additional. Once
> > accepted with wf+1, the patch on review should be merged with a
> > pre-commit zuul hook (is that possible?). The hook should just
> > regenerate the catalogs with the changes introduced by the patch and
> > update the committed state of the data in the fuel-noop-fixtures repo.
> > After that, the patch may be safely merged into fuel-library and
> > everything will be up to date with the committed data state.
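(A quick illustration from my side: the "pretty and fast view" such a non-voting gate could print might be as simple as a dict diff over the data dumps we already produce in the noop runs. A rough sketch, assuming PyYAML; the file layout and resource names here are only my assumption:)

    import sys

    import yaml  # PyYAML, assumed to be available

    # committed.yaml / proposed.yaml: mappings like
    #   "Nova_config[DEFAULT/debug]": {"value": "true", ...}
    # i.e. resources with their parameters, as collected in the noop catalog runs.
    with open(sys.argv[1]) as f:
        committed = yaml.safe_load(f) or {}
    with open(sys.argv[2]) as f:
        proposed = yaml.safe_load(f) or {}

    for res in sorted(set(committed) | set(proposed)):
        old, new = committed.get(res), proposed.get(res)
        if old == new:
            continue
        if old is None:
            print('+ %s (new resource)' % res)
        elif new is None:
            print('- %s (resource removed)' % res)
        else:
            for par in sorted(set(old) | set(new)):
                if old.get(par) != new.get(par):
                    print('~ %s { %s: %r -> %r }' % (res, par, old.get(par), new.get(par)))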
> > 4.
> > "The regression check should show the diff between the committed state
> > and a new state proposed in a patch. The changed state should be
> > *reviewed* and accepted with a patch, to become a committed one. So the
> > deployment data will evolve with *only* approved changes. And those
> > changes would be very easy to discover for each patch under the review
> > process!"
> >
> > So this part would work even better now, with no additional actions
> > required from either side of the review process.
> >
> >> Regards,
> >> Alex
> >>
> >> On Mon, Nov 30, 2015 at 1:03 PM, Bogdan Dobrelya
> >> <bdobre...@mirantis.com> wrote:
> >>
> >> On 20.11.2015 17:41, Bogdan Dobrelya wrote:
> >> >> Hi,
> >> >>
> >> >> let me try to rephrase this a bit, and Bogdan will correct me if I'm
> >> >> wrong or missing something.
> >> >>
> >> >> We have a set of top-scope manifests (called Fuel puppet tasks) that
> >> >> we use for OpenStack deployment. We execute those tasks with "puppet
> >> >> apply". Each task is supposed to bring the target system into some
> >> >> desired state, so puppet compiles a catalog and applies it. So
> >> >> basically, puppet catalog = desired system state.
> >> >>
> >> >> So we can compile* catalogs for all top-scope manifests in the master
> >> >> branch and store those compiled* catalogs in the fuel-library repo.
> >> >> Then for each proposed patch CI will compare the new catalogs with
> >> >> the stored ones and print out the difference, if any. This will
> >> >> pretty much show what is going to be changed in the system
> >> >> configuration by the proposed patch.
> >> >>
> >> >> We have discussed such checks several times before, iirc, but we did
> >> >> not have the right tools to implement such a thing. Well, now we do
> >> >> :) I think it could be quite useful even in non-voting mode.
> >> >>
> >> >> * By saying compiled catalogs I don't mean actual/real puppet
> >> >> catalogs, I mean sorted lists of all classes/resources with all the
> >> >> parameters that we find during puppet-rspec tests in our noop test
> >> >> framework, something like standard puppet-rspec coverage. See
> >> >> example [0] for the networks.pp task [1].
> >> >>
> >> >> Regards,
> >> >> Alex
> >> >>
> >> >> [0] http://paste.openstack.org/show/477839/
> >> >> [1] https://github.com/openstack/fuel-library/blob/master/deployment/puppet/osnailyfacter/modular/openstack-network/networks.pp
> >> >
> >> > Thank you, Alex.
> >> > Yes, the composition layer is the top-scope manifests, known as the
> >> > Fuel library modular tasks [0].
> >> >
> >> > The "deployment data checks" are nothing more than comparing the
> >> > committed vs. changed states of the fixtures [1] of puppet catalogs
> >> > for the known deployment paths under test, with rspecs written for
> >> > each modular task [2].
> >> >
> >> > And the *current status* is:
> >> > - the script for the data layer checks is now implemented [3]
> >> > - the how-to is being documented here [4]
> >> > - a fix to make catalogs compilation idempotent has been submitted [5]
> >>
> >> The status update:
> >> - the issue [0] is the blocker for the data regression checks and is
> >> specific to the Noop tests only. It has been reworked to not use custom
> >> facts [1]. A new uuid will still be generated in the catalog each time,
> >> but the augeas ensures it will be processed in an idempotent way. Let's
> >> make this change [2] to the upstream puppet-nova as well, please.
> >> [0] https://bugs.launchpad.net/fuel/+bug/1517915
> >> [1] https://review.openstack.org/251314
> >> [2] https://review.openstack.org/131710
> >>
> >> - the pregenerated catalogs for the Noop tests, which are to become the
> >> very first committed state in the data regression process, have to be
> >> put in a *separate repo*. Otherwise, stackalytics would go mad, as that
> >> would be a 600k-line patch to an OpenStack project, which fuel-library
> >> now is :)
> >>
> >> So, I'm planning to use the separate repo for the templates. Note that
> >> we could as well move the tests/noop/astute.yaml/ there. Thoughts?
> >>
> >> > - and there is my WIP branch [6] with the initial committed state of
> >> > the deploy data pre-generated. So, you can check it out, make any test
> >> > changes to the manifests and run the data check (see the README [4]).
> >> > It works for me: there are no issues with idempotent re-checks of a
> >> > clean committed state or with tests failing unexpectedly.
> >> >
> >> > So the plan is to implement this noop tests extension as a non-voting
> >> > CI gate after I add an example workflow update for developers to the
> >> > Fuel wiki. Thoughts?
> >> >
> >> > [0] https://github.com/openstack/fuel-library/blob/master/deployment/puppet/osnailyfacter/modular
> >> > [1] https://github.com/openstack/fuel-library/tree/master/tests/noop/astute.yaml
> >> > [2] https://github.com/openstack/fuel-library/tree/master/tests/noop/spec
> >> > [3] https://review.openstack.org/240015
> >> > [4] https://github.com/openstack/fuel-library/blob/master/tests/noop/README.rst
> >> > [5] https://review.openstack.org/247989
> >> > [6] https://github.com/bogdando/fuel-library-1/commits/data_checks
> >>
> >> --
> >> Best regards,
> >> Bogdan Dobrelya,
> >> Irc #bogdando
>
> --
> Best regards,
> Bogdan Dobrelya,
> Irc #bogdando
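P.S. And for completeness, producing the committed data dumps in the first place could be as trivial as flattening a compiled catalog into a stable resource/parameter mapping that pairs with the diff sketch earlier in this mail. Again, just an illustration of the idea, assuming PyYAML and a catalog that has been serialized to JSON - not the actual tooling:

    import json
    import sys

    import yaml  # PyYAML, assumed to be available

    # Flatten a puppet catalog (serialized to JSON) into a
    # resource -> parameters mapping that is easy to commit and diff.
    with open(sys.argv[1]) as f:
        catalog = json.load(f)

    # Depending on the puppet version, the resources list may live at the
    # top level or under 'data'.
    resources = catalog.get('resources') or catalog.get('data', {}).get('resources', [])

    dump = {}
    for res in resources:
        title = '%s[%s]' % (res['type'], res['title'])
        dump[title] = dict(sorted((res.get('parameters') or {}).items()))

    yaml.safe_dump(dump, sys.stdout, default_flow_style=False)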