100% agree. It's hard to handle these third-party drivers, but I think we
need to find a way to test them that doesn't require having said third-party
gear directly available.

Could it be possible to have CI gating be blocked/tested per individual
subfolder of Cinder?

For example, when the SolidFire driver is modified, this would fire a 'trigger'
to SolidFire (via some API), and SolidFire could respond back saying whether
said commit works.

Not sure if that's feasible, but it does seem to be a similar situation in
Nova, Neutron, and Cinder as more and more third-party 'driver-like' code appears.
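
Purely to illustrate the shape of that contract, here's a minimal sketch,
assuming a made-up vendor endpoint and payload; the driver path, URL, and
response fields below are all hypothetical, not an existing interface:

# Hypothetical sketch only -- the endpoint URL, payload fields, and the
# driver-path mapping are made up for illustration.
import json
import urllib.request

# Map driver paths in the Cinder tree to vendor-run CI endpoints.
VENDOR_ENDPOINTS = {
    "cinder/volume/drivers/solidfire": "https://ci.vendor.example/verify",
}

def trigger_vendor_ci(changed_files, change_ref):
    """Notify each vendor whose driver code was touched by a commit."""
    results = {}
    for path, endpoint in VENDOR_ENDPOINTS.items():
        if not any(f.startswith(path) for f in changed_files):
            continue  # this vendor's driver wasn't modified
        payload = json.dumps({"change_ref": change_ref}).encode("utf-8")
        req = urllib.request.Request(
            endpoint, data=payload,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            # Vendor CI answers with e.g. {"verified": true} after running
            # the change against its own gear -- again, a made-up contract.
            results[path] = json.load(resp).get("verified", False)
    return results

In practice the vendor's answer would almost certainly come back
asynchronously (a vote posted to the review later) rather than as a blocking
HTTP response, but the shape of the contract is the point.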

From: John Griffith <john.griff...@solidfire.com>
Reply-To: OpenStack Development Mailing List <openstack-dev@lists.openstack.org>
Date: Thursday, July 25, 2013 5:44 PM
To: OpenStack Development Mailing List <openstack-dev@lists.openstack.org>
Subject: [openstack-dev] [OpenStack][Cinder] Driver qualification

Hey Everyone,

Something I've been kicking around for quite a while now, but never really been
able to get around to, is the idea of requiring that drivers run a
qualification test and submit the results prior to their introduction into Cinder.

To elaborate a bit, the idea could start as something really simple like the 
following:
1. We'd add a functional_qual option/script to devstack

2. Driver maintainer runs this script on their own system to set up devstack
and configure it to use their backend device.

3. Script does the usual devstack install/configure and runs the volume pieces 
of the Tempest gate tests.

4. Grabs the test output and checksums of the devstack and /opt/stack
directories, and bundles up the results for submission

5. Maintainer submits the results (steps 3-5 are sketched below)
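
To make that concrete, here's a rough sketch of what the functional_qual
wrapper might do once devstack is up. The Tempest invocation, the directory
list, and the bundle layout are all assumptions for illustration, not an
existing devstack or Tempest interface:

# Rough sketch of steps 3-5; paths, the test selector, and the output
# layout are assumptions, not an existing devstack/Tempest interface.
import hashlib
import os
import subprocess
import tarfile

def run_volume_tests(tempest_dir="/opt/stack/tempest"):
    """Run just the volume pieces of the Tempest gate, capturing output."""
    proc = subprocess.run(
        ["python", "-m", "unittest", "discover", "tempest/api/volume"],
        cwd=tempest_dir, capture_output=True, text=True)
    return proc.returncode, proc.stdout + proc.stderr

def checksum_tree(root):
    """Hash every file under root so results can't be quietly edited."""
    digest = hashlib.sha256()
    for dirpath, _, filenames in sorted(os.walk(root)):
        for name in sorted(filenames):
            try:
                with open(os.path.join(dirpath, name), "rb") as f:
                    digest.update(f.read())
            except OSError:
                pass  # sockets, transient files, etc.
    return digest.hexdigest()

def bundle_results(log_text, checksums, out="qual_results.tar.gz"):
    """Write the logs and checksums into one tarball for submission."""
    with open("tempest_volume.log", "w") as f:
        f.write(log_text)
    with open("checksums.txt", "w") as f:
        for root, value in sorted(checksums.items()):
            f.write("%s  %s\n" % (value, root))
    with tarfile.open(out, "w:gz") as tar:
        tar.add("tempest_volume.log")
        tar.add("checksums.txt")
    return out

if __name__ == "__main__":
    rc, log = run_volume_tests()
    sums = {d: checksum_tree(d)
            for d in ("/opt/stack", os.path.expanduser("~/devstack"))}
    print(bundle_results(log, sums), "tests rc=%d" % rc)

The checksums are the interesting part: they give reviewers at least a weak
guarantee that the tree the tests ran against matches the commit being proposed.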

So why would we do this, you ask?  Cinder leans pretty heavily on the third-party
driver plugin model, which is fantastic.  On the other hand, while there are a
lot of folks who do great reviews and catch things like syntax or logic errors
in the code, and unit tests do a reasonable job of exercising the code, it's
difficult for reviewers to truly verify that these devices all work.

I think it would be a very useful tool for the initial introduction of a new
driver, and perhaps even as some sort of check that's run and submitted again
prior to milestone releases.

This would also drive more activity and contribution in Tempest, by motivating
folks like myself to contribute more tests (particularly for new functionality).

I'd be interested to hear whether folks have any interest or strong opinions on
this (positive or otherwise).  I know some vendors like Red Hat have this sort
of thing in place for certifications, and honestly that observation is what got
me thinking about this again.

There are a lot of gaps here regarding how the submission process would look,
but we could start relatively simple and grow from there if it proves valuable,
or abandon the idea if it turns out to be unpopular and a waste of time.

Anyway, I'd love to get feedback from folks and see what they think.

Thanks,
John
