On 27/05/15 19:06, Jude Nelson wrote:
Hi Anto,
On Wed, May 27, 2015 at 6:19 AM, Anto <arya...@chello.at> wrote:
On 25/05/15 18:29, Jude Nelson wrote:
Hey everyone,
I have the latest news for vdev:
<snip>
Thoughts and feedback to the above welcome :)
-Jude
Thanks a lot Jude for all your efforts.
Since you put the big fat warning "**CURRENTLY BROKEN. DO NOT
ATTEMPT. WE ARE MIGRATING TO .DEB PACKAGES.**" on
https://github.com/jcnelson/vdev/blob/master/how-to-test.md#appendix-b-booting-with-vdevd
about 13 days ago, I have been waiting for your update on the .deb
package build script so I can continue testing. How far along are we
on that? Please do not take this as me pushing you.
If that is still some way off due to higher priorities in other parts
of vdev development, how about including test cases for any parts of
vdev that still need testing?
For instance, a simple list like the one below would help novice
testers like me (and hopefully help speed up vdev development, as it
seems to be getting more complex):
1. Test Case A Title
   Test Unit: vdev function X
   Expected Result: <list of expected output or expected logs
   when the test passes>
   Test Result: Pass or Fail
   Test Log: <logs from either a passing or a failing run>
2. Test Case B Title
   Test Unit: vdev function B
   Expected Result: <list of expected output or expected logs
   when the test passes>
   Test Result: Pass or Fail
   Test Log: <logs from either a passing or a failing run>
And so on.
I think that will also help you prioritize your efforts across the
various parts of vdev development.
Cheers,
Anto
What you're asking for is a black-box test suite. In most software
systems this is highly desirable, and it is usually wired into CI so
that all tests are verified to pass before a package is generated.
Unfortunately, systems like udev and vdev are exceptions--their
"expected results" are tightly coupled to the underlying hardware.
There's no way to categorically describe the expected results without
also describing all the relevant aspects of the underlying hardware as
well, and there are as many different hardware configurations as there
are users (so each user has a different expected result). The best I
can do is verify that vdev produces a /dev that has the same files
that udev would have created on the same hardware.
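That "same /dev as udev" check can be sketched as a small shell script. This is only an illustration under my own assumptions, not vdev's actual harness: the helper names and snapshot paths are invented, and it compares only node names and file types, not permissions or major/minor numbers.

```shell
# List every node under a tree with its file type (b/c/l/d/f), sorted
# so that two listings can be diffed line by line.
capture_tree() {
    find "$1" -mindepth 1 -printf '%P %y\n' | sort
}

# Diff the tree vdev built against a reference listing captured under
# udev on the same hardware; print PASS when they match, FAIL (plus
# the diff) when they diverge.
compare_trees() {
    capture_tree "$1" > /tmp/tree_a.list
    capture_tree "$2" > /tmp/tree_b.list
    if diff -u /tmp/tree_a.list /tmp/tree_b.list; then
        echo PASS
    else
        echo FAIL
    fi
}
```

Usage would be something like `compare_trees /mnt/udev-snapshot /dev`, where the first argument is a copy of /dev recorded during a udev boot of the same machine (again, a placeholder path).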
I'm still figuring out the packaging. My goal is to add Makefile
targets to use fpm to generate packages (for testing and cross-distro
compatibility), and then worry about adding the Devuan-specific
control files to get vdev hooked into the CI system (ideally, I'd find
a way to get fpm to generate those for me too).
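For what it's worth, an fpm-based Makefile target typically boils down to staging a normal install into a scratch directory and handing that tree to fpm. A hypothetical sketch follows; the package name, version, and staging path are placeholders, not vdev's actual build settings.

```shell
# 1. Stage the files a normal "make install" would lay down
#    (DESTDIR-style staging; pkgroot/ is an illustrative path).
make install DESTDIR="$PWD/pkgroot"

# 2. Roll the staged tree into a .deb: -s dir reads a directory tree,
#    -t deb emits a Debian package, -C sets the package root.
#    Name and version here are invented for the example.
fpm -s dir -t deb -n vdev -v 0.1.0 -C pkgroot .
```

Swapping `-t deb` for `-t rpm` (or other fpm targets) is what gives the cross-distro coverage mentioned above.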
-Jude
Hello Jude,
Thanks for explaining that the type of tests I mentioned actually has
a name in software engineering. I know quite little about this
field :)
I just thought it would be logical to somehow use a structured
approach, so that it would be easier for you (as the only developer)
to distribute your efforts. And I think at this stage you could pick a
single hardware and test setup to focus on, so that you avoid trying
to predict what users might expect, which could be millions of
combinations.
Well... I am sure you know better about that. I am just looking for
ways to help you more.
Cheers,
Anto
_______________________________________________
Dng mailing list
Dng@lists.dyne.org
https://mailinglists.dyne.org/cgi-bin/mailman/listinfo/dng