On Thu, Apr 03, 2014 at 04:39:24PM +0100, Martin Pitt wrote:
> Hey Stéphane,
>
> Stéphane Graber [2014-04-02 16:19 -0400]:
> > LXC upstream does full automated test runs on amd64, i386 and armhf for
> > every commit to both the master and stable-1.0 branch. We also do test
> > builds and test runs using clang, push daily builds to coverity for
> > static code analysis and do automated cross-builds for Android.
> >
> > The upstream Jenkins server is at: https://jenkins.linuxcontainers.org
>
> I poked around and I can only find source builds for LXC itself, and
> the daily download builds from lxc-templates. Where are the actual
> test suite runs for CI?
The test runs are done by lxc-test-clang and lxc-test-gcc, which first do a
simple test build and then run the testsuite. Any test failure on any
compiler/architecture combination will cause that test job to fail, which in
turn fails the test dispatch job, which in turn fails the main trigger job.

I'm working on a Python script that parses the Jenkins JSON API output and
prints something slightly more readable, like:

stgraber@castiana:~$ python3 current-state
Builds:
 - lxc-trigger-manual #44: Using git_branch=master, git_commit=HEAD, git_repository=stgraber/lxc on 2014-04-03_04-11-14 taking 698 minutes => ABORTED
 - lxc-dispatch-builds #50: On 2014-04-03_15-50-36 taking 0 minutes => FAILURE
 - lxc-dispatch-tests #106: On 2014-04-03_04-13-01 taking 108 minutes => SUCCESS
 - lxc-test-clang #89: On 2014-04-03_04-13-08 taking 108 minutes => SUCCESS
 - lxc-test-clang » amd64 #89: On 2014-04-03_04-13-08 taking 11 minutes => SUCCESS
 - lxc-test-clang » armhf #89: On 2014-04-03_04-13-08 taking 20 minutes => SUCCESS
 - lxc-test-clang » i386 #89: On 2014-04-03_04-13-08 taking 11 minutes => SUCCESS
 - lxc-test-gcc #101: On 2014-04-03_04-13-08 taking 106 minutes => SUCCESS
 - lxc-test-gcc » amd64 #101: On 2014-04-03_04-13-08 taking 9 minutes => SUCCESS
 - lxc-test-gcc » armhf #101: On 2014-04-03_04-13-08 taking 21 minutes => SUCCESS
 - lxc-test-gcc » i386 #101: On 2014-04-03_04-13-08 taking 9 minutes => SUCCESS
 - lxc-trigger-manual #43: Using git_branch=master, git_commit=HEAD, git_repository=stgraber/lxc on 2014-04-01_21-13-14 taking 35 minutes => SUCCESS
 - lxc-dispatch-builds #43: On 2014-04-01_21-38-47 taking 9 minutes => SUCCESS
 - lxc-build-android #33: On 2014-04-01_21-38-58 taking 7 minutes => SUCCESS
 - lxc-build-coverity #33: On 2014-04-01_21-38-57 taking 9 minutes => SUCCESS
 - lxc-build-debian-source #16: On 2014-04-01_21-38-58 taking 5 minutes => SUCCESS
 - lxc-build-debian-source » unstable,amd64 #16: On 2014-04-01_21-38-58 taking 2 minutes => SUCCESS
 - lxc-build-ubuntu-source #16: On 2014-04-01_21-38-58 taking 7 minutes => SUCCESS
 - lxc-build-ubuntu-source » precise #7: On 2014-03-31_22-09-53 taking 11 minutes => SUCCESS
 - lxc-build-ubuntu-source » quantal #7: On 2014-03-31_22-09-53 taking 4 minutes => SUCCESS
 - lxc-build-ubuntu-source » saucy #7: On 2014-03-31_22-09-53 taking 4 minutes => SUCCESS
 - lxc-build-ubuntu-source » trusty #7: On 2014-03-31_22-09-53 taking 11 minutes => SUCCESS
 - lxc-build-ubuntu-source » precise,amd64 #16: On 2014-04-01_21-38-58 taking 2 minutes => SUCCESS
 - lxc-build-ubuntu-source » quantal,amd64 #16: On 2014-04-01_21-38-58 taking 2 minutes => SUCCESS
 - lxc-build-ubuntu-source » saucy,amd64 #16: On 2014-04-01_21-38-58 taking 3 minutes => SUCCESS
 - lxc-build-ubuntu-source » trusty,amd64 #16: On 2014-04-01_21-38-58 taking 7 minutes => SUCCESS
 - lxc-dispatch-tests #98: On 2014-04-01_21-15-01 taking 23 minutes => SUCCESS
 - lxc-test-clang #81: On 2014-04-01_21-15-09 taking 19 minutes => SUCCESS
 - lxc-test-clang » amd64 #81: On 2014-04-01_21-15-09 taking 9 minutes => SUCCESS
 - lxc-test-clang » armhf #81: On 2014-04-01_21-15-09 taking 19 minutes => SUCCESS
 - lxc-test-clang » i386 #81: On 2014-04-01_21-15-09 taking 7 minutes => SUCCESS
 - lxc-test-gcc #93: On 2014-04-01_21-15-09 taking 23 minutes => SUCCESS
 - lxc-test-gcc » amd64 #93: On 2014-04-01_21-15-09 taking 9 minutes => SUCCESS
 - lxc-test-gcc » armhf #93: On 2014-04-01_21-15-09 taking 23 minutes => SUCCESS
 - lxc-test-gcc » i386 #93: On 2014-04-01_21-15-09 taking 8 minutes => SUCCESS

> I had a look at the test suite[1]. It covers quite a lot of the API,
> but only the most common cases and often only ensures that the calls
> don't fail: E. g. it doesn't check that a call like add_device_node()
> actually does what it promises. But while it doesn't cover many corner
> cases it certainly covers all the major workflows and functionality,
> especially the integration tests like "lxc-test-ubuntu".
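(As an aside, since the current-state script mentioned above isn't published
yet, here is a rough sketch of the approach it could take, assuming the
standard Jenkins JSON remote API; the job name and sample data below are made
up for illustration:)

```python
import datetime
import json

# Made-up sample of what Jenkins' JSON API returns for one job, e.g. from
# https://jenkins.linuxcontainers.org/job/lxc-test-gcc/api/json
#     ?tree=builds[number,timestamp,duration,result]
SAMPLE = json.loads("""
{"name": "lxc-test-gcc",
 "builds": [
  {"number": 101, "timestamp": 1396512788000,
   "duration": 6360000, "result": "SUCCESS"},
  {"number": 100, "timestamp": 1396426388000,
   "duration": 6000000, "result": "FAILURE"}]}
""")

def summarize(job):
    """Render each build of a Jenkins job as one human-readable line."""
    lines = []
    for build in job["builds"]:
        # Jenkins reports timestamps and durations in milliseconds.
        when = datetime.datetime.utcfromtimestamp(build["timestamp"] / 1000)
        minutes = build["duration"] // 60000
        lines.append(" - %s #%d: On %s taking %d minutes => %s"
                     % (job["name"], build["number"],
                        when.strftime("%Y-%m-%d_%H-%M-%S"),
                        minutes, build["result"]))
    return lines

for line in summarize(SAMPLE):
    print(line)
```

The real feed is per job, so a full report would iterate over the jobs listed
at the server's top-level /api/json and fetch each one the same way.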
> These also run in the autopkgtest which we intend to keep running for
> trusty after the release.

Right, and we actually use the exact same test script for the upstream test
runs as is run in adt.

> I did a spot check of about 10 recent commits, and none of them were
> accompanied by test updates (in particular tricky stuff like
> https://github.com/lxc/lxc/commit/0d9acb9 which applies to relatively
> subtle changes which can easily regress without noticing quickly).

We certainly could do a better job of getting our contributors to add
regression tests. That particular commit is tricky to test reliably as part
of an automated testsuite, though: the testsuite typically runs with stdout
and stderr redirected to a file and stdin closed (or similar), which tends to
confuse init systems that expect a tty/pts device instead. That's why all of
our tests currently use backgrounded start(), and so we can't easily cover
that particular commit, as it specifically applies only to foreground start.

> Do you have plans to also cover testing of the templates? Bugs like
> https://bugs.debian.org/743425 are a bit of a bummer to have if they
> hit a stable release (apparently that's not what happened in that
> case, though, it's just something which I ran into recently).

We'd certainly welcome test cases from our template maintainers. However, not
all templates work on all distros, and their arguments aren't standardized
either (though we've improved that somewhat recently, with -r and -a now
being almost universal).

> > Seeing the very large amount of SRUs we've been doing in the past
> > few releases, it's my belief that an MRE will save a lot of time to
> > Serge and I as well as the SRU team by simplifying the amount of
> > documentation required for our updates.
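(Coming back to the foreground-start testing problem described earlier: one
conceivable workaround, purely hypothetical and not something our testsuite
does today, would be to have the harness hand the foreground process a
pseudo-terminal. A sketch using only Python's stdlib, with a small `sh`
command standing in for an actual lxc-start invocation:)

```python
import os
import pty
import subprocess

def run_with_pty(argv):
    """Run argv with stdin/stdout/stderr attached to a fresh pty, so
    programs that insist on a terminal (as many init systems do when
    started in the foreground) see one even under a test harness."""
    master, slave = pty.openpty()
    proc = subprocess.Popen(argv, stdin=slave, stdout=slave,
                            stderr=slave, close_fds=True)
    os.close(slave)  # only the child keeps the slave end open
    chunks = []
    while True:
        try:
            data = os.read(master, 1024)
        except OSError:  # EIO once the child exits and the slave closes
            break
        if not data:
            break
        chunks.append(data)
    proc.wait()
    os.close(master)
    return proc.returncode, b"".join(chunks).decode(errors="replace")

# "[ -t 0 ]" is the shell's check for "is stdin a terminal" -- exactly the
# condition that fails when a testsuite redirects stdio to files.
rc, out = run_with_pty(["sh", "-c", "[ -t 0 ] && echo on-a-tty"])
```

With stdio redirected to a plain file the same command would exit non-zero;
under the pty it prints the marker and exits 0.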
> I can't say that the existing test coverage would be sufficiently
> reliable to use it as fully automated SRU testing, but for each new
> microrelease we should run a test plan for the most important
> scenarios. But with that I have no general objection, especially as
> the changes you intend to do to the 1.0 branch are bug fix only
> (contrary to other MREs).

I usually do extensive manual testing before tagging a point release
upstream, especially of any particularly tricky fix (rshared and unpriv for
1.0.0, cgmanager for 1.0.1). I'll make sure to document that in the SRU
tracking bug so that similar tests are run again against the -proposed
package before release.

> So I don't see verifying individual bugs for SRUs as the most
> important issue here, but regression testing. When doing that, and
> limiting this MRE to bug fixes only, I'm fine with the MRE.

Thanks. The main goal for us is really to avoid having to open dozens of LP
bugs for every point release. We'll typically link quite a few bugs with
every upload anyway, since most of our users are on Ubuntu and we usually get
LP bugs for the issues we fix upstream, but not having to create dummy bugs
for the rest of the changes will be a nice improvement.

> Thanks,
>
> Martin
>
> [1] Small chuckle: createtest.c creates a busybox container and says
> "trusty container". Totally not important for this mail, of course :)

Oh, nice catch, we'll need to fix that message :) Most of our tests used to
build a full Ubuntu container, which took an enormous amount of time for no
good reason; we recently switched to busybox wherever possible, but clearly
we missed that message :)

> --
> Martin Pitt | http://www.piware.de
> Ubuntu Developer (www.ubuntu.com) | Debian Developer (www.debian.org)
>
> --
> technical-board mailing list
> technical-board@lists.ubuntu.com
> https://lists.ubuntu.com/mailman/listinfo/technical-board

-- 
Stéphane Graber
Ubuntu developer
http://www.ubuntu.com