I would look into partitioning the tests into batches, so that keystone
tests are in earlier batches and fine-grained tests are in later batches.

I would also pull all the fast tests into earlier batches.

If you collect the average execution time for each test case, you can sort
tests by increasing execution time and fail fast (a regression is a
regression... though it depends... I find it can be handy to know that
there is only one failing test rather than 500)

Similarly, any tests that failed last time, irrespective of how long they
take to run, should probably be run first.
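The two ordering rules above (last run's failures first, then cheapest first) can be sketched roughly like this — the `TestCase` type and its fields are hypothetical, standing in for whatever per-test metadata you record:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Set;

// Hypothetical per-test record -- the names are illustrative, not a JUnit API.
class TestCase {
    final String name;
    final double avgMillis; // average execution time from previous runs

    TestCase(String name, double avgMillis) {
        this.name = name;
        this.avgMillis = avgMillis;
    }
}

class TestOrdering {
    // Put last run's failures first, then everything else by increasing
    // average execution time, so the suite fails as fast as possible.
    static List<TestCase> prioritize(List<TestCase> tests, Set<String> failedLastRun) {
        List<TestCase> ordered = new ArrayList<>(tests);
        ordered.sort(Comparator
                .comparing((TestCase t) -> !failedLastRun.contains(t.name)) // false < true: failures first
                .thenComparingDouble(t -> t.avgMillis));
        return ordered;
    }
}
```

Feed that ordered list to whatever runner you use and abort on the first failing batch.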

Armed with per-test coverage data, per-test execution time, and the set of
failing tests from the last run, it should be possible to batch your
regression suite into say 4-8 batches and fail based on the results from a
batch.

The first batch should be something like the 15% fastest tests plus any
failing tests from last time plus make up the rest to give the highest
coverage with what remains.

I'd go with a 3:2 split for execution time to coverage... that's purely a
guess though. So, e.g., if your batch is 1000 tests, pick the top 600 tests
ranked by shortest execution time, sum their coverage, and then pick the 400
tests that best fill the gaps in coverage (irrespective of execution time)
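A minimal sketch of that 3:2 split, assuming you can dump per-test timing and per-test coverage (as sets of covered line or branch ids) from your coverage tool — all names here are illustrative:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

class BatchBuilder {
    // avgMillis: test name -> average runtime; coverage: test name -> covered line ids.
    static List<String> buildFirstBatch(Map<String, Double> avgMillis,
                                        Map<String, Set<Integer>> coverage,
                                        int batchSize) {
        int byTime = (int) Math.round(batchSize * 0.6); // the 3 in the 3:2 split

        // Step 1: the fastest 60% of the batch budget.
        Set<String> batch = avgMillis.keySet().stream()
                .sorted(Comparator.comparingDouble(avgMillis::get))
                .limit(byTime)
                .collect(Collectors.toCollection(LinkedHashSet::new));

        Set<Integer> covered = new HashSet<>();
        for (String t : batch) covered.addAll(coverage.getOrDefault(t, Set.of()));

        // Step 2: greedily fill the remaining 40% with whichever test covers
        // the most still-uncovered lines, irrespective of execution time.
        while (batch.size() < batchSize) {
            String best = null;
            int bestGain = 0;
            for (Map.Entry<String, Set<Integer>> e : coverage.entrySet()) {
                if (batch.contains(e.getKey())) continue;
                Set<Integer> gain = new HashSet<>(e.getValue());
                gain.removeAll(covered);
                if (gain.size() > bestGain) {
                    bestGain = gain.size();
                    best = e.getKey();
                }
            }
            if (best == null) break; // no remaining test adds coverage
            batch.add(best);
            covered.addAll(coverage.get(best));
        }
        return new ArrayList<>(batch);
    }
}
```

The greedy fill is the classic set-cover heuristic; it won't be optimal, but it's cheap and good enough for carving up a nightly suite.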

If you have a lot of historical data, you would identify brittle tests and
fragile tests...

* brittle tests are the tests that developers keep on breaking... they
should be executed in the early batches... a brittle test is one that
fails after a commit.

* fragile tests are the tests that break for no reason... usually
indicative of a badly written test (either not detecting the error
situation reliably or not detecting the non-error situation reliably)... a
fragile test is one where the test fails for no reason whatsoever... e.g.
your regression suite runs every evening, but nobody made any commits
between Friday evening and Monday morning... the test passed on Friday,
failed on Saturday and passed again on Sunday... with no commit activity...
that is a fragile test... there is something that needs investigation...
for instance it could be a subtle bug that only occurs on the fifth of
October when October has 5 Sundays (which was due to invalid test data
having assumed that there were four Sundays in October, which then led to
uncovering a rake of incorrect date assumptions littered throughout the
code that the developer who wrote that test had made)... or it could be a
race condition in multi-threaded code.

Put the brittle tests in the first batch... they are a canary... put the
fragile tests in the last batch.
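Given run history, the distinction boils down to whether a failure follows a commit. A rough classification sketch, where each run record is just a pair of flags (hypothetical format — substitute whatever your CI actually logs):

```java
import java.util.List;

class TestHistoryClassifier {
    // Each run record: {failed, hadCommitsSinceLastRun}.
    // A test that only ever fails right after commits is brittle; a test that
    // fails even when nothing changed is fragile. The threshold of 3 brittle
    // failures is an arbitrary guess -- tune it against your own history.
    static String classify(List<boolean[]> runs) {
        int failsAfterCommit = 0, failsWithoutCommit = 0;
        for (boolean[] run : runs) {
            if (!run[0]) continue; // the run passed
            if (run[1]) failsAfterCommit++;
            else failsWithoutCommit++;
        }
        if (failsWithoutCommit > 0) return "fragile"; // failed with no code change
        if (failsAfterCommit >= 3) return "brittle";  // developers keep breaking it
        return "normal";
    }
}
```

Run this over your history periodically and use the labels to assign batch positions: brittle tests to the first batch, fragile tests to the last.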

On 12 August 2013 05:42, Yana K <yanak1...@gmail.com> wrote:

> Thanks Mark. This is the parallelization path. Besides, are there any
> other advices/best practice tips on how to "manage" such a fast-growing
> regression suite?
>
> Yana
>
>
>
> On Fri, Aug 9, 2013 at 6:34 PM, Mark Waite <markwa...@yahoo.com> wrote:
>
>> There are several levels at which you could run your JUnit tests in
>> parallel.  Most of them don't involve anything with Jenkins, other than
>> using Jenkins to start them executing.
>>
>> For example
>>
>>    - Use JUnit 4.7 to run the tests in parallel, see [1] and [2]
>>    - Use gradle to run your tests, see [3]
>>    - Use maven with JUnit 4.7 parallel support [4]
>>    - Use the ant <parallel> task to run multiple unit test processes
>>
>> [1]
>> http://stackoverflow.com/questions/5529087/how-to-make-junit-test-cases-execute-in-parallel
>> [2]
>> http://stackoverflow.com/questions/423627/running-junit-tests-in-parallel
>> [3]
>> http://stackoverflow.com/questions/7337886/run-junit-tests-in-parallel
>> [4]
>> http://maven.apache.org/surefire/maven-surefire-plugin/examples/junit.html
>>
>>
>> On Friday, August 9, 2013 5:19:40 PM UTC-6, Yana K wrote:
>>>
>>> Hi
>>> Could anyone please provide some idea on parallelizing a huge regression
>>> suite using Jenkins. We have over 25000 regression tests that run each day
>>> - so its taking a huge time to run them. We are using JUnit. And the tests
>>> are testing several web services.
>>> Thanks
>>> Yana
>>>
>>  --
>> You received this message because you are subscribed to a topic in the
>> Google Groups "Jenkins Users" group.
>> To unsubscribe from this topic, visit
>> https://groups.google.com/d/topic/jenkinsci-users/85VN4mLssqE/unsubscribe
>> .
>> To unsubscribe from this group and all its topics, send an email to
>> jenkinsci-users+unsubscr...@googlegroups.com.
>>
>> For more options, visit https://groups.google.com/groups/opt_out.
>>
>>
>>
>
