Hi all,

I have checked the PVT reports after r1400866, and the results are becoming
stable except for the 3 issues [1][2][3] referred to by Linyi. Maybe the new
code delivery has fixed the instability. I suggest we wait for the PVT
results of the following revisions and check their status. If the problem
still exists, we can follow up on it then.
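
Regarding the 28% vs. ~7.7% gap Herbert mentions below: the per-run figure
can be reproduced directly from the totals in [4] and [5] of the quoted
mail. A minimal sketch in Python (the 104 s / 96 s values are taken from
those runs):

    # Percent improvement computed from the per-run totals quoted below.
    old_total = 104.0  # seconds, AOO341 (r1372282), run [4]
    new_total = 96.0   # seconds, r1407366, run [5]
    improvement = (old_total - new_total) / old_total * 100
    print("%.1f%%" % improvement)  # -> 7.7%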


On Tue, Nov 20, 2012 at 2:58 PM, Linyi Li <lilinyi921...@gmail.com> wrote:

> Hi Herbert,
>
> > Assuming you mean the officially released AOO341 as "milestone release
> > build" and with "trunk build" you mean the snapshot builds [1] or the
> > nightly builds [2] of trunk,
>
>
> By "trunk build" I mean the buildbot builds:
> http://ci.apache.org/projects/openoffice/install/
> By "milestone release build" I mean r1400866:
> https://cwiki.apache.org/confluence/display/OOOUSERS/Development+Snapshot+Builds
> By "341 release build" I mean r1372282:
> http://www.openoffice.org/download/index.html
>
> > then the biggest differences are probably:
>
> > - the nightly builds have no binfilter component
> > - most dev-snapshot and all nightly builds don't have system-integration
> > - new features / bugfixes / other changes in trunk
> >
> > [1] http://s.apache.org/aoo_devsnaps
> > [2] http://ci.apache.org/projects/openoffice
> >
> >
> Given the above differences, do you think there will be any performance
> difference between the milestone release build and the trunk build?
>
>
>
> > As for the numbers themselves, I'm not sure how to interpret them, e.g.
> > the dashboard indicates that the PVT test named "savePlainXLS" has
> > improved by 28% [3] on Linux32 between AOO341 and r1407366, whereas the
> > individual runs indicate a performance improvement from 104 sec [4] to
> > 96 sec [5], which is about 7.7%. There are similar observations for
> > other individual test results.
> >
> > [3] http://people.apache.org/~liuzhe/testdashboard/#pvt_gui_Benchmark
> > [4] http://people.apache.org/~liuzhe/test/341m1%28Build:9593%29@2012-08-13@1372282@pvt.gui.Benchmark@Ubuntu-12.04-i386@aoopvt2/result.html
> > [5] http://people.apache.org/~liuzhe/test/350m1%28Build:9611%29@2012-11-10@1407366@pvt.gui.Benchmark@Ubuntu-12.04-i386@aoopvt2/result.html
> >
> >
> The time in [3] is the average time of each test scenario; the unit is
> milliseconds. I ran each scenario 8 times and took the average, in case
> any single run was unstable. [4][5] are the total times of one test
> method, i.e. from starting OO to closing OO for 8 runs of each scenario;
> the unit is seconds. So the total time is not the exact time of each
> test scenario.
>
>
> > Another question that worries me is whether the tests are run with the
> > test framework from the same revision they are testing, or, if they are
> > all run with the same test framework, which revision that is.
> >
> > Herbert
> >
> >
> All the tests are run with the same test framework. The revision number may
> be r1372282.
>
> --
> Best wishes.
> Linyi Li
>
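
P.S. Following Linyi's explanation of average vs. total time above, here
is a small illustration (Python, with made-up numbers) of why the
per-scenario percentage on the dashboard and the percentage computed from
the run totals need not match: the totals include the fixed cost of
starting and closing OO on every run, which dilutes the relative
improvement.

    # Hypothetical numbers, for illustration only.
    runs = 8
    overhead = 10.0      # s per run for starting/closing OO (assumed)
    scenario_old = 3.0   # assumed average scenario time before (s)
    scenario_new = 2.16  # assumed average scenario time after (s)

    total_old = runs * (overhead + scenario_old)  # 104.0 s
    total_new = runs * (overhead + scenario_new)  # 97.28 s

    print(round((scenario_old - scenario_new) / scenario_old * 100, 1))  # 28.0
    print(round((total_old - total_new) / total_old * 100, 1))           # 6.5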
