On Fri, Feb 22, 2013 at 1:24 PM, Sowmya Krishnan
<sowmya.krish...@citrix.com> wrote:
> Hi,
>
> I've posted a test plan for tracking the performance numbers for the set of 
> List APIs which were optimized as mentioned in 
> https://issues.apache.org/jira/browse/CLOUDSTACK-527
> Test plan is here: 
> https://cwiki.apache.org/confluence/display/CLOUDSTACK/List+API+Performance+Test+Plan
>
> Please take a look and post comments if any.
>


Thanks for writing this up. I have a couple of questions for you.

I understand that you are running these tests and recording
performance, but it seems like what you are actually measuring is
time. Is this the time from when the query leaves the client until
the answer comes back? And is the client on the management server or
not?
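
To make the question concrete, here is a rough sketch of the kind of
client-side timing I have in mind (Python; it assumes the
unauthenticated integration API port is enabled on the management
server, and the host, port, and command are only placeholders):

    import time
    import urllib.request

    # Hypothetical endpoint: the management server's integration API
    # port (8096), which accepts unauthenticated requests when enabled.
    URL = ("http://localhost:8096/client/api"
           "?command=listVirtualMachines&response=json")

    start = time.time()
    with urllib.request.urlopen(URL) as resp:
        body = resp.read()
    elapsed = time.time() - start

    # This measures the full round trip as seen by the client, which
    # includes network time if the client is not on the management
    # server itself.
    print("listVirtualMachines took %.3f s (%d bytes)"
          % (elapsed, len(body)))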

I assume you are going to use the simulator rather than just a
populated DB? (If that isn't the case, perhaps you can share the DB
dump.)

Are you going to take a baseline from 4.0.{0,1}?

Can this test be written as a script that generates these statistics
as we get near a release, so we can make sure we don't regress?
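
Something along these lines is what I mean, purely as a sketch: the
command list, run count, and endpoint below are arbitrary placeholders
(same assumed integration port as above), but a small loop like this
could be run against each release candidate and the numbers compared:

    import time
    import urllib.request

    # Hypothetical settings; adjust for the actual test environment.
    BASE = "http://localhost:8096/client/api?response=json&command="
    COMMANDS = ["listVirtualMachines", "listRouters",
                "listHosts", "listVolumes"]
    RUNS = 10

    def time_call(command):
        start = time.time()
        urllib.request.urlopen(BASE + command).read()
        return time.time() - start

    for cmd in COMMANDS:
        samples = [time_call(cmd) for _ in range(RUNS)]
        avg = sum(samples) / len(samples)
        # Report min/avg/max so runs can be compared release to release.
        print("%-22s min=%.3fs avg=%.3fs max=%.3fs"
              % (cmd, min(samples), avg, max(samples)))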

Are we assuming there will be no slow or long-running queries? If
there are, it would be interesting to see what they are and whether
there are database issues we can work on further.
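
If we do want to catch those, one quick way (again just a sketch, and
it assumes a MySQL user with the privileges to change global settings)
is to turn on the slow query log with TABLE output before the run and
dump anything over a threshold afterwards:

    import MySQLdb  # assumes the python-mysqldb / mysqlclient driver

    # Hypothetical credentials; point this at the management server's DB.
    conn = MySQLdb.connect(host="localhost", user="root", passwd="password")
    cur = conn.cursor()

    # Log queries slower than 1 second into the mysql.slow_log table.
    cur.execute("SET GLOBAL slow_query_log = 'ON'")
    cur.execute("SET GLOBAL log_output = 'TABLE'")
    cur.execute("SET GLOBAL long_query_time = 1")

    # ... run the List API tests here, then see what got logged ...

    cur.execute("SELECT start_time, query_time, sql_text "
                "FROM mysql.slow_log ORDER BY query_time DESC LIMIT 20")
    for start_time, query_time, sql_text in cur.fetchall():
        print(start_time, query_time, sql_text)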

What counts as 'failure' for this test? (Slower than 4.0.x? Less than
n percent faster than 4.0.x?)

--David
