> On Sep 25, 2015, at 3:25 AM, Raja Pullela <raja.pull...@citrix.com> wrote:
> 
> Please see inline comments..
> 
>> On Sep 25, 2015, at 1:47 AM, Sebastien Goasguen <run...@gmail.com> wrote:
>> 
>> 
>>> On Sep 24, 2015, at 9:08 PM, Raja Pullela <raja.pull...@citrix.com> wrote:
>>> 
>>> BVT report 09/23
>>> 
>>> simulator basic - 30%, earlier runs had 100% pass rate, failures need to 
>>> be analyzed
>> 
>> What’s earlier ? yesterday, last week, six months ago ?
> Earlier - yesterday's runs

Ok, so it is much worse than yesterday. I look forward to getting your analysis.

>> 
>>> Simulator adv - 50%, earlier runs had 100% pass rate, failures need to be 
>>> analyzed
>> 
>> Same here, what’s earlier ? 
> Earlier - yesterday's runs
>> 
>>> XS basic - 95.3% 
>>> XS Adv - 93.6% 
>>> XS eip - 86.7%
>> 
>> Is this good enough from your view for release, or do you want 100% ?
> We should get 100%
>> 
>>> KVM basic - 89.4%
>>> KVM Adv - Deployment issue, need to check and will update 
>>> KVM eip - 95.7%
>> 
>> Good so it’s running now.
> Yes, after the agent fixes this is back up

Was this fix identified as a blocker, was there a PR for it, and has it been 
merged?


The reason I am asking is that it is difficult to understand these numbers out 
of context.
Seeing the results gives some notion of the success rate, but we don’t really 
know what tests are run, or on what exact setup.

Also, you mentioned regression tests... what are those? We don’t run regression 
tests upstream, so I suppose these are internal Citrix tests...


>> 
>>> VMware Adv - deployment issue, need to check and update
>> 
