Erik Cederstrand wrote:
Hi
I'd like to send a small update on my progress on the Performance
Tracker project.
I now have a small setup of a server and a slave chugging along,
currently collecting data. I'm following CURRENT and collecting results
from super-smack and unixbench.
The projec
Kris Kennaway wrote:
This is coming along very nicely indeed!
One suggestion I have is that, as more metrics are added, it becomes
important to have an "at a glance" overview of changes so we can monitor
for performance improvements and regressions across many workloads.
One way to do this would be a matrix of each metric with its change
compared to recent samples, e.g. you could do a Student's t comparison of
today's numbers with those from yesterday, or from a week ago, and
colour-code those that show a significant change.
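The comparison Kris describes can be sketched in a few lines. Note this is only an illustration: the Welch form of the t-test, the fixed threshold, and the benchmark numbers are my own choices, not anything from the actual tracker.

```python
# Sketch of the "at a glance" matrix idea: compare today's benchmark
# samples against yesterday's with a Welch t-statistic and flag metrics
# whose change looks significant. All numbers here are invented.
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

def flag(today, yesterday, threshold=2.0):
    """Rough significance flag: |t| > threshold (roughly 95% for decent n)."""
    t = welch_t(today, yesterday)
    if t > threshold:
        return "improved"
    if t < -threshold:
        return "regressed"
    return "no significant change"

# Hypothetical super-smack queries/second samples (higher = better):
yesterday = [1510.2, 1498.7, 1505.1, 1512.9, 1500.3]
today     = [1401.5, 1395.8, 1410.2, 1399.0, 1404.7]
print(flag(today, yesterday))  # -> regressed
```

A real matrix page would run this per metric per workload and colour each cell by the flag; a proper implementation would also want to pick the threshold from the t-distribution for the actual sample sizes.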
Robert Watson wrote:
This looks really exciting!
Do you plan to add a way so that people can submit performance data?
I.e., if I set up my own test box and want to submit a result once a
week for that, will there be a way for me to get set up with a
username/password, submit configuration i
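For the kind of remote submission Robert asks about, a client on a test box might look roughly like the sketch below. The endpoint URL, JSON field names, and basic-auth scheme are all invented for illustration; the tracker's real submission interface (if one is added) may look nothing like this.

```python
# Hypothetical sketch of a weekly result submission from a remote test
# box. URL, credentials, and payload fields are made up.
import base64
import json
import urllib.request

def build_submission(url, user, password, results):
    """Build an authenticated POST request carrying one batch of results."""
    body = json.dumps(results).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

# One data point from a hypothetical test box:
results = {
    "host": "testbox1.example.org",
    "date": "2008-01-23",
    "benchmark": "unixbench",
    "metric": "float",
    "samples": [123.4, 125.1, 124.0],
}
req = build_submission("http://tracker.example.org/submit",
                       "erik", "secret", results)
# urllib.request.urlopen(req) would actually send it; omitted here.
```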
Robert Watson wrote:
I think it's best if participating machines supply data regularly for
an extended period of time. Single or infrequent data points for a
specific configuration don't make much sense. We need to compare
apples to apples.
Yes -- I was mostly thinking about backdating in order to play
"catchup" when a new benchmark is introduced.
Robert Watson wrote:
On Wed, 23 Jan 2008, Erik Cederstrand wrote:
I agree that there's a need for an overview and some sort of
notification. I've been collecting historical data to get a baseline
for the statistics, and I'll try to see what I can do over the next few weeks.
A thumbnail page of gr
Kris Kennaway wrote:
Robert Watson wrote:
Yes -- I was mostly thinking about backdating in order to play
"catchup" when a new benchmark is introduced.
One thing I am looking at is how to best create a library of world
tarballs that can be used to populate a nfsroot (or hybrid of periodic
t
Kris Kennaway wrote:
The project still needs some work, but there's a temporary web
interface to the data here: http://littlebit.dk:5000/plot/. Apart from
the plotting it's possible to compare two dates and see the files that
have changed. Error bars are 3*standard deviation, for the points wi
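For reference, an error bar of the kind described (3 times the standard deviation around each plotted point) can be computed from a day's samples like this; the sample values are invented:

```python
# Compute the plotted point and its 3*standard-deviation error bar
# from one day's samples, as described above. Numbers are invented.
from statistics import mean, stdev

def point_with_error_bar(samples):
    """Return (mean, half-width) where half-width = 3 * sample stddev."""
    return mean(samples), 3 * stdev(samples)

samples = [101.0, 99.5, 100.2, 100.8, 99.9]
center, half = point_with_error_bar(samples)
print(f"{center:.2f} +/- {half:.2f}")
```

With only a handful of samples per point, 3 sigma is a fairly loose band, which is presumably why points with too few samples were excluded.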
Kris Kennaway wrote:
P.S. If I understand correctly, the float test shows a regression?
The metric is calculations/second, so higher = better?
The documentation on Unixbench is scarce, but I would think so.
Interesting. Some candidate changes from 2007-10-02:
Modified files:
contrib