On 14/03/13 22:41, Dale wrote:
> Grant Edwards wrote:
>> On 2013-03-14, Dale <rdalek1...@gmail.com> wrote:
>>
>>> I was wondering. Has anyone ever seen where a test has been done to
>>> compare the speed of Gentoo with other distros? Maybe Gentoo compared
>>> to Redhat, Mandrake, Ubuntu and such?
>>
>> I just did a test, and they're all the same.
>>
>> CDs/DVDs of various distros dropped from a height of 1m all hit the
>> floor simultaneously [there are random variations due to aerodynamic
>> instability of the disk shape, but it's the same for all distros]. If
>> launched horizontally with spin to provide attitude stability (thrown
>> like a frisbee), they all fly the same.
>>
>> The point being, you're going to have to define "speed".
>>
>> Does speed refer to
>>
>> Installation time?
>>
>> Boot time?
>>
>> Linpack?
>>
>> Dhrystone?
>>
>> Whetstone?
>>
>> Time for me to figure out how to fix a configuration problem?
>>
>> Time to do an update on a machine that's been unplugged for a year?
>>
>> Time to produce a packaged version of some random C program that
>> comes with a Makefile that uses autotools?
>>
>> Time for a reported bug to get fixed?
>
> OK. It appears not very many can figure out what I asked for. So, let
> me spell it out for those who are challenged. LOL ;-) Read some
> humor into that, OK?
>
> Install an OS. Run tests on a set of programs and record the time it
> takes to complete a certain task. The more tasks, the better.
>
> Then install another OS on the same hardware. Run the same tests on
> the same set of programs and record the time it takes to complete the
> same tasks.
>
> The object of this is: does Gentoo, with the customization it allows,
> run faster than some binary install that does NOT allow those
> controls? In other words, can a Gentoo-based install perform more
> efficiently than a binary-based install like Redhat, Ubuntu or some
> other distro?
>
> I am NOT concerned about compile times or the install itself.
>
> Does that put the dots closer together for the challenged ones? ROFL
>
> Dale
>
> :-)  :-)

The point of the challenged ones was that while we can take
measurements like these, it's rather meaningless to do so. The result
will be different for every single person out there depending on their
configuration, USE, CFLAGS and who knows what else.
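(For anyone who actually wants to run the kind of test Dale describes,
a minimal sketch in Python 3 might look like the one below. The command
name and the repeat count are placeholders, not anything from this
thread; substitute the task you care about, run the same script on each
install on the same hardware, and compare the averages.)

    #!/usr/bin/env python3
    # Minimal timing sketch. "my_benchmark_task" is a placeholder
    # command, not anything from this thread.
    import statistics
    import subprocess
    import time

    COMMAND = ["my_benchmark_task"]  # placeholder: the task to time
    RUNS = 10                        # arbitrary repeats to smooth noise

    times = []
    for _ in range(RUNS):
        start = time.perf_counter()
        subprocess.run(COMMAND, check=True)  # run the task to completion
        times.append(time.perf_counter() - start)

    print("runs :", RUNS)
    print("mean : %.3fs" % statistics.mean(times))
    print("stdev: %.3fs" % statistics.stdev(times))

Even then, the numbers only tell you about that one task, on that one
configuration, which is exactly the problem.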
I can compile a package with support for 3 different DEs, a few WMs,
oss and alsa, and about a billion things I will never use. Does that
make for a more meaningful test, or a less meaningful one, than running
the same test with no flags whatsoever? There is no correct answer; it
varies on a per-user basis.

The most meaningful measurements we could take would probably be
between different USE flag configurations. Maybe we can say that
package 'foo' with certain USE and CFLAGS runs in less average time
than the same package on a distro Bar. In my opinion, it would be far
more meaningful to measure the effect of different USE flags on the
same package, *in relative time*, on the same system. That would give
us a better idea of the impact of each flag, as opposed to the very
limited view of 'package foo with certain specific USE flags runs 10ms
faster than the same package on the same hardware on a binary
distribution'.

If you still want such measurements and you want them to be somewhat
meaningful to you, it is you who will have to take them. Unless there
are some gross inconsistencies in run times on different distributions,
we have no use for such measurements.

Everyone understood what you asked for. It's _you_ who misunderstood
their explanation of why it's meaningless to ask such a question in the
first place.

--
Mateusz K.