On Tue, Feb 01, 2011, Elazar Leibovich wrote about "Re: New Freecell Solver gcc-4.5.0 vs. LLVM+clang Benchmark":
> Long story short, he claims there that modern computers are now highly
> non-deterministic, he demonstrated 20% running time variation by the same
> JVM running the same code.
I haven't listened to that talk, so I'm just speaking from my own experience. It depends on what you're benchmarking, and why.

Shlomi was benchmarking a single-process, CPU-intensive program, and garbage collection was not involved. Everything in his program was perfectly deterministic. Indeed, the computer around it is *not* deterministic - other processes might randomly decide to do something: he might get mail in the middle, "updatedb" might start, he might be doing some interactive work at the same time, some clock application might be updating the display every second, something might suddenly decide to access the disk, or whatever.

But if he runs his deterministic application 5 times and gets 5 different durations, each duration is composed of the deterministic run time of the application plus a random delay caused by other things on the system. The *minimum* of these 5 durations is the one that had the smallest random delay, and is thus closest to the "true" run time. Presumably, if you ran the application on a machine which is as idle as humanly possible, you'd get something close to this minimum.

It is true that when the application itself is non-deterministic, or when it closely interacts with other non-deterministic parts of the system (e.g., it is I/O-intensive, or depends on network delays), averaging might make more sense.

Nadav.

-- 
Nadav Har'El                        | Tuesday, Feb 1 2011, 27 Shevat 5771
n...@math.technion.ac.il            |-----------------------------------------
Phone +972-523-790466, ICQ 13349191 | Windows-2000/Professional isn't.
http://nadav.harel.org.il           |
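To make the minimum-vs-average point concrete, here is a minimal sketch (not from the original mail) of the kind of measurement loop described above: it times a CPU-bound command several times, then reports the minimum duration, which should be closest to the "true" run time, alongside the mean. The command and file name ("./fc-solve", "deal-24.txt") are placeholders, not the actual benchmark setup.

    #!/usr/bin/env python3
    # Sketch: benchmark a deterministic, CPU-bound command by running it
    # several times and taking the *minimum* wall-clock duration - the run
    # with the least random delay from the rest of the system.
    import statistics
    import subprocess
    import time

    def time_command(cmd, runs=5):
        """Run `cmd` `runs` times and return a list of wall-clock durations."""
        durations = []
        for _ in range(runs):
            start = time.perf_counter()
            subprocess.run(cmd, check=True,
                           stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
            durations.append(time.perf_counter() - start)
        return durations

    if __name__ == "__main__":
        # "./fc-solve deal-24.txt" is only a placeholder benchmark command.
        d = time_command(["./fc-solve", "deal-24.txt"], runs=5)
        print("min  %.3fs  (closest to the 'true' run time)" % min(d))
        print("mean %.3fs  (more meaningful for non-deterministic workloads)"
              % statistics.mean(d))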