Steven Bosscher wrote:

On 4/12/07, Vladimir Makarov <[EMAIL PROTECTED]> wrote:
SPECFp2000 compilation time (user time):
machine  mainline  branch change
---------------------------------
x86_64   104.8s    117.7s  +12.3%
ppc64    312.3s    367.8s  +17.8%
ia64     377.6s    502.9s  +33.2%

Hi Vlad,

Thanks for testing this. Do you also have per-benchmark compilation
times, perhaps?

Not really. I don't collect per-benchmark times because the runtest
startup alone is about 0.4s (on ppc64) while a few FP tests compile in
only about 1.5s, so the startup overhead would swamp the numbers for
the small tests. If you are interested in analyzing the reason for the
slowdown, I'd recommend looking at fma3d. On ppc64 it compiles in
2m9.6s on the branch versus 1m43.7s on mainline at the last merge
point. On ia64 it compiles in 3m20.8s on the branch versus 2m9.6s on
mainline.
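For anyone who wants to measure it themselves, here is a minimal
sketch of timing one benchmark's compile directly, without the runtest
startup overhead (the source file name and compiler flags below are
placeholders, not the actual SPEC configuration):

  import resource
  import subprocess

  def user_compile_time(cmd):
      """Run a compile command and return the user CPU time it used."""
      before = resource.getrusage(resource.RUSAGE_CHILDREN).ru_utime
      subprocess.run(cmd, check=True)
      after = resource.getrusage(resource.RUSAGE_CHILDREN).ru_utime
      return after - before

  # Placeholder command line; substitute the real source and options.
  secs = user_compile_time(["gfortran", "-O2", "-c", "fma3d.f90"])
  print(f"user time: {secs:.1f}s")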
I think the longer compilation time is partly a result of the bigger
code generated on the branch. The code size difference is very hard to
analyze on x86_64 (because the same insn with different registers,
e.g. xmm0 vs. xmm8 or dx vs. r8, can have a different length) and on
ia64 (because the number of nops used to fill bundles varies). PPC64
is the best target for this comparison because all insns have the same
length, so you only need to compare insn counts.
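On ppc64 the whole comparison boils down to something like the sketch
below (the binary names are made up): take the text size reported by
binutils 'size' and, since every insn is 4 bytes, divide by 4 to get
an insn count:

  import subprocess

  def text_size(path):
      """Text segment size in bytes, from binutils 'size'."""
      out = subprocess.run(["size", path], capture_output=True,
                           text=True, check=True).stdout
      return int(out.splitlines()[1].split()[0])  # 2nd line, 1st column

  # Made-up file names for two builds of the same benchmark.
  branch = text_size("fma3d-branch")
  mainline = text_size("fma3d-mainline")
  # Assumes the text segment is all code (roughly true on ppc64).
  print(f"{(mainline - branch) / branch * 100:+.3f}%  "
        f"(~{abs(branch - mainline) // 4} insns difference)")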
Here is the code size difference:

1st column - change of the mainline text size relative to the branch
2nd column - text segment size on the branch
3rd column - text segment size on mainline at the merge point

PPC64
----------------CFP2000-----------------
-1.553%          29882          29418 168.wupwise
-1.904%          10925          10717 171.swim
-1.934%          17374          17038 172.mgrid
-1.672%          50717          49869 173.applu
-0.994%         587269         581429 177.mesa
-1.539%         230755         227203 178.galgel
-1.128%          18440          18232 179.art
-0.309%          20698          20634 183.equake
-1.094%          68741          67989 187.facerec
-0.780%         135440         134384 188.ammp
-1.024%          45301          44837 189.lucas
-0.425%        1005864        1001592 191.fma3d
-0.316%         870011         867259 200.sixtrack
-1.171%         139383         137751 301.apsi
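(The 1st column follows from the other two; e.g. for 168.wupwise,
(29418 - 29882) / 29882 * 100% = -1.553%, i.e. mainline is about 1.5%
smaller than the branch.)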


