Charles,

In theory, you divide the rated SU/second by the number of processors, giving 
SU/processor/second, and adjust for "MP effect" overhead.  Similarly, you 
could use MIPS per processor, such that: 

273.8 MIPS/processor (2064-2C3) divided by 426.1 MIPS/processor (2094-722) 
equals 0.643; that is, the 2064-2C3 runs at roughly 64% of the 2094-722's 
per-processor speed.

Since CPU time scales inversely with speed, 0.146 seconds divided by 0.643 
equals roughly 0.227 seconds.

Subtle factors render the ratio less than exact, especially with very small 
values, but your tests should prove to be in the ballpark.  Test by averaging 
several runs and let us know how it turns out.
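If it helps, here is the same arithmetic as a short Python sketch. The 
function name is mine, the ratings are just the figures quoted in this 
thread, and the straight inverse-speed scaling ignores the MP-effect 
adjustment mentioned above:

# Ballpark CPU-time scaling between two processor models.
# Time scales inversely with per-processor speed; this ignores
# MP effect, cache geometry, and workload mix.
def scale_cpu_time(seconds_on_source, source_speed, target_speed):
    return seconds_on_source * (source_speed / target_speed)

# Using MIPS per processor (the approach above):
print(scale_cpu_time(0.146, 426.1, 273.8))    # ~0.227 seconds on the 2064-2C3

# Using the rated SU/second figures (your approach):
print(scale_cpu_time(0.146, 19778, 13378))    # ~0.216 seconds on the 2064-2C3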

db

-----Original Message-----
From: IBM Mainframe Discussion List [mailto:IBM-MAIN@LISTSERV.UA.EDU] On Behalf 
Of Charles Mills
Sent: Tuesday, July 17, 2012 9:53 AM
To: IBM-MAIN@LISTSERV.UA.EDU
Subject: Help with elementary CPU speed question

I have gotten dragged into a CPU performance question, a field I know little 
about.

I run a test on a 2094-722. It is rated at 19778 SU/second. The job consumes 
0.146 CPU seconds total.

I run the same job on a 2064-2C3. It is rated at 13378 SU/second. All other 
things being roughly equal, should I expect the job to consume 1.48 
(19778/13378) times as much CPU time, or about 0.216 CPU seconds?

Is my logic right, or am I off somewhere? I'm not worried about a millisecond 
or two; just the broad strokes.

Thanks,

Charles 


----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions, send email to 
lists...@listserv.ua.edu with the message: INFO IBM-MAIN
