I took it for granted that the original writer meant a 29% reduction FOR THAT
PARTICULAR DATASET, not in overall CPU usage.  Maybe I was mistaken.

---
Bob Bridges, robhbrid...@gmail.com, cell 336 382-7313

/* If you suck at playing the trumpet, that's probably why. */

-----Original Message-----
From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> On Behalf Of 
Robert Prins
Sent: Sunday, January 19, 2025 10:07

<quote>
2 weeks ago I received the analysis data from a new client that wanted to
reduce their CPU consumption and improve their performance. They sent me the
statistical data from their z16's 10 LPARs: information about 89,000+ files. I
analyzed their data and found 2,000+ files *that could be improved* and would
save CPU when improved. *I pulled out 1 file to demonstrate a Proof of Concept
(POC) for the client. I had the client run the POC and it showed a 29%
reduction in CPU every time that file is used. The 29% did not include 3 other
major adjustments that would save an additional 14% CPU and cut the I/O by
75%.* This is just 1 file. The other files can save 3% to 52% of their CPU
every time they are used in BATCH or ONLINE.
</quote>

I've been a programmer on IBM since 1985, and the above doesn't make any sense
to me: how can changing just one file result in a 43% (29% + 14%) reduction in
CPU usage?

I've only ever used PL/I, and with it I did manage to make some improvements
to code, including reducing the CPU usage of a CRC routine by an even larger
amount, 99.7% (yes, ninety-nine-point-seven percent). But that was only
because the old V2.3.0 PL/I Optimizing compiler was absolute shite at handling
unaligned bit-strings. WTH can you change about a file to get the above
reduction in CPU?
