OPT(0) burns that much more CPU? Is this on all compiles, or many, or just a few of them? If compiles are that bad using OPT(0), what will an OPT(2) do? We're just starting our install of 6.3, going from 4.2, and this sounds like something we need to be aware of.
Thanks,
Rex

-----Original Message-----
From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> On Behalf Of ste...@copper.net
Sent: Wednesday, May 20, 2020 5:08 PM
To: IBM-MAIN@LISTSERV.UA.EDU
Subject: [External] Re: What crashing COBOL systems reveal about applications maintenance -- GCN

We set up for OPT(1) because IBM said that was the thing to do initially. We've only recently been told to go to OPT(2).

We've also run into an interesting issue: COBOL 6.2 compiles using OPT(0) take more than 10x the CPU of the same compiles under 4.2. I don't recall being told that we would see that level of CPU burn when planning capacity for migrating to 6.2.

Regards,
Steve Thompson

--- frank.swarbr...@outlook.com wrote:

From: Frank Swarbrick <frank.swarbr...@outlook.com>
To: IBM-MAIN@LISTSERV.UA.EDU
Subject: Re: [IBM-MAIN] What crashing COBOL systems reveal about applications maintenance -- GCN
Date: Wed, 20 May 2020 21:28:33 +0000

We use OPT(1). Probably for no good reason. (And it was my decision, meaning it's easy enough to change!)

________________________________
From: IBM Mainframe Discussion List <IBM-MAIN@LISTSERV.UA.EDU> on behalf of Tom Ross <tmr...@stlvm20.vnet.ibm.com>
Sent: Wednesday, May 20, 2020 3:19 PM
To: IBM-MAIN@LISTSERV.UA.EDU <IBM-MAIN@LISTSERV.UA.EDU>
Subject: Re: What crashing COBOL systems reveal about applications maintenance -- GCN

>Suppose that they took a group of programmers and got the production online
>programs to all compile with COBOL 6.2 and OPT(1). Would they see a
>significant reduction in MSUs? Assuming they are running on z14s minimally?

I sure hope no one is using OPT(1) with 3rd generation COBOL! IBM expects all users to compile with OPT(2) for production performance. I am honestly not sure why we shipped OPT(1). Users should use OPT(0) if they want more straightforward debugging (no optimizations), then after unit test compile with OPT(2) for performance, and never use OPT(1). Alternatively, they could compile with OPT(2) for debugging and get used to odd things like statements getting moved or deleted while debugging.

Cheers,
TomR >> COBOL is the Language of the Future! <<
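As a concrete illustration of the OPT(0)-for-unit-test, OPT(2)-for-production approach Tom describes, the two compiles might be driven by CBL (PROCESS) compiler-directing statements along these lines. This is only a minimal sketch: the choice of ARCH(12) (targeting z14, the minimum hardware mentioned above) and the pairing of TEST/NOTEST with the OPT levels are illustrative assumptions, not anything a poster specified.

Unit-test / debug build (no optimization, debugging data generated):

       CBL OPT(0),TEST

Production build (full optimization, tuned for z14 or later):

       CBL OPT(2),ARCH(12),NOTEST

The same options could instead be supplied on the compile step's PARM string or set as installation defaults; the point is simply that only the option string changes between the debug and production compiles, not the source.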
---------------------------------------------------------------------- For IBM-MAIN subscribe / signoff / archive access instructions, send email to lists...@listserv.ua.edu with the message: INFO IBM-MAIN