But,...
While MOD'ing onto an existing file can work, in a production process it can create a restart nightmare after a process failure unless you plan for restart up front. I'd highly recommend either taking a backup for recovery before each append step, or copying the old file to a new one and appending the additional records to the new file as part of that copy process.
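A minimal sketch of that copy-then-append idea, in Python for illustration only (file names and the record format are hypothetical; on z/OS you would do the same thing with a copy utility step in the job):

```python
import os
import shutil

def append_with_recovery(master, new_records, workdir="."):
    """Copy the master file to a work file, append the new records to the
    copy, then swap the copy in as the new master.  If the job fails at any
    point before the final rename, the original master is untouched and the
    whole step can simply be rerun from the top."""
    work = os.path.join(workdir, os.path.basename(master) + ".new")
    shutil.copyfile(master, work)           # copy the old file to a new file
    with open(work, "a") as f:
        for rec in new_records:             # append the additional records
            f.write(rec + "\n")
    os.replace(work, master)                # the rename is the commit point
```

The point is that the append never happens in place: until the final rename, a rerun starts from an intact original.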

My personal preference for a report-retention requirement like this is to come up with some technique that allows the use of variable dataset names rather than trying to work around the limitations of GDGs. Or, if all the reports have the same DS attributes and are relatively small, it may be advantageous to use a PDS or PDS/E with each report as a separate member, using dynamic allocation of a variable member name (possibly variable DS name also) from within the report writing program. With variable names, you will most likely also need your own techniques for purging reports that are no longer needed.
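To make the variable-name idea concrete, here is one possible scheme sketched in Python (the naming pattern, high-level qualifier, and retention rule are all hypothetical; the real naming and purge would be done in the report-writing program and a housekeeping job):

```python
import datetime
import re

def report_name(hlq="MY.REPORTS", now=None):
    """Build a unique, sortable dataset-style name from a timestamp,
    e.g. MY.REPORTS.D20120510.T194100."""
    now = now or datetime.datetime.now()
    return f"{hlq}.D{now:%Y%m%d}.T{now:%H%M%S}"

def to_purge(names, keep_days, today):
    """Return the names whose date qualifier falls outside the
    retention window -- the 'roll your own purge' part."""
    cutoff = today - datetime.timedelta(days=keep_days)
    old = []
    for n in names:
        m = re.search(r"\.D(\d{8})\.", n)
        if m and datetime.datetime.strptime(m.group(1), "%Y%m%d").date() < cutoff:
            old.append(n)
    return old
```

Because the date is embedded in the name, the purge job needs nothing but a catalog listing and the retention count.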

It is not necessary to roll your own SVC99 interface to do dynamic allocation: just use facilities accessible from REXX or other scripting languages, or use a free subroutine like CBT's DYNALLOC from compiled languages.
  Joel C Ewing

On 05/10/2012 07:41 PM, Cris Hernandez #9 wrote:
does it have to be a gdg?

what about mod'ing onto the file then emptying it out when it's processed?

back it up to a gdg, per that batch-job-at-regular-intervals suggestion.

________________________________
  From: "Donnelly, John"<[email protected]>
To: [email protected]
Sent: Thursday, May 10, 2012 5:36 PM
Subject: Re: ### of GDG Entries

Thank you all

John Donnelly
Texas Instruments SVA
2900 Semiconductor Drive
Santa Clara, CA 95051
408-721-5640
408-470-8364 Cell
[email protected]


-----Original Message-----
From: IBM Mainframe Discussion List [mailto:[email protected]] On Behalf Of 
Linda
Sent: Thursday, May 10, 2012 2:28 PM
To: [email protected]
Subject: Re: ### of GDG Entries

Hi John,

I have had a similar GDG 'thing'.  It would be helpful to know more details...

In my case, we were receiving a widely varying number of reports that we were 
to back up and print for our customer. We scheduled the print job to run every 8 
hours, and the operations staff fit the actual printing in to meet customer needs.

Would that work for you?

Another option might include merging the generations to another dataset.
HTH,

Linda

Sent from my iPhone

On May 10, 2012, at 1:12 PM, "Donnelly, John"<[email protected]>  wrote:

We have a business application that creates literally 100s of GDGs a day; 
please don't ask.
Is there any way to create, or pretend to create, a GDG base with a limit greater than 255...

John Donnelly
Texas Instruments SVA
2900 Semiconductor Drive
Santa Clara, CA 95051
408-721-5640
408-470-8364 Cell
[email protected]

...
--
Joel C. Ewing,    Bentonville, AR       [email protected] 

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
