At 15:55 -0500 on 02/23/2012, Tony Harminc wrote about Re: JES2
OFFLOAD - requesting help, ideas, etc.:
The latter is what would result from using my catalogued tape dataset
suggestion, if it works at all. There would be multiple, unrelated
offload datasets that just happen to be on the same physical volume.
Presumably you would have to invent a naming scheme so that you would
unload to (and potentially reload from) the right places. I also
gather that JES2 supports a max of only 8 offload devices, but it
looks as though you can dynamically change the dsname, so perhaps you
could stick with one device and update the name for each day.
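Rotating the dataset name on a single device would be done with JES2 operator commands. A hedged sketch (exact operands vary by JES2 level, and the dataset name here is invented for illustration):

```
$P OFFLOAD(1)                            drain the offload device
$T OFFLOAD(1),DSN=JES2.OFFLOAD.D120224   point it at today's dataset
$S OFFLOAD(1),TYPE=TRANSMIT              restart the transmit side
```

Check the $T OFFLOAD(n) operands against the JES2 Commands book for your release before relying on this.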
Make the dataset name a GDG and use it on one of the devices
(the one you will use to read the tape back in) - code it as DSN(0).
Output to another tape/file with DSN OFFLOAD(+1), and then copy
DSN(0) concatenated with OFFLOAD(0) (or OFFLOAD.GxxxxV00, where
xxxx is a parm you enter based on the last generation that JES
created, if it does not do the catalog update for you at CLOSE) to create
DSN(+1). Since you are creating a new file each day by merging, you do
not run the risk of destroying the old dataset by doing a DISP=MOD
append. If JES does not update the catalog correctly, then you can
always just do the merge from 2 non-GDG datasets and, if you need to
read back in, copy the current GDG from input into JES. As has been
stated, the maximum generation number is 9999 (GxxxxV00), so you will not run
out of numbers for a while and can trim old generations off the tape while
doing the dupe/copy.
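The daily merge step described above could be a plain sequential copy. A minimal JCL sketch, assuming hypothetical GDG bases HLQ.ARCHIVE and HLQ.OFFLOAD (adjust UNIT, LABEL, and DCB to your shop's standards):

```
//MERGE    EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSUT1   DD DSN=HLQ.ARCHIVE(0),DISP=OLD      yesterday's merged archive
//         DD DSN=HLQ.OFFLOAD(0),DISP=OLD      today's JES2 offload
//SYSUT2   DD DSN=HLQ.ARCHIVE(+1),DISP=(NEW,CATLG,DELETE),
//            UNIT=TAPE,DCB=*.SYSUT1
//SYSIN    DD DUMMY
```

Each run creates a fresh ARCHIVE(+1) rather than extending the previous tape, which is what avoids the DISP=MOD exposure mentioned above.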
Something I'm not clear on is whether you routinely reload this data,
or if it's just for backup. Any scheme that involves appending to tape
carries higher risk, and puts more data at risk, than one that writes
to a unique tape for each day (or other unit of work).
Since the offload does a PURGE, this has to be an archive with no actual printing.
----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN