Hello Marna,

I use:-

Rules for re-sizing datasets:-

    Target volume datasets to be allocated in CYLINDERS
    Distribution volume datasets to be allocated in TRACKS

    LINKLIST datasets:-
      o Primary allocation   = 3 times Used space
      o Secondary allocation = 0
      o PDS Directory Blocks = 1.5 times used directory blocks

    SPECIAL datasets:-
      o Primary allocation   = 1.4 times Used space
      o Secondary allocation = 0
      o PDS Directory Blocks = 1.5 times used directory blocks

    PROCLIB datasets:-
      o Primary allocation   = 3 times Used space
      o Secondary allocation = 0
      o PDS Directory Blocks = 2.0 times used directory blocks

    OTHER datasets:-
      o Primary allocation   = 1.2 times Used space
      o Secondary allocation = 0.5 times Used space
      o PDS Directory Blocks = 2.0 times used directory blocks

    All calculations to be performed in TRACKS, rounded up.
    Conversion TRACK to CYLINDER to be rounded up.
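For what it's worth, the rules above boil down to a small calculation. Here is a minimal sketch in Python of how I apply them; the function name and its parameters are my own illustration, and it assumes 3390 geometry (15 tracks per cylinder):

```python
import math

# Multipliers per dataset category, taken from the rules above:
# (primary factor, secondary factor, directory-block factor)
FACTORS = {
    "LINKLIST": (3.0, 0.0, 1.5),
    "SPECIAL":  (1.4, 0.0, 1.5),
    "PROCLIB":  (3.0, 0.0, 2.0),
    "OTHER":    (1.2, 0.5, 2.0),
}

TRACKS_PER_CYL = 15  # 3390 geometry


def resize(category, used_tracks, used_dirblks, on_target_volume):
    """Return (primary, secondary, dirblks, unit) for a dataset.

    All arithmetic is done in TRACKS and rounded up; target volume
    datasets are then converted to CYLINDERS, again rounding up.
    Distribution volume datasets stay in TRACKS.
    """
    p_f, s_f, d_f = FACTORS[category]
    primary = math.ceil(used_tracks * p_f)
    secondary = math.ceil(used_tracks * s_f)
    dirblks = math.ceil(used_dirblks * d_f)
    if on_target_volume:
        primary = math.ceil(primary / TRACKS_PER_CYL)
        secondary = math.ceil(secondary / TRACKS_PER_CYL)
        unit = "CYL"
    else:
        unit = "TRK"
    return primary, secondary, dirblks, unit
```

So a LINKLIST dataset using 100 tracks and 40 directory blocks on a target volume comes out as 20 cylinders primary, 0 secondary, 60 directory blocks.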


Regards
Bruce



On Tue, 20 Jul 2021 12:47:54 -0500, Marna WALLE <[email protected]> wrote:

<snip>
>
>Now...I would like to look at the data set size problem in a larger context - 
>in order to understand where to solve this problem.  More than ever, we have 
>been shipping Continuous Delivery PTFs.  Many of these PTFs are quite large, 
>and occur over the life of a release.  This can put quite a lot of pressure on 
>the size of the target and DLIB data sets being able to accommodate these 
>updates for every service install episode.  I am wondering, if it might be of 
>better use to have the capability of accommodating the need for more space in 
>a more ongoing manner?  Meaning, installing a release for a first time - even 
>with enlarging the data sets with some predictive percentage (50%, 100%, 
>200%?) - still doesn't completely help with running out of space in some data 
>sets or even volumes continually, and could result in some data sets being 
>overly and unnecessarily large.  Would it be better if z/OS itself was able to 
>assist better when the problem occurred in a targeted and timely fashion?  Do 
>you feel that if z/OSMF Software Management provided this ability to one-time 
>increase the size of allocated target and DLIBs, that would conclusively solve 
>your space problems for these data sets?
>
>-Marna WALLE
>z/OS System Install and Upgrade

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN