Another consideration is how many times the compressed file will be transferred. We used to host lots of documents on our mainframe to be served from a website. When the transfer load became noticeable on the performance reports, we started compressing the most commonly requested documents. The transfer load dropped dramatically, and management decided to compress all documents before loading them onto the website.

/Tom Kern

On 06/30/2019 12:45, Donald Russell wrote:
I'm not weighing the cost of compression against the transfer savings, because the files are huge (several million lines of text) and compress really well: pkzip/gzip gets well over 80% compression. And yes, after the MVS job step runs, the FTP target is in another city or even on another continent, and the FTP traffic is encrypted in flight using FTPS.

My goal is to compress the text file prior to FTP.

Can BPXBATCH programs like tar read/write from/to DD names, or fully
qualified data set names, instead of Unix-style file paths?
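For example, something like this is what I have in mind (just a sketch, untested; the data set and path names are made up, and I don't know whether tar itself accepts DD names, which is why this copies into the file system first; the z/OS cp command does accept MVS data set names in //'DSN' form):

```jcl
//COMPRESS EXEC PGM=BPXBATCH
//STDPARM  DD *
SH cp "//'DON.BIG.TEXT'" /tmp/big.txt &&
   gzip /tmp/big.txt
/*
//STDOUT   DD SYSOUT=*
//STDERR   DD SYSOUT=*
```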

Don


On Sun, Jun 30, 2019 at 09:19 Steve Thompson <[email protected]> wrote:

If this file is being sent inside your firewall, the time and CPU cycles spent compressing will likely cost more than the FTP itself. This is based on experience with MFT products. (Basically what Gadi said.)

We found in testing that compression was really only useful with small
pipes. Of course, the effectiveness of compression (and of the compression
method chosen) depends on the ratio between the number of bytes to
transfer and the available bandwidth.

Now, if this is confidential data and it is going outside your firewall,
you have to consider encryption. Compress first, then encrypt, because
encrypted data is effectively incompressible.

HTH
Steve Thompson

Sent from my iPhone: small keyboarf, fat fungrs, stupd spell manglr.
Expct mistaks


On Jun 30, 2019, at 12:05 PM, Gadi Ben-Avi <[email protected]> wrote:

If both systems are on the same physical computer, it might not be worth
it. The time and CPU cycles needed to compress and decompress might
exceed the time to transfer the uncompressed file.


-----Original Message-----
From: IBM Mainframe Discussion List <[email protected]> On
Behalf Of Donald Russell
Sent: Sunday, June 30, 2019 6:58 PM
To: [email protected]
Subject: Using bpxbatch to compress an MVS dataset

I have a batch process on z/OS 2.1 (soon to be 2.3) that creates a large
text file I want to FTP to a zLinux system.
How can I use BPXBATCH with tar or compress (or something else) to create
a smaller file I can FTP instead of the original file? I don't want to use
pkzip unless that's the only choice. Terse is no good because Linux can't
unterse it.
Is there a way to specify a DD name for the input and output files,
similar to how FTP allows put/get //DD:<dd name>?
Part two... the text in the file is EBCDIC, but Linux wants ASCII. I
don't see an option to do the conversion.
I'll have to check tr, but maybe there's a way to use more traditional
Unix syntax like
cat //dd:in | tr ... | tar -cv //dd:out
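Or perhaps something along these lines (again just a sketch with made-up names; I haven't tested any of this, and iconv looks closer to what I need than tr for the code-page conversion):

```jcl
//CONVERT  EXEC PGM=BPXBATCH
//STDPARM  DD *
SH cp "//'DON.BIG.TEXT'" /tmp/big.txt &&
   iconv -f IBM-1047 -t ISO8859-1 /tmp/big.txt |
   gzip > /tmp/big.txt.gz
/*
//STDOUT   DD SYSOUT=*
//STDERR   DD SYSOUT=*
```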

Cheers,
Don

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions, send
email to [email protected] with the message: INFO IBM-MAIN