I have been watching the thread about file::copy. I ran into an issue on Linux that raises a serious question: maximum file size. Keep in mind the server is running RH 7.0 (we have a 7.2 Enterprise Server also, and we pay for support), but even RH support says they can't handle files in excess of roughly 2GB. Using tar, gzip, or most any tool, I found the target file topped out at about 1.8GB instead of the expected much larger size, in our case 16GB. This was on a "/mnt" device, not a local disk, so the copy (tar in this case) went from one "/mnt" device to another. It did not matter whether I used tar, cp, mv, or a Perl program: same problem.
Everyone I talked to about this on the various "Groups" only said "rebuild the kernel with 64-bit support", but this is an Intel box (32-bit?). Have any of YOU seen this problem? I can't be the only person dealing with large files. Ideas?? How does this behave on later releases??
Thanks. -- Rich Parker