Greetings list,

I just observed some really strange behavior with a tar file in my backup script. Using a command like tar -cf /someplace/www.tar www (run from /usr/local), it produces a 2 MB file. Under the webroot there are several files that are 500+ MB in size, so that size is suspect, though the archive only took a few seconds to create.
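One thing that can produce exactly this kind of size mismatch is sparse files: files with a large apparent size but few actual disk blocks, which some tar implementations store compactly. The following is only a sketch of how to check for that, assuming GNU tar and the coreutils truncate utility; the /tmp/sparse-demo path and file names are invented for the demo.

```shell
set -e
demo=/tmp/sparse-demo
mkdir -p "$demo/www"

# Create a 20 MB file that is entirely a "hole": large apparent size,
# almost no disk blocks actually allocated.
truncate -s 20M "$demo/www/bigfile"

ls -l "$demo/www/bigfile"   # apparent size: 20M
du -h "$demo/www/bigfile"   # blocks on disk: close to zero

# Plain GNU tar stores the full apparent size, zeros and all ...
tar -C "$demo" -cf "$demo/plain.tar" www
# ... while -S (--sparse) records only the real data plus a hole map,
# and restores the holes on extraction.
tar -C "$demo" -cSf "$demo/sparse.tar" www

ls -l "$demo"/plain.tar "$demo"/sparse.tar
```

If du reports far less than ls for the 500+ MB files, they are sparse, and a small archive that extracts back to full-sized files is correct behavior rather than data loss.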
The weird part comes when I untar www.tar: suddenly all the files are there in their full-sized glory, which leads me to believe I am only saving file pointers or something. It isn't a compression issue, either, since I am not using compression and the files themselves aren't very compressible anyway. Whatever the case, how do I use tar to create reliable backups that include large files? Thanks in advance for any pointers on this.

-Derrick

_______________________________________________
[EMAIL PROTECTED] mailing list
http://lists.freebsd.org/mailman/listinfo/freebsd-questions
To unsubscribe, send any mail to "[EMAIL PROTECTED]"