At 8/24/2003 19:07 -0400, you wrote:
Sorry, I'm joining this thread way after the fact.  The only thing I'll
mention is that I *have* seen certain applications zero out a very large
file size in preparation for filling up that space with a series of
chunks.  BitTorrent is the *perfect* example of that.  Say you start to
download a 500MB ISO image.  The client breaks it into chunks so it can
perform parallel downloads from multiple peers.  Even though the amount
actually downloaded at any one time may be only a fraction of that size,
the file is reserved at its maximum size.  I don't know how it does it,
but it does.  :)
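For what it's worth, the usual way to get that reservation on a Unix filesystem is a sparse file. A minimal sketch (my guess at the mechanism, not BitTorrent's actual code; the file name and size are made up):

```shell
# Seek past the intended end of the file and write a single byte.
# On most Unix filesystems this creates a sparse file: "ls -l" shows
# the full logical size, but almost no disk blocks are allocated.
dd if=/dev/zero of=reserved.bin bs=1 count=1 seek=$((500*1024*1024 - 1)) 2>/dev/null
ls -l reserved.bin   # size column shows the full ~500MB
du -k reserved.bin   # only a few KB actually allocated on disk
```

The chunks can then be written into the "holes" as they arrive, without the size ever changing.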

Does this sound like a possibility?

Not at all... these are standard WAV files, originally ripped from the (original, purchased) music CDs. They average 45-50MB each, but when I moved them to a second hard drive, some of them started being reported by "ls -l" as roughly 20 times larger (900MB to 1.2GB). Oddly, "ls -sh" reports their sizes correctly, as does "du -h".
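In case it helps anyone reproduce this: GNU stat shows both numbers at once, so the mismatch is easy to quantify. %s is the logical size that "ls -l" prints, while the allocated blocks are what "ls -s" and "du" report (the filename here is just illustrative):

```shell
# Compare a file's logical (apparent) size against the space actually
# allocated on disk. GNU stat syntax: %s = byte size, %b = number of
# allocated blocks, %B = bytes per block as reported by %b.
f=track01.wav
apparent=$(stat -c %s "$f")
allocated=$(( $(stat -c %b "$f") * $(stat -c %B "$f") ))
echo "apparent=$apparent bytes, allocated=$allocated bytes"
```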


I'm trying to figure out what caused this wrong listing and how to fix it, since copying one of these files does take the whole 1.2GB. Also, I share this folder with Windows, which reports the total usage as 1.6TB instead of the actual 63GB (roughly 25x).
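Since these are WAV files, one possible repair route: a RIFF/WAVE file records its own chunk size in bytes 4-7 of the header (little-endian), so if the audio data is intact and only the filesystem's size field is wrong, the file can be trimmed back to header size + 8. This is only a sketch under those assumptions (GNU od/head, little-endian machine, standard RIFF layout, illustrative filename), and it writes a trimmed copy rather than touching the original:

```shell
# Read the RIFF chunk size from bytes 4-7 of the WAV header.
# od interprets the 4 bytes in host byte order, which matches RIFF's
# little-endian layout on x86 machines.
f=song.wav
riff=$(od -A n -t u4 -j 4 -N 4 "$f" | tr -d ' ')
# The true file size should be the chunk size plus the 8 header bytes.
head -c $((riff + 8)) "$f" > fixed.wav
ls -l fixed.wav   # trimmed copy at the size the header claims
```

Verify the trimmed copy plays correctly before replacing anything, of course.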


--
Rodolfo J. Paiz
[EMAIL PROTECTED]


--
redhat-list mailing list
unsubscribe mailto:[EMAIL PROTECTED]
https://www.redhat.com/mailman/listinfo/redhat-list
