On 7 Aug 2009, at 04:56, Mike Kazantsev wrote:
...
Note that this problem can also be (easily?) solved at the software level by
pre-allocating files (like "dd if=/dev/zero of=file").

Sure, that won't make writes sequential, but it should guarantee that the
resulting file is as unfragmented as the fs allows at the time of
its creation.

In fact, rtorrent (and libtorrent) seems to have such a feature; perhaps
other clients have it somewhere as well.

http://libtorrent.rakshasa.no/ticket/460
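
(For concreteness, a minimal sketch of what that pre-allocation amounts to on a recent Linux box; the file name and size are just placeholders, and fallocate(1) from util-linux is my own addition rather than something mentioned above:

  # reserve 700 MiB up front without writing any data,
  # so the filesystem can try to hand out one large extent
  fallocate -l 700M file.iso

  # portable fallback, as quoted above: actually write the zeros
  dd if=/dev/zero of=file.iso bs=1M count=700

Either way the blocks are allocated before the torrent pieces arrive out of order, which is what keeps the resulting file relatively unfragmented.)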

I think this went out of fashion after BitTorrent clients became clever/advanced enough to download individual files.

I'm old enough to remember the days when opening a torrent would download the whole thing and nothing but the whole thing. If the torrent contained several files (mp3s, for instance), of which you wanted only one, then tough luck - the client would download random chunks of all the files until it had 100% of all of them, and the chances were that the single file you wanted would remain incomplete until the whole torrent was at least 99% finished (and there was no easy way to tell, anyway; you just had to download the whole lot).

Once BitTorrent clients added the ability to select individual files for download out of the "compilation", this became quite a popular use of them amongst the general public (who are not, as a rule, downloading Linux CDs), and it led to complaints about all the space being "wasted" by pre-allocation in this way. I gather that many BitTorrent users may be interested in only 5% of a typical complete torrent.

I don't use BitTorrent as actively as I used to, but my recollection is that NOT pre-allocating the space was a "feature" that was ADDED to the more sophisticated clients. Ideally it should indeed be an option, but it may not be ubiquitous.

Stroller.

