On Jul 28, 12:53 pm, "Dr. David Kirkby" <david.kir...@onetel.net>
wrote:
> There
> should be better ways of handling big files on unreliable connections. I think
> my splitting the file into 99 parts for you was right to solve your immediate
> problem, but it's not a long-term solution.

I just want to repeat that, in my eyes, using the metalink file would
solve all of this. Just look at the content of this file [1] (it's
plain-text XML): the entire zip file is split into 247 parts of 4 MB
each, each with its own checksum. Clients like aria2 ([2], available
for Linux & Windows) can use that information to request those parts
individually from the given list of HTTP and FTP servers (both
protocols support requesting a byte range of a big file), check each
piece, re-download any that fail, and even validate and repair a file
after the download has finished. Metalinks are also on track to be
standardized as an RFC: http://tools.ietf.org/html/rfc5854

[1] http://www.sagemath.org/mirror/win/meta/sage-vmware-4.4.alpha0.zip.metalink
[2] http://aria2.sourceforge.net/
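
To make that concrete, here is a rough Python sketch of the per-piece
check such a client performs (this is not aria2's actual code; the URL,
piece size and expected digest below are made-up placeholders): fetch
one byte range with an HTTP Range request and compare its hash against
the hash listed for that piece in the metalink file.

import hashlib
import urllib.request

def fetch_piece(url, index, piece_size):
    """Download piece `index` of the file at `url` via an HTTP Range request."""
    start = index * piece_size
    end = start + piece_size - 1  # Range header bounds are inclusive
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-%d" % (start, end)})
    with urllib.request.urlopen(req) as resp:
        # 206 Partial Content means the server honoured the range request
        if resp.status != 206:
            raise RuntimeError("server does not support range requests")
        return resp.read()

def piece_is_valid(data, expected_sha1):
    """Compare one downloaded piece against the hash from the metalink file."""
    return hashlib.sha1(data).hexdigest() == expected_sha1

# Placeholder values for illustration only.
url = "http://example.org/mirror/sage-vmware.zip"
piece_size = 4 * 1024 * 1024  # 4 MB pieces, as in the metalink above
expected = "0123456789abcdef0123456789abcdef01234567"

data = fetch_piece(url, 0, piece_size)
if not piece_is_valid(data, expected):
    data = fetch_piece(url, 0, piece_size)  # re-request just the bad piece

aria2 does this for every piece (and also switches between the listed
mirrors), so pointing it at the .metalink file is all that's needed.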

I'm not aware of a better solution to handle this problem ...

H
