Hi Ludo,

Just one interjection: wow! :-)
Ludovic Courtès <l...@gnu.org> writes:

> Hello Guix!
>
> Every time a package changes, we end up downloading complete substitutes
> for itself and for all its dependents, even though we have the intuition
> that a large fraction of the files in those store items are unchanged.

It's great that you're looking into this kind of optimization, as it also
closes the gap between a binary-only distribution and the substitutes
system.

> [Awesome data collection omitted for brevity]
>
> Thoughts? :-)

Probably you're already aware of it, but I want to mention that Tridgell's
thesis[1] contains a very neat approach to this problem. A naive prototype
would be to copy the latest available nar of the package on the client
side and use it as the destination of an rsync copy. Either the protocol
used by the rsync application, or a protocol based on the same ideas,
could be implemented over the HTTP layer; cooperating client and server
implementations would be needed, though.

Another idea that might fit well into that kind of protocol---with a
deeper impact on the design, and probably a higher runtime cost---would be
to "upgrade" the deduplication process towards a content-addressed file
system, as git does[2]. This way, a description of the nar contents
(sizes, hashes) could trigger the retrieval of only those files not
already present in the current store.

Nonetheless, these are only thoughts; I'll ping back if and when I have
something more tangible. ;-)

Happy hacking!
Miguel

[1] https://rsync.samba.org/~tridge/phd_thesis.pdf
[2] https://git-scm.com/book/en/v2/Git-Internals-Git-Objects
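P.S. To make the rsync idea a bit more concrete, here is a toy Python
sketch of the two-part weak rolling checksum described in the thesis. The
names, the modulus, and the block size are illustrative only, not rsync's
actual implementation; the point is that the checksum can slide one byte
at a time in O(1), which is what makes scanning the old nar for matching
blocks cheap.

```python
# Toy sketch of a weak rolling checksum in the spirit of Tridgell's
# thesis.  Illustrative only -- not rsync's actual code or constants.

M = 1 << 16  # checksum modulus

def weak_checksum(block: bytes) -> tuple[int, int]:
    """Compute the two-part weak checksum (a, b) of a block from scratch."""
    a = sum(block) % M
    b = sum((len(block) - i) * byte for i, byte in enumerate(block)) % M
    return a, b

def roll(a: int, b: int, out: int, inp: int, blocklen: int) -> tuple[int, int]:
    """Slide the window one byte right: drop byte `out`, add byte `inp`.

    Runs in O(1), independent of the block length."""
    a = (a - out + inp) % M
    b = (b - blocklen * out + a) % M
    return a, b

# Sliding the checksum across a buffer agrees with recomputing it.
data = b"The quick brown fox jumps over the lazy dog"
n = 8
a, b = weak_checksum(data[:n])
for i in range(1, len(data) - n + 1):
    a, b = roll(a, b, data[i - 1], data[i + n - 1], n)
    assert (a, b) == weak_checksum(data[i:i + n])
```

In the full scheme the server (or, here, the client holding the old nar)
would index blocks by this weak checksum, confirm candidate matches with a
strong hash, and transfer only the literal bytes in between.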
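P.P.S. And a minimal sketch of the content-addressed idea, assuming a
hypothetical nar "manifest" that maps file paths to git-style object ids;
the manifest format and the `missing_objects` helper are my invention for
illustration, not anything Guix or git provides:

```python
import hashlib

def object_id(data: bytes) -> str:
    """Git-style blob id: SHA-1 of a "blob <size>\\0" header plus content."""
    header = f"blob {len(data)}\0".encode()
    return hashlib.sha1(header + data).hexdigest()

def missing_objects(manifest: dict[str, str], store: set[str]) -> list[str]:
    """Given a manifest mapping paths to object ids, return the paths
    whose contents are not already in the local store (hypothetical
    helper; the real store would index deduplicated files this way)."""
    return [path for path, oid in manifest.items() if oid not in store]

# The client would fetch the (small) manifest, then request only the
# objects it lacks instead of the whole nar.
manifest = {"bin/foo": object_id(b"foo contents"),
            "lib/bar": object_id(b"bar contents")}
store = {object_id(b"foo contents")}  # already deduplicated locally
assert missing_objects(manifest, store) == ["lib/bar"]
```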