We maintain a FOSS file system [1] for users of our platform. One of
the challenges, as we move functionality into the client, is
dealing with our copy-on-write volume cloning. To get implicit
copy-on-write, we reference SHA-512 hashes of file content: same
content, same copy.
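
The content-addressing idea above can be sketched roughly as follows
(a minimal illustration, not fusedav's actual implementation; all names
here are made up):

```python
import hashlib

def content_address(data: bytes) -> str:
    """Return the SHA-512 digest of file content, used as its address."""
    return hashlib.sha512(data).hexdigest()

# A toy content-addressed store: identical content collapses to one
# entry, which is what gives cloned volumes implicit copy-on-write.
store: dict[str, bytes] = {}

def put(data: bytes) -> str:
    key = content_address(data)
    store.setdefault(key, data)  # only stored once per unique content
    return key

a = put(b"hello world")
b = put(b"hello world")  # same content, same copy
assert a == b and len(store) == 1
```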

For a next-generation design, the DHT support in GNUnet would fit our
use case well. But I'm concerned about guaranteeing that a file stays
available within the sort of F2F networks we'd be setting up. Is there
any way to guarantee that, aside from republishing constantly? Is
there interest in supporting secondary backends, like S3 (with or
without encryption of the content), in order to provide arbitrarily
large storage?

[1] https://github.com/pantheon-systems/fusedav

_______________________________________________
GNUnet-developers mailing list
GNUnet-developers@gnu.org
https://lists.gnu.org/mailman/listinfo/gnunet-developers