On Mon, Jul 06, 2020 at 11:15:01PM +0000, таракан wrote:
> >In the strict P2P concept, your friends or contacts may well get
> >annoyed if you keep hitting their phone with requests for the same
> >image for example.
> 
> Caching and decentralized content are 2 different things I guess
> 
> Well this is a mailing list of people who, more or less, are using technology 
> and/or cryptographic science to ensure privacy for "anyone"...
> 
> In such a task, caching appears futile, unnecessary and an obvious risk. 
> Therefore, better not to even consider it; that solves the issue.
> 
> I develop an embedded system for a secure communication station.
> I want everything to stay transient, to be erased as soon and as fast as 
> possible. I don't want it to be possible to scan the memory to intercept 
> any variable, deciphered data, etc...
> 
> In that communication station, communications can eventually be extremely 
> scarce, and the internet very slow in extreme environments, with all sorts 
> of modems involved.
> 
> Since this is a secure communication station, the last thing I want is to 
> cache anything, simply because that is a secure system. The received data are 
> stored in encrypted external components.
> 
> Etc...


Sounds like a relatively high-security setup - this would likely be beyond the 
average user.

This distributed/decentral content is interesting - I've been thinking of 
"cache" as the local node's "contribution" to the distributed P2P content store.

In this context, some devices might be considered bridge or entry nodes, as Tor 
calls them, but once a user starts viewing content, I had not imagined that 
the device they use for reading and responding would _not_ cache files, at a 
minimum for a session.

In this early "explore the space" trial, I'm looking at the workings of such an 
app, so I can hopefully get a better understanding of the overall network and 
user needs.

Also, despite the monumental first mover advantage that the incumbent 
_centralised_ social media giants have over everything, I still hope that one 
day something fully decentral might by some fluke of providence be at least 
marginally successful enough to provide an "off the grid" alternative.

So in this decentral context, all "servers" are legacy or at least they "cannot 
be relied upon".

So the next question is how we design for this - "servers" must be as 
trivial as a hashtag, and this has implications.

Each node an end user desires is a use case - there is evident value, e.g., in 
separating one's Internet/overlay entry node from one's browsing node.

But I think that the use case of a cache-free browsing node needs more 
consideration:

How do we avoid a node "hitting up their peer nodes for the same media object 
repeatedly"?
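
One session-scoped answer is to hold fetched objects in RAM only, so a node 
never asks a peer twice for the same object within a session, yet writes 
nothing to disk. A minimal sketch (the class name, the `fetch` callable and 
the eviction policy are all my own illustration, not an existing API):

```python
import time

class SessionCache:
    """RAM-only cache for one session; nothing ever touches disk."""

    def __init__(self, max_items=256):
        self.max_items = max_items
        self._store = {}  # object_id -> (last_used, bytes)

    def get(self, object_id, fetch):
        """Return cached bytes, or fetch once from a peer and remember them."""
        if object_id in self._store:
            # refresh the last-used timestamp, no peer traffic
            self._store[object_id] = (time.monotonic(), self._store[object_id][1])
            return self._store[object_id][1]
        data = fetch(object_id)
        if len(self._store) >= self.max_items:
            # evict the least-recently-used entry to bound memory
            oldest = min(self._store, key=lambda k: self._store[k][0])
            del self._store[oldest]
        self._store[object_id] = (time.monotonic(), data)
        return data

    def wipe(self):
        """Drop everything at end of session (transience requirement)."""
        self._store.clear()
```

Calling `wipe()` on logout keeps the "everything stays transient" property 
while still sparing peers the repeated requests within a session.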

In P2P, nodes must collectively provide the content store - how do they do this?
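
One common answer is a distributed-hash-table style assignment: hash each 
object id together with each node id, and make the highest-scoring nodes 
responsible for storing that object. A toy rendezvous-hashing sketch 
(node ids and the replica count are hypothetical, just to show the shape):

```python
import hashlib

def owners(object_id, node_ids, replicas=2):
    """Deterministically pick the `replicas` nodes responsible for an object."""
    def score(node_id):
        # every node computes the same scores, so no coordinator is needed
        h = hashlib.sha256((node_id + object_id).encode()).hexdigest()
        return int(h, 16)
    return sorted(node_ids, key=score, reverse=True)[:replicas]
```

Because every node computes the same ranking locally, the collective store 
needs no central index - any node can work out who should hold what.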

Perhaps the "local distributed partial store", if a node has one, is never 
accessed by the local node?  Possible I guess - had not thought of this, and 
there's no reason this could not work if it's useful, but it might bring 
pressure to simplify "offline replication into a secure vault/encrypted store, 
for offline browsing of your past conversations".

And once you have an offline vault sink, say a mounted Veracrypt volume, then 
why not also use this to store avatar icons?

Once again, we're back to a form of local cache, albeit a little more complex 
for the user to set up - in that case, the social media client could be split 
into separate apps if it makes users feel safer (maybe there's already a 
Veracrypt for Android?), or if you have a local "browsing" computer, perhaps a 
"favoured peer node" (on your workstation) could serve all cache requests when 
that node is contactable, so that your "browsing" node never caches?
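
That "favoured peer" idea is just a lookup order: the cache-free browsing 
node always asks its trusted workstation peer first, and falls back to the 
wider swarm only when that peer is unreachable. A sketch under those 
assumptions (`favoured_fetch`, `swarm_fetch` and the exception are 
hypothetical names, not an existing protocol):

```python
class PeerUnreachable(Exception):
    """Raised when the favoured peer cannot be contacted."""
    pass

def fetch_object(object_id, favoured_fetch, swarm_fetch):
    """Resolve an object with no local caching on the browsing node."""
    try:
        # the trusted workstation serves from its own cache
        return favoured_fetch(object_id)
    except PeerUnreachable:
        # fall back to the wider P2P content store
        return swarm_fetch(object_id)
```

The browsing node holds nothing between requests; all caching pressure lands 
on the workstation, which the user already trusts and controls.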

Once a use case is properly understood, it can be properly provided for...


(BTW, in my initial concept tests, I'm implementing a social media client test 
UI, in order to understand the issues.  Networking could run over Tor, I2P, 
clear net, or some future overlay.)
