On 2022-05-21 11:41, 380° wrote:

is it possible that in Italy there is practically no one, not even at
GARR, with the courage to say that the Internet SUCKS [2] and must be
rebuilt (almost) from scratch at the level of its data communication
protocols (that is, software, /therefore/ it is feasible), so that it
becomes impossible for "everything we do to be somehow indexed and
catalogued so that SOMEONE can access it"?


On this topic, let me point out an interesting book from a few years ago:

Marco Aiello (Univ. Stuttgart) - http://aiellom.it/
The Web Was Done by Amateurs

thanks for the reference, I wasn't familiar with it

https://link.springer.com/book/10.1007/978-3-319-90008-7?noAccess=true

--8<---------------cut here---------------start------------->8---

This book stems from the desire to systematize and put down on paper
essential historical facts about the Web, a system that has undoubtedly
changed our lives in just a few decades. [...] How did it evolve from
its roots to today? Which competitors, if any, did it have to beat out?
Who are the heroes behind its success?

--8<---------------cut here---------------end--------------->8---

Not having read it yet, I don't know how the topic is developed; in
particular, I don't know how "The Pacific-Ocean Internet" is analyzed,
but it seems fundamentally centered on an analysis of the Web rather
than of the Internet: Enrico, do you have any further details to share,
please?


There is this video
https://www.youtube.com/watch?v=a28TaKmbWLI&list=PLKsY-6BoMQ8InyDlWfUNzTb4uzqMr9f2N&index=3

of Marco Aiello's talk at the 2019 Digital Humanism workshop
  https://dighum.ec.tuwien.ac.at/workshop/

where we produced the related "Vienna Manifesto"
  https://dighum.ec.tuwien.ac.at/dighum-manifesto/
which I think I have already mentioned here on the list.

Ciao, Enrico




In particular, in the abstract of Chapter 1 [1] I read:

--8<---------------cut here---------------start------------->8---

In 2012, the ACM Turing Award winner Alan Kay released an interview
with Dr. Dobb’s Journal in which he stated “the Web was done by
amateurs.” In this first chapter, we look at possible motivations for
such a statement
[...]

--8<---------------cut here---------------end--------------->8---

So the book starts from something Alan Kay said in a 2012 interview
[2]... /but/ the exact quote is this:

--8<---------------cut here---------------start------------->8---

the Internet was done so well that most people think of it as a natural
resource like the Pacific Ocean, rather than something that was
man-made. When was the last time a technology with a scale like that was
so error-free? The Web, in comparison, is a joke. The Web was done by
amateurs.

--8<---------------cut here---------------end--------------->8---

"The (Pacific Ocean) Internet was so well done..." è esattamente
l'opposto rispetto a quello che è stato /ampiamente/ verificato :-O

AFAIU Alan Kay in quella intervista non entra nel merito del perché
ritiene che Internet "was well done", forse anche lui lo da per
scontato come lo si da per scontato con la natura?

On Ars Technica I found a 2013 article offering an "exegesis" of that
statement: «Parsing the difference between the Internet and the Web
according to Alan Kay - Kay thinks the Internet was built better than
the Web. Is he right?» [3]

From that article, I fully endorse this interpretation:

--8<---------------cut here---------------start------------->8---

I read this as Kay being unfamiliar enough with the lower level
protocols to assume they're significantly cleaner than the higher level
Web. The “designed by professionals” era he's talking about still had
major problems with security (spoofing is still too easy), reliability,
and performance, [...] if you look at one and see genius design, you're
not looking closely enough.

--8<---------------cut here---------------end--------------->8---

I apologize if I am repeating myself, but I keep pointing to this as a
reference: https://secushare.org/broken-internet; for each layer of the
Internet it summarizes, fairly precisely but without getting too
technical, the nature of the design and implementation problems of the
Internet's "building blocks", **starting with BGP**.

Finally, the main page of the W3C/IAB STRINT workshop held in 2014 [4]
states:

--8<---------------cut here---------------start------------->8---

A W3C/IAB workshop on Strengthening the Internet Against Pervasive
Monitoring (STRINT)

The Vancouver IETF plenary concluded that pervasive monitoring
represents an attack on the Internet, and the IETF has begun to carry
out various of the more obvious actions required to try to handle this
attack. However, there are additional much more complex questions
arising that need further consideration before any additional concrete
plans can be made. [...]

Pervasive monitoring targets protocol data that we also need for network
manageability and security. This data is captured and correlated with
other data. There is an open problem as to how to enhance protocols so
as to maintain network manageability and security but still limit data
capture and correlation. [...]

--8<---------------cut here---------------end--------------->8---
(see also
https://web.archive.org/web/20170718101836/https://down.dsg.cs.tcd.ie/misc/perpass.txt)
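To see what that "protocol data" amounts to in practice, here is a
minimal sketch (assuming the third-party scapy library; it needs root
privileges) of the metadata any on-path box can log for TCP traffic,
encrypted or not:

  from scapy.all import sniff, IP, TCP  # pip install scapy; run as root

  def log_metadata(pkt):
      """Print what every on-path observer sees: endpoints, ports, sizes.
      TLS encrypts payloads, but none of these fields."""
      if IP in pkt and TCP in pkt:
          ip, tcp = pkt[IP], pkt[TCP]
          print(f"{ip.src}:{tcp.sport} -> {ip.dst}:{tcp.dport} "
                f"({len(pkt)} bytes)")

  # Correlating these tuples over time reveals who talks to whom, when,
  # and how much -- exactly the capture-and-correlation problem above.
  sniff(filter="tcp", prn=log_metadata, count=20)

This is the tension STRINT names: the same fields are what operators
legitimately rely on for network manageability and security.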

That workshop also saw the presentation of this paper:
«The Internet is Broken: Idealistic Ideas for Building a GNU Network»
https://www.w3.org/2014/strint/papers/65.pdf

which opens by explaining:

--8<---------------cut here---------------start------------->8---

The Internet is broken, by design. Recent revelations about the abuses
by the NSA or GCHQ rarely contain stunning details about new magical
technical capabilities of these agencies, but instead merely detail that
they have both the budget and the moral framework to exploit known
vulnerabilities at scale. The problems of today’s Internet start at the
Ethernet layer, where sender’s MAC-48 addresses can be faked, and
packets can be intercepted and modified by switches. The TCP/IP layer
has the same problems, with routers learning source and destination of
all communications, as well as details about the payload (such as port
numbers). Routers can also interfere with connections, for example by
injecting RST packets. TCP also operates with the assumption that other
traffic will be “TCP-friendly”, which is like having a speed limit on
roads without enforcement. TLS, the workhorse for today’s “Internet
security”, provides “security” only if all of hundreds of certificate
authorities operate correctly (which they usually do not), and it comes
with a large set of supported cryptographic primitives, most of which
are known to be insecure.  All of the above facts are well-known and
even discussed in ordinary news venues before Edward Snowden decided to
expose some of the systemic abuses supported by these design flaws.

--8<---------------cut here---------------end--------------->8---
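The RST-injection point in particular is trivially reproducible. A
minimal sketch with scapy (the addresses, ports and sequence number
below are placeholders; a real on-path attacker reads them from the
traffic it observes):

  from scapy.all import IP, TCP, send  # pip install scapy; run as root

  # A forged reset for someone else's connection: nothing in TCP/IP
  # authenticates the sender, so the victim's stack obeys it and tears
  # the connection down. All values here are illustrative placeholders.
  rst = (IP(src="198.51.100.1", dst="198.51.100.2")
         / TCP(sport=443, dport=51000, flags="R", seq=123456789))
  send(rst, verbose=False)

The same primitive has been used for years for large-scale censorship
and "traffic management", which is the paper's point: these are known
design flaws, not secret capabilities.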

So, again: the NSA, GCHQ and the governments of the big bad states
(compile the list yourselves) do NOT use "magical technical resources"
to do what they do; they exploit **known vulnerabilities**, exploitable
by anyone who, like them, has enough resources to mount such attacks,
whether targeted at specific people or generalized and mass-scale.
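One of those known weaknesses can be inspected on your own machine: as
the quote says, TLS stands or falls with hundreds of certificate
authorities, and any single root in the default trust store can issue a
certificate for any hostname. A minimal Python sketch that just counts
them (the number varies by platform):

  import ssl

  # Load the platform's default trust store, as every HTTPS client does.
  ctx = ssl.create_default_context()
  roots = ctx.get_ca_certs()

  # Each of these roots can sign a certificate for *any* hostname, so
  # compromising or coercing a single one undermines TLS for everyone.
  print(f"{len(roots)} root CAs trusted by default")
  for ca in roots[:5]:  # print a small sample
      print(ca["subject"])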

We need more technology... and /better/ technology... and we need it fast :-(



Regards, 380°


[1] https://link.springer.com/chapter/10.1007/978-3-319-90008-7_1

[2] https://web.archive.org/web/20120712231854/https://www.drdobbs.com/architecture-and-design/interview-with-alan-kay/240003442

[3] https://arstechnica.com/information-technology/2013/06/parsing-the-difference-between-the-internet-and-the-web-according-to-alan-kay/

[4] https://www.w3.org/2014/strint/