Good morning,

"nudge theory" is undergoing a major reassessment, since it clearly does not
work.  While it is easy to nudge people into not changing their sheets or
towels at a hotel, as well documented by Cialdini, changing the behavior or
the values not just of one person but of an entire segment of the population
does not work.
https://www.ft.com/content/a23e808b-e293-4cc0-b077-9168cff135e4


Salespeople, politicians, and whoever else may very much like being able to
claim that they can persuade "the people", or worse still that they must
protect them from the manipulations of the devil of the day.  It does not
work: when multiple sources are available, people are able to understand and
decide for themselves.


cheers

rob

________________________________
From: nexa <nexa-boun...@server-nexa.polito.it> on behalf of 380°
<g...@biscuolo.net>
Sent: Wednesday, May 25, 2022, 10:46:41
To: Stefano Quintarelli; Nexa
Subject: Re: [nexa] "The Liberal Obsession With ‘Disinformation’ Is Not Helping"

Good morning Stefano and nexians,

I apologize if, as usual, I do exactly the opposite of what you did,
Stefano: I am /verbose/

executive summary: there is a concrete possibility that the theories of
behavioral manipulation (aka deterministic propaganda) are pseudoscience
[1] [4], much like neuro-linguistic programming;  last but not least,
these ideas are descendants of theories from the 1920s

...what is certain is that those who use these theories to do business are
enormously successful right now; indeed AdTech has spawned DisinfoTech :-D

Stefano Quintarelli <stef...@quintarelli.it> writes:

> mah

Sorry Stefano, but I did not understand your comment: "mah" what?

> > https://gizmodo.com/russian-botnet-spam-social-media-report-nisos-fake-news-1848956529
> >
> > This Russian Botnet Is Capable of Manipulating Social Media Trends on a
> > 'Massive Scale,' Report Claims

Surely I am misreading your "mah"... are you countering the arguments of
the article J.C. posted by citing a report that in all likelihood comes
from what the aforementioned Joe Bernstein [1], in September 2021, called
the “antidisinformation industry” (aka "Big Disinfo")?

--8<---------------cut here---------------start------------->8---

As Joe Bernstein documented in Harper’s last year, the
“antidisinformation industry” has attracted massive investment from
wealthy Democratic donors, the tech industry, and cash-rich
foundations. Hundreds of millions of disinfo dollars are sloshing around
the nonprofit world, funding institutes at universities and extravagant
conventions across the world.

--8<---------------cut here---------------end--------------->8---

quoting from the article [1]:

--8<---------------cut here---------------start------------->8---

But she is most worried about disinformation, because it seems so new,
and because so new, so isolable, and because so isolable, so fixable. It
has something to do, she knows, with the algorithm. [...]

Just as, say, smoking causes cancer, consuming bad information must
cause changes in belief or behavior that are bad, by some
standard. [...]

Behold, the platforms and their most prominent critics both proclaim:
hundreds of millions of Americans in an endless grid, ready for
manipulation, ready for activation. Want to change an output—say, an
insurrection, or a culture of vaccine skepticism? Change your
input. Want to solve the “crisis of faith in key institutions” and the
“loss of faith in evidence-based reality”? Adopt a better
content-moderation policy.  The fix, you see, has something to do with
the algorithm. [...]

Luckily for the aspiring Cold War propagandist, the American ad industry
had polished up a pitch. It had spent the first half of the century
trying to substantiate its worth through association with the burgeoning
fields of scientific management and laboratory psychology. Cultivating
behavioral scientists and appropriating their jargon, writes the
economist Zoe Sherman, allowed ad sellers to offer “a veneer of
scientific certainty” to the art of persuasion: [...]

The most comprehensive survey of the field to date, a 2018 scientific
literature review titled “Social Media, Political Polarization, and
Political Disinformation,” reveals some gobsmacking deficits. The
authors fault disinformation research for failing to explain why
opinions change; lacking solid data on the prevalence and reach of
disinformation; and declining to establish common definitions for the
most important terms in the field, including disinformation,
misinformation, online propaganda, hyperpartisan news, fake news,
clickbait, rumors, and conspiracy theories. The sense prevails that no
two people who research disinformation are talking about quite the same
thing. [...]

These stories of persuasion are, like the story of online advertising,
plagued by the difficulty of disentangling correlation from
causation. Is social media creating new types of people, or simply
revealing long-obscured types of people to a segment of the public
unaccustomed to seeing them? The latter possibility has embarrassing
implications for the media and academia alike. [...]

Still, Big Disinfo can barely contain its desire to hand the power of
disseminating knowledge back to a set of “objective” gatekeepers. [...]

The vision of a godlike scientist bestriding the media on behalf of the
U.S. government is almost a century old. After the First World War, the
academic study of propaganda was explicitly progressive and reformist,
seeking to expose the role of powerful interests in shaping the
news. Then, in the late 1930s, the Rockefeller Foundation began
sponsoring evangelists of a new discipline called communication
research. [...]

As a matter of policy, it’s much easier to focus on an adjustable
algorithm than entrenched social conditions. [...]

It is a model of cause and effect in which the information circulated by
a few corporations has the total power to justify the beliefs and
behaviors of the demos. In a way, this world is a kind of comfort.  Easy
to explain, easy to tweak, and easy to sell, it is a worthy successor to
the unified vision of American life produced by twentieth-century
television.

--8<---------------cut here---------------end--------------->8---

So is this really, first and foremost, a problem of Russian bots (botnets?)
and of the /algorithm/, one that we can techno-fix?

... or, worse, do we solve it with monstrosities like Homeland Security's
"Disinformation Governance Board" [2], the "Commission on Information
Disorder" to restore the faith [3], or the "Project for Good Information" [1]?

Do the people who invent these names realize they make Orwell look short on
creativity?!?


Regards, 380°


[...]


[1] I highly recommend reading:
https://harpers.org/archive/2021/09/bad-news-selling-the-story-of-disinformation/
«Bad News - Selling the story of disinformation»

[2] is Nina Jankowicz really the problem?  Do we just put someone less
divisive in place and everything is fine?

[3] from [1]: «Among the commission's goals is to determine “how
government, private industry, and civil society can work together... to
engage disaffected populations who have lost faith in evidence-based
reality,” faith being a well-known prerequisite for evidence-based
reality.»

[4] it is no coincidence that psychology is "the mother" of the whole mess
now surfacing around the "science crisis", whose core is (NOT by
chance) perfectly captured in a single sentence from [1]:
«Mistaking correlation for causation has given ad buyers (and all
behaviorist psychologists, ed.) a wildly exaggerated sense of their
ability to persuade.»

--
380° (Giovanni Biscuolo public alter ego)

«Incompetent as we are,
 we have no standing to suggest anything»

Disinformation flourishes because many people care deeply about injustice
but very few check the facts.  Ask me about <https://stallmansupport.org>.
_______________________________________________
nexa mailing list
nexa@server-nexa.polito.it
https://server-nexa.polito.it/cgi-bin/mailman/listinfo/nexa