Falsehoods have always existed,
as have unverifiable beliefs.
IMHO the problem is not so much falsehood itself as its spread and encoding (i.e., scale,
speed, and remedy)

have a good summer

On 7 August 2024 at 13:54:20 UTC, Fabio Alemagna <falem...@gmail.com> wrote:
>That sounds like a perfect description of the behavior of the average politician.
>If we were to legally require LLMs to tell the truth, for what
>reason should we exempt politicians from doing the same?
>
>On Wed 7 Aug 2024 at 12:55, J.C. DE MARTIN <juancarlos.demar...@polito.it>
>wrote:
>
>> *OII | Large Language Models pose a risk to society and need tighter
>> regulation, say Oxford researchers*
>>
>> Written by
>> Sandra Wachter, Brent Mittelstadt and Chris Russell
>>
>> *Leading experts in regulation and ethics at the Oxford Internet
>> Institute, part of the University of Oxford, have identified a new type of
>> harm created by LLMs which they believe poses long-term risks to democratic
>> societies and needs to be addressed*
>>
>> Leading experts in regulation and ethics at the Oxford Internet Institute,
>> part of the University of Oxford, have identified a new type of harm
>> created by LLMs which they believe poses long-term risks to democratic
>> societies and needs to be addressed by creating a new legal duty for LLM
>> providers.
>>
>> In their new paper ‘Do large language models have a legal duty to tell the
>> truth?’, published in Royal Society Open Science, the Oxford
>> researchers set out how LLMs produce responses that are plausible, helpful
>> and confident but contain factual inaccuracies, misleading references and
>> biased information. They term this problematic phenomenon ‘careless
>> speech’, which they believe causes long-term harms to science, education and
>> society.
>>
>> continues here:
>> https://www.oii.ox.ac.uk/news-events/do-large-language-models-have-a-legal-duty-to-tell-the-truth/
>>
