*Large Language Models pose a risk to society and need tighter
regulation, say Oxford researchers*
Written by
Sandra Wachter, Brent Mittelstadt and Chris Russell
/Leading experts in regulation and ethics at the Oxford Internet
Institute, part of the University of Oxford, have identified a new type
of harm created by LLMs which they believe poses long-term risks to
democratic societies and needs to be addressed by creating a new legal
duty for LLM providers./
In their new paper, ‘Do large language models have a legal duty to tell
the truth?’, published in Royal Society Open Science, the Oxford
researchers set out how LLMs produce responses that are plausible,
helpful and confident but contain factual inaccuracies, misleading
references and biased information. They term this problematic
phenomenon ‘careless speech’, which they believe causes long-term
harms to science, education and society.
Continue reading here:
https://www.oii.ox.ac.uk/news-events/do-large-language-models-have-a-legal-duty-to-tell-the-truth/