Fascinating article about a cult of AI rationalists that ended up killing
six people.

https://www.theguardian.com/global/ng-interactive/2025/mar/05/zizians-artificial-intelligence

For a lot of readers, this might be the first they have heard of the
singularity, effective altruism, utilitarianism, paperclip maximizers,
rationalism, MIRI, CFAR, LessWrong and the Sequences, or transhumanism.
Rationalists are described as atheists with their own versions of heaven
and hell: a godlike AI that uploads your mind to a virtual utopia of your
choosing, or eternal torment if you know of Roko's Basilisk, which holds,
per timeless decision theory, that a future AI will simulate torturing
you for not putting enough effort into creating it. The article tells the
story of a young, idealistic Eliezer Yudkowsky who wanted to build an AI
that would end poverty, hunger, and death, formed a community to help
solve the alignment problem, came to see the problem as hopeless and AI
as unstoppable, and resigned himself to humanity's defeat.

The article follows the Zizians, a cult of young, leftist, vegan,
transgender, technically literate AI doomers who broke away from the
larger San Francisco AI community over their extremist views.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T3a4621e83c2a188b-Mabb5f809374313998efcc5b7