On Mon, Nov 25, 2024 at 1:44 PM Matt Mahoney <mattmahone...@gmail.com>
wrote:

>
> https://asteriskmag.com/issues/08/looking-back-at-the-future-of-humanity-institute
>
> The article doesn't say so, but I think the reason the Future of
> Humanity Institute was shut down was that its predictions of AI doom
> were plain wrong.
>
...
At the start of the pandemic, I approached the FHI with the Algorithmic
Information Criterion as the basis for a prize for compressing all data
relevant to the pandemic. The response?

"Dynamical systems are so complex that you can't project into the future
from the present."

Morons.
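
For the record, the criterion they waved away is simple to state: among
candidate models, prefer the one that minimizes the length of the model's
description plus the length of the data once the model's predictions are
subtracted out. Here is a minimal sketch in Python, with zlib standing in
for an ideal compressor and a toy case-count series of my own invention
(the model sources and numbers are illustrative assumptions, not anything
FHI ever saw):

import zlib

def aic(model_source: str, residuals: bytes) -> int:
    # Approximate Algorithmic Information Criterion:
    # bytes needed to describe the model, plus bytes of data
    # left unexplained after the model's predictions are removed.
    return (len(zlib.compress(model_source.encode()))
            + len(zlib.compress(residuals)))

# Toy daily case counts with rough 10% exponential growth
# (illustrative only, not real pandemic data).
data = [int(100 * 1.1 ** t) for t in range(60)]

# Model A: persistence (tomorrow equals today).
pred_a = [data[0]] + data[:-1]
src_a = "lambda hist: hist[-1]"

# Model B: constant 10% daily growth.
pred_b = [data[0]] + [round(x * 1.1) for x in data[:-1]]
src_b = "lambda hist: round(hist[-1] * 1.1)"

def residual_bytes(pred):
    # Prediction errors as text; a good model leaves small,
    # highly compressible residuals.
    return ",".join(str(d - p) for d, p in zip(data, pred)).encode()

print("persistence:", aic(src_a, residual_bytes(pred_a)))
print("10% growth :", aic(src_b, residual_bytes(pred_b)))

The growth model scores lower because it leaves almost nothing in the
residual stream, which is exactly the sense in which compression scores a
dynamical model's projections against reality. No claim about the future
beyond what the data supports; the better predictor simply compresses
better.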
