On Sun, Oct 13, 2019, 10:09 AM <[email protected]> wrote:

> Isn't that massively inefficient? It'd take 100 times more
> storage/computation to do the same thing as a weighted net no?
>

The neural models I use in the top-ranked text compressors use a lot less
than 12-24 petaflops and a petabyte of RAM. But their language modeling is
rather rudimentary, nowhere near AGI. I would be happy for you to prove my
estimate wrong.

And one more thing: that's one human brain. To automate all labor, you need
several billion times that. Current technology uses about 1 megawatt per
petaflop. Maybe neuromorphic computing could get it down to 100 kW per
brain. Maybe economies of specialization could reduce it to 1 kW per brain,
which multiplied by several billion brains is still roughly 50% of global
energy production. But shrinking transistors alone won't do it. If we can't
do the optimization, it's going to take nanotechnology: moving atoms
instead of electrons. The brain uses 20 watts. It can be done.
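A back-of-envelope sketch of the arithmetic above, with assumed round
numbers (one brain ~ 1 petaflop, "several billion" taken as 8 billion,
global primary energy production taken as roughly 18 TW; none of these
figures are precise, they only show the orders of magnitude):

```python
# Rough energy estimates for billions of brain-equivalents at
# different efficiency levels. All constants are assumptions.
BRAINS = 8e9            # "several billion" brain-equivalents
GLOBAL_ENERGY_W = 18e12 # ~18 TW global primary energy (rough)

scenarios = {
    "today, ~1 MW/petaflop": 1e6,
    "neuromorphic, 100 kW/brain": 1e5,
    "specialized, 1 kW/brain": 1e3,
    "biological brain, 20 W": 20.0,
}

for label, watts_per_brain in scenarios.items():
    total_w = watts_per_brain * BRAINS
    share = total_w / GLOBAL_ENERGY_W
    print(f"{label}: {total_w/1e12:,.2f} TW total "
          f"({share:.0%} of global energy)")
```

At 1 kW per brain this comes out near 8 TW, i.e. on the order of half
of global energy production, which is the point being made; at 20 W
per brain (biological efficiency) it drops to well under 1%.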


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Td4a5dff7d017676c-M1ad3ab5408288b2fa6edeff4
Delivery options: https://agi.topicbox.com/groups/agi/subscription
