46 half-precision (16-bit) TFLOPS is nice for neural networks, but I doubt
you could run AGI with only 16 or 24 GB of RAM. If you could upgrade each
board with 1 TB of SSD and put together 500 or so boards, then maybe you
could run a human-brain-sized neural network for about $1 million and 200
kW of electricity.
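A rough sketch of that arithmetic. The per-board cost and power figures are my assumptions, chosen only so the totals match the $1 million and 200 kW in the estimate above; the 10^14-synapse figure for a human-scale network is likewise an assumed order of magnitude:

```python
# Back-of-envelope estimate for a brain-sized network built from ~500 boards.
boards = 500
ssd_per_board_tb = 1.0        # added SSD per board, from the estimate above
cost_per_board_usd = 2_000    # assumed; implied by the ~$1M total
power_per_board_w = 400       # assumed; implied by the ~200 kW total

total_storage_tb = boards * ssd_per_board_tb            # 500 TB
total_cost_usd = boards * cost_per_board_usd            # $1,000,000
total_power_kw = boards * power_per_board_w / 1000      # 200 kW

# Assuming ~10^14 synapses, 500 TB leaves a few bytes of parameters each.
synapses = 1e14
bytes_per_synapse = total_storage_tb * 1e12 / synapses  # 5 bytes/synapse
```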

That price is almost competitive with human labor, but I doubt that Moore's
law will help much more. Manufacturers are already pushing the physical
limits of transistor size: a 7 nm feature spans only about 30 silicon
atoms. And clock speeds are no faster than they were in 2010.

On Wed, Oct 28, 2020, 6:23 PM <[email protected]> wrote:

> ...so....
> How much faster can it run GPT-2? If you can't show us, we'll never know;
> these numbers can mean something different in practice, no? We need a
> test. You can say 600 or 99999 all you want, but it doesn't tell me much.
> How much faster is it than today's computers?

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T079f29fb2fb0c97c-M63d4e97bc5c41fd50338a791
Delivery options: https://agi.topicbox.com/groups/agi/subscription
