Matt,

What do you mean by "each session creates a private copy"?

On Mon, May 12, 2025 at 12:52 AM Matt Mahoney <mattmahone...@gmail.com>
wrote:

> YKY, are you doing any experiments with transformers or neural networks,
> or still pursuing an old school symbolic approach to knowledge
> representation? Have you written any code to test your ideas?
>
> LLMs today have a huge imbalance in compute requirements between
> training and prediction because each session creates a private copy. You
> shouldn't need a building full of GPUs, your own power plant, and cooling
> towers to run the kind of experiments you need to do. Training and
> prediction should run at the same speed. You only need a few GB of text to
> reproduce human-level text prediction.
>
> -- Matt Mahoney, mattmahone...@gmail.com
>
