On Mon, May 12, 2025 at 12:52 AM Matt Mahoney <mattmahone...@gmail.com>
wrote:

> YKY, are you doing any experiments with transformers or neural networks,
> or still pursuing an old school symbolic approach to knowledge
> representation? Have you written any code to test your ideas?
>

Of course not old-school symbolic... that's a dead end (as far as current
knowledge can tell).  I invented a *differentiable* version of symbolic
AI... but I found a big bug in it, so big that I have to abandon the
predicate-logic formulation entirely and switch to a different graph-based
representation.  I have about 24 hours left to submit the paper...

Yes, the paper comes with code too, tested on Tic-Tac-Toe.  It was during
the coding that I found the bug.
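For concreteness, here is a minimal, generic sketch of what "differentiable
symbolic logic" can look like. This is not the formulation from the paper
(which isn't shown here), just an illustration, assuming PyTorch: logical
connectives are replaced by smooth real-valued operations on truth values in
[0,1], so a rule weight can be trained by gradient descent. The function
names and the toy weighted rule are invented for this example.

    import torch

    # Truth values live in [0,1] instead of {0,1}, so logical connectives
    # become smooth functions and gradients can flow through them.
    def soft_and(a, b):          # product t-norm
        return a * b

    def soft_or(a, b):           # probabilistic sum
        return a + b - a * b

    def soft_not(a):
        return 1.0 - a

    # A toy "rule" with a learnable weight:
    # conclusion = w * (x AND y), where w is trained by gradient descent.
    w = torch.tensor(0.5, requires_grad=True)
    x = torch.tensor(0.9)        # truth value of premise 1
    y = torch.tensor(0.8)        # truth value of premise 2
    target = torch.tensor(1.0)   # desired truth value of the conclusion

    opt = torch.optim.SGD([w], lr=0.1)
    for _ in range(100):
        opt.zero_grad()
        conclusion = w * soft_and(x, y)
        loss = (conclusion - target) ** 2
        loss.backward()
        opt.step()

    print(float(w))              # w converges toward ~1.39 so that w*x*y ~ 1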

> LLMs today have a huge imbalance in computation power requirements for
> training over prediction because each session creates a private copy. You
> shouldn't need a building full of GPUs and your own power plant and cooling
> towers to run the kind of experiments you need to do. Training and
> prediction should run at the same speed. You only need a few GB of text to
> reproduce human level text prediction.
>

Like many people, I too suspect that Transformers are unnecessarily
"bloated"; my theory is that they are trying to do some symbolic
manipulations in a very roundabout, inefficient way.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tdc5c19d0f38aacd6-M93b935c5813b65f129dd2cbc
Delivery options: https://agi.topicbox.com/groups/agi/subscription
