On Mon, Dec 16, 2024 at 6:42 AM Matt Mahoney <mattmahone...@gmail.com>
wrote:

> YKY, what is the title of your thesis? What is the central idea you are
> trying to prove? What unknown knowledge do you hope to discover? Are you
> going to do any experimental work to prove your ideas, or is this purely a
> math paper?
>

The working title is:  Algebraic Logic for Deep Learning.  The goal is to build
logic-based AI on top of deep learning.  Yes, I will run experiments on TicTacToe.
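
To make the experimental setup concrete, here is a minimal TicTacToe
environment sketch in Python; the names and board representation are only my
illustration here, not the actual thesis code:

import itertools

# illustrative sketch only -- not the thesis code
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    # board is a 9-tuple of 'X', 'O', or ' '
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def legal_moves(board):
    return [i for i, cell in enumerate(board) if cell == ' ']

def play(board, move, player):
    # return a new board with the move applied
    return board[:move] + (player,) + board[move + 1:]

# example: one naive playout, always taking the first legal move
board = (' ',) * 9
for player in itertools.cycle('XO'):
    if winner(board) or not legal_moves(board):
        break
    board = play(board, legal_moves(board)[0], player)
print(winner(board))  # prints X for this left-to-right playout

A learner, symbolic or neural, could then be trained on playouts generated
this way.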


> I know you have been working on a symbolic approach to AGI for a couple of
> decades now. The reason everyone was doing this in the 1980's was because
> we had no choice. We didn't have either the hardware or the training data
> to run LLMs. It wasn't until we used neural networks that we solved
> problems like brittleness, ambiguity, and the lack of a learning algorithm
> for language models. We have known about these problems since the late
> 1950's. Given the amount of work that has gone into solving them, do you
> have any new insights?
>

Deep learning is revolutionary for one reason: it learns faster, which makes
it possible to learn from huge datasets that were out of reach for symbolic
methods.  I am remaking symbolic AI with neural networks to gain the same
learning efficiency, so this time would be different.  This approach also lets
us use the intuition we built up about logic-based systems to construct AGI.
Right now the Transformer is state of the art, but everyone seems puzzled
about how to interpret its mechanism.
