On Sun, May 11, 2025 at 12:58 AM Mike Archbold <jazzbo...@gmail.com> wrote:

>  MeTTa has a means of calling Python objects, so conceivably you could
> mix in neural/differentiable components that way.
>
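
Right, something like that should work.  A minimal sketch, assuming the
hyperon Python package exposes MeTTa, OperationAtom and register_atom as
in recent releases (the exact API may differ across versions):

    from hyperon import MeTTa, OperationAtom
    import torch

    # A tiny differentiable component we want MeTTa to be able to call.
    net = torch.nn.Linear(4, 1)

    def nn_score(a, b, c, d):
        # Grounded number atoms arrive as plain Python floats by default.
        x = torch.tensor([a, b, c, d], dtype=torch.float32)
        return net(x).item()

    metta = MeTTa()
    metta.register_atom('nn-score', OperationAtom('nn-score', nn_score))

    # Call the neural component from MeTTa code:
    print(metta.run('!(nn-score 1.0 2.0 3.0 4.0)'))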


But the real issue is efficiency.  Deep learning is currently the most
efficient learning algorithm we have, so we should use DL as the learner
for the AGI's graph-rewriting rules (i.e., the rules governing the AGI's
internal thinking).  Currently the best candidate for this part is the
Transformer.
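
For concreteness, here is a toy sketch (PyTorch; all sizes and the rule
encoding are hypothetical) of the shape this could take: the current graph
state is encoded as a sequence of symbol IDs, and a small Transformer
encoder scores which rewrite rule to fire next.

    import torch
    import torch.nn as nn

    # Hypothetical encoding: each graph state is a short sequence of
    # symbol IDs; the Transformer learns which rewrite rule to apply.
    VOCAB, DIM, MAX_LEN, NUM_RULES = 128, 64, 16, 10

    class RuleSelector(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(VOCAB, DIM)
            layer = nn.TransformerEncoderLayer(d_model=DIM, nhead=4,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.head = nn.Linear(DIM, NUM_RULES)  # one score per rule

        def forward(self, state_tokens):
            h = self.encoder(self.embed(state_tokens))
            return self.head(h.mean(dim=1))  # pool state -> rule logits

    model = RuleSelector()
    state = torch.randint(0, VOCAB, (1, MAX_LEN))  # a toy encoded state
    print(model(state).softmax(-1))  # distribution over rules to fire

The point is just that "learn the rewrite rules" reduces to an ordinary
differentiable sequence model, so all the standard DL machinery applies.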

But we suspect the Transformer does its job in a roundabout and very
inefficient way.  So we want to improve the Transformer, i.e., rewrite its
string diagram.  This is the "functorial" part: rewriting the AGI system,
which is itself a rewriting machine.  If we know how to do the latter, we
can "gradient descend" to the most efficient AGI.
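
I don't claim this is the functorial construction, but as a rough
illustration of what "gradient descend" over rewrites could mean: relax a
discrete rewrite of the diagram (keep full self-attention, or replace it
with a cheaper sub-diagram) into a continuous mixture, DARTS-style, and
let training pick the winner.

    import torch
    import torch.nn as nn

    # Toy rewrite choice: is full self-attention in this block worth its
    # cost, or can a cheap feed-forward sub-diagram replace it?  Relax
    # the discrete choice into a learnable mixture.
    class RelaxedBlock(nn.Module):
        def __init__(self, dim=64):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, num_heads=4,
                                              batch_first=True)
            self.cheap = nn.Linear(dim, dim)           # candidate rewrite
            self.alpha = nn.Parameter(torch.zeros(2))  # architecture weights

        def forward(self, x):
            w = self.alpha.softmax(dim=0)
            a, _ = self.attn(x, x, x)
            return w[0] * a + w[1] * self.cheap(x)

    block = RelaxedBlock()
    x = torch.randn(1, 16, 64)
    y = block(x)  # train end-to-end; alpha drifts toward the better branch

After training, alpha tells you which sub-diagram the rewrite should keep.
Doing this so that rewrites compose properly is the functorial part.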

Unfortunately I don't know how to figure out the functorial details.  Maybe
someone more adept at category theory can do that...
