I am now generating NLP patterns (see https://tomii.me/nlp%20patterns.html; the 
language is now called "PhraseCache") automatically, from just examples and 
counterexamples.

Screenshot: https://botcompany.de/images/1102925

It works like this. You have *generators* and *collectors*. In this case, NLP 
patterns are generated and collected.

The collectors just keep the best candidate... or the best in each of several 
classes (e.g. complexity classes), or multiple best candidates that are notably 
distinct from each other (that last criterion seems harder to define precisely).
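A collector of the "best per complexity class" flavor could be sketched like this (a minimal sketch; the `Candidate` shape, the token-count complexity measure, and the scoring are my assumptions, not PhraseCache internals):

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    pattern: str   # e.g. an NLP pattern like "hello *"
    score: float   # how well it fits the examples/counterexamples

class Collector:
    """Keep the best candidate in each complexity class."""

    def __init__(self):
        self.best = {}  # complexity class -> best Candidate so far

    @staticmethod
    def complexity(c):
        # Assumption: bucket patterns by their token count.
        return len(c.pattern.split())

    def offer(self, c):
        """Accept a candidate only if it beats the incumbent of its class."""
        key = self.complexity(c)
        incumbent = self.best.get(key)
        if incumbent is None or c.score > incumbent.score:
            self.best[key] = c
            return True   # it stuck
        return False      # discarded
```

Generators just call offer() on everything they produce; only the per-class winners survive.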

The generators either create fresh patterns from parts of the input ("seeding") 
or take a known good pattern and try to improve it, e.g. by combining it with 
some other pattern. They then feed these improvements to the collectors and see 
what sticks.
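Two such generator operations might look like this (a sketch under my own assumptions: space-separated tokens and a "*" wildcard; the real operators are surely richer):

```python
def seed(example):
    """Create fresh patterns from parts of an input sentence:
    the literal sentence plus copies with one token wildcarded."""
    tokens = example.split()
    yield example
    for i in range(len(tokens)):
        yield " ".join(tokens[:i] + ["*"] + tokens[i + 1:])

def combine(p1, p2):
    """Improve by combination: merge two same-length patterns,
    generalizing every position where they disagree to '*'."""
    t1, t2 = p1.split(), p2.split()
    if len(t1) != len(t2):
        return None  # this simple merge needs equal lengths
    return " ".join(a if a == b else "*" for a, b in zip(t1, t2))
```

For instance, combining "turn on the light" with "turn off the light" yields "turn * the light", which then goes to the collectors to see if it sticks.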

You probably want large collectors during the computation and then shrink them 
to a handful near the end. That's generally how a thought process works... you 
start with a relatively simple question, then make up a huge bubble of ideas in 
your head, and finally try to distill a relatively simple answer, is it not?
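That shrinking could be as simple as annealing the collector's capacity over the run (a sketch; the linear schedule and the concrete numbers are arbitrary assumptions on my part):

```python
def capacity(step, total_steps, start=500, end=5):
    """Collector capacity: large early in the run, a handful at the end."""
    frac = step / max(1, total_steps - 1)
    return round(start + frac * (end - start))

def prune(candidates, width):
    """Shrink a pool of (pattern, score) pairs to its `width` best."""
    return sorted(candidates, key=lambda c: c[1], reverse=True)[:width]
```

Each round, the pool gets pruned to capacity(step, total_steps) entries, so the "bubble of ideas" collapses toward a few distilled answers by the final step.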

The interesting thing is that this actually seems to work, and it's not *that* 
complicated internally either.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Te1993278993de052-M706aeb999254eb403963a9c5
Delivery options: https://agi.topicbox.com/groups/agi/subscription
