On Sat, Aug 10, 2019 at 11:14 AM Ben Goertzel <[email protected]> wrote:
>
> The point is, Matt, you can't copy my quantum predictor without me
> knowing you were copying it.   Basic principle of quantum
> cryptography.
>
> This is irrelevant to AGI though, just a sorta fun thought experiment...

Actually it is relevant. For every learner, there is a sequence of
about the same complexity that it can't learn: just run the learner
and output the opposite of whatever it predicts next. This doesn't
depend on my ability to make a copy of your program or on whether you
know that I did. The sequence still exists.
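A minimal sketch of that diagonalization argument, assuming the
learner is a deterministic function from the history to a predicted
next bit (the function names here are illustrative, not from any
particular paper):

```python
def adversarial_sequence(predictor, length):
    """Build a binary sequence the given predictor mispredicts at
    every step, by emitting the opposite of each prediction.

    The sequence is generated by this short loop plus the predictor
    itself, so its complexity is only slightly more than the
    predictor's.
    """
    history = []
    for _ in range(length):
        guess = predictor(history)       # ask the learner for the next bit
        history.append(1 - guess)        # emit the opposite bit
    return history

# Hypothetical example learner: guess that the last bit repeats
# (and 0 at the start).
def repeat_last(history):
    return history[-1] if history else 0

seq = adversarial_sequence(repeat_last, 8)
errors = sum(repeat_last(seq[:i]) != seq[i] for i in range(len(seq)))
print(seq, errors)  # the predictor errs on all 8 bits
```

The same loop works against any computable deterministic predictor
you plug in; that is the whole point of the argument.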

The paper below goes beyond my proof: powerful predictors do exist,
but they are necessarily highly complex. https://arxiv.org/abs/cs/0606070

-- 
-- Matt Mahoney, [email protected]

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T1ff21f8b11c8c9ae-M1dfe2f79d166eff464f37fe5
Delivery options: https://agi.topicbox.com/groups/agi/subscription