Hi Matt,
Doesn't the "predictor" actually contain trained models as well?

At 2018-09-10 23:25:54, "Matt Mahoney via AGI" <agi@agi.topicbox.com> wrote:
>On Mon, Sep 10, 2018 at 8:10 AM <johnr...@polyplexic.com> wrote:
>> Why is there no single general compression algorithm? Same reason as
>> general intelligence: thus multi-agent, thus inter-agent communication,
>> thus protocol, and thus consciousness.
>
>Legg proved that there are no simple, general theories of prediction,
>and therefore no simple but powerful learners (or compression
>algorithms). Suppose you have a simple algorithm that can predict any
>computable infinite sequence of symbols after only a finite number of
>mistakes. Then I can create a simple sequence that your program can't
>learn. My program runs your program and outputs a different symbol at
>each step. You can read his paper here:
>https://arxiv.org/abs/cs/0606070
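
The construction fits in a few lines. Here is a minimal sketch in Python,
where predict() is just a stand-in for any claimed universal predictor
(the name and the toy rule are my assumptions, not Legg's notation). The
adversary's sequence is itself computable, yet the predictor is wrong at
every step, contradicting the finite-mistakes claim:

    def predict(history):
        # Stand-in for the claimed universal predictor. Any computable
        # rule works here; this toy version repeats the last symbol.
        return history[-1] if history else 0

    def adversary(n):
        # Diagonalization: at each step, emit the one symbol the
        # predictor did NOT predict, so it is wrong every time.
        seq = []
        for _ in range(n):
            seq.append(1 - predict(seq))
        return seq

    print(adversary(10))  # the predictor makes 10 mistakes in 10 steps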
>
>This has been the biggest pitfall of AGI projects. You make fast
>progress initially on the easy problems, thinking the solution is in
>sight, and then get stuck on the hard ones.
>
>> Doesn't Gödel Incompleteness imply "magic" is needed?
> 
> No, it (and Legg's generalizations) implies that a lot of software and
> hardware is required, and you can forget about shortcuts like universal
> learners sucking data off the internet. You can also forget about
> self-improving software (it violates information theory), quantum
> computing (neural computation is not unitary), or consciousness (an
> illusion that evolved so you would fear death).
> 
> How much software and hardware? You were born with half of what you
> know as an adult, about 10^9 bits for each half. That's roughly the
> information content of your DNA, coincidentally about the same as your
> long-term memory capacity according to Landauer (see
> https://www.cs.colorado.edu/~mozer/Teaching/syllabi/7782/readings/Landauer1986.pdf
> ). All the debate about nature vs. nurture persists because, for most
> traits, it's both.
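
Those two 10^9-bit figures check out on the back of an envelope. A rough
sketch (the waking hours and 30-year horizon are my assumptions; the
~2 bits/s retention rate is Landauer's estimate):

    # Upper bound on DNA information content: 4 symbols = 2 bits/base.
    raw_dna_bits = 3e9 * 2            # ~6e9 bits before compression

    # Landauer: people retain on the order of 1-2 bits per second.
    memory_bits = 2 * 16 * 3600 * 365 * 30   # ~1.3e9 bits over 30 years

    print(f"DNA: {raw_dna_bits:.0e} bits, memory: {memory_bits:.1e} bits")

Both land within an order of magnitude of 10^9, which is the point.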
> 
> The hard-coded (nature) part of your AGI is about 300M lines of code,
> doable for a big company for $30 billion but probably not by you
> working alone. And then you still need a 10-petaflop computer to run
> it on, or several billion times that to automate all human labor
> globally, like you promised your simple universal learner would do by
> next year.
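
The $30 billion follows from the usual rule of thumb of roughly $100 per
line of delivered code (my assumption; the post only gives the total):

    lines = 300e6          # hard-coded "nature" part, lines of code
    cost_per_line = 100    # dollars per line, common industry rule of thumb
    print(f"${lines * cost_per_line:,.0f}")   # $30,000,000,000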
> 
> Or maybe you could automate the software development. It's happened
> once, right? All it took was 10^48 DNA base copy operations on 10^37
> bases over 3.5 billion years on planet sized hardware that uses one
> billionth as much energy per operation as transistors.
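
For scale, a quick check using the numbers above (the seconds-per-year
rounding is mine):

    copy_ops = 1e48               # DNA base copy operations, total
    seconds = 3.5e9 * 3.15e7      # 3.5 billion years in seconds
    rate = copy_ops / seconds     # ~1e31 copy ops per second

    # Compare with the 10-petaflop machine (1e16 ops/s) mentioned above.
    print(f"{rate:.0e} ops/s, ~{rate / 1e16:.0e}x a 10-petaflop computer")

In other words, evolution's blind search amounted to roughly 10^15
ten-petaflop machines running continuously for 3.5 billion years.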
> 
> I believe AGI will happen because it's worth $1 quadrillion to
> automate labor and the technology trend is clear. We have better ways
> to write code than evolution, and we can develop more energy-efficient
> computers by moving atoms instead of electrons. It's not magic. It's
> engineering.
> 
> --
> -- Matt Mahoney, mattmahone...@gmail.com
