> -----Original Message-----
> From: Matt Mahoney via AGI <[email protected]>
> ...
Yes, I'm familiar with these algorithmic information theory *specifics*. Very applicable when implemented in isolated systems...

> No, it (and Legg's generalizations) implies that a lot of software and
> hardware is required, and you can forget about shortcuts like universal
> learners sucking data off the internet. You can also forget about
> self-improving software (violates information theory), quantum computing
> (neural computation is not unitary), or consciousness (an illusion that
> evolved so you would fear death).

Whoa, you're saying a lot there. You're throwing away a lot of "engineering options" with those statements. But I think your view of consciousness, even if it is just an illusion to the agent, is still a communication protocol! It still fits!

> How much software and hardware? You were born with half of what you
> know as an adult, about 10^9 bits each. That's roughly the information

OK, Landauer's study, while a good reference point, is in serious need of new data.

> The hard-coded (nature) part of your AGI is about 300M lines of code, doable
> for a big company for $30 billion but probably not by you working alone. And
> then you still need a 10 petaflop computer to run it on, or several billion
> times that to automate all human labor globally like you promised your
> simple universal learner would do by next year.
>
> I believe AGI will happen because it's worth $1 quadrillion to automate labor
> and the technology trend is clear. We have a better way to write code than
> evolution, and we can develop more energy-efficient computers by moving
> atoms instead of electrons. It's not magic. It's engineering.

> From: Matt Mahoney
> I believe AGI will happen

You believe! Showing signs of a communication protocol with a future AGI :) an aspect of .... CONSCIOUSNESS?

Nowadays that $1 quadrillion might be in cryptocurrency units. And the 10 petaflop computer a blockchain-like P2P system.
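As a side note, Matt's $30 billion figure for 300M lines of code is consistent with the common back-of-envelope rate of roughly $100 per delivered line; a quick sanity check (the per-line rate is my assumption, not something he stated):

```python
# Back-of-envelope check of the figures quoted above.
# Assumption (mine): ~$100 per delivered line of code, a common
# industry rule of thumb for large-scale software projects.
lines_of_code = 300_000_000   # 300M lines, as quoted
cost_per_line = 100           # USD per line, assumed rate

total_cost = lines_of_code * cost_per_line
print(f"${total_cost / 1e9:.0f} billion")  # -> $30 billion
```

So the dollar estimate follows directly from the line count; the real uncertainty is in the 300M-line figure itself.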
And if a megacorp successfully builds AGI, the peers (agents) must use a signaling protocol, otherwise they don't communicate. So, can the peers be considered conscious? Conscious as in exhibiting those behaviors common across many definitions of consciousness? Not looking at the magical part, just the engineering part.

John

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T9c94dabb0436859d-Mcef74a38e1012d36f1b77fcb
Delivery options: https://agi.topicbox.com/groups/agi/subscription
