> -Original Message-
> From: Russ Hurlbut via AGI
>
> 1. Where do you lean regarding the measure of intelligence? - more towards
> that of Hutter (the ability to predict the future) or towards
> Wissner-Gross/Freer
> (causal entropy - sort of a proxy for future opportunities; ref
> https:
> -Original Message-
> From: Matt Mahoney via AGI
>...
Yes, I'm familiar with these algorithmic information theory *specifics*. Very
applicable when implemented in isolated systems...
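For reference, and from memory rather than from anything in this thread
(worth checking the papers, notation roughly as they use it), the two
measures being contrasted are, in LaTeX:

% Legg/Hutter universal intelligence: expected reward V of policy \pi
% across all computable environments \mu, weighted by simplicity
% (K is Kolmogorov complexity)
\Upsilon(\pi) = \sum_{\mu \in E} 2^{-K(\mu)} \, V^{\pi}_{\mu}

% Wissner-Gross/Freer causal entropic force: push toward states that
% keep the most future paths open; S_c is the entropy over feasible
% paths of horizon \tau, T_c a "causal temperature"
\mathbf{F}(\mathbf{X}_0, \tau) = T_c \, \nabla_{\mathbf{X}} S_c(\mathbf{X}, \tau) \Big|_{\mathbf{X}_0}

The first rewards predicting and controlling simple environments well;
the second is, as Russ says, sort of a proxy for future opportunities.
Related, but not the same objective.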
> No, it (and Legg's generalizations) implies that a lot of software and
> hardware
> is required and
John et al
Is there a truth in all of this? If you cannot see it, then you cannot see it.
Einstein could see it - how the universe operated - except for what happened
beyond the speed of light.
Can you see the consciousness at work? Can you sense it and immerse yourself in it? Can you
hear the myriad of mes
John -
Thanks for a refreshingly new discussion for this forum. Just as you
describe, it is quite interesting to see how seemingly disparate tracks
can be combined and guided onto the same course. Accordingly, your
presentation has brought to mind similar notions that appear to fit
somewhere
On Mon, Sep 10, 2018 at 8:10 AM wrote:
> Why is there no single general compression algorithm? Same reason as general
> intelligence, thus, multi-agent, thus inter agent communication, thus
> protocol, and thus consciousness.
Legg proved that there are no simple, general theories of prediction,
John
At a quantum level of electro-magnetic force activity (thinking of the graviton
in action) - all 4 forces obviously entangled - there seems to be no plausible
reason why the human brain and the earth could not resonate collectively as a
communication beacon with the universe. Refer Harame
Nanograte,
> In particular, the notion of a universal communication protocol. To me it
> seems to have a definite ring of truth to it.
It does, doesn't it?!
For years I've worked with signaling and protocols, devoting some time to
imagining a universal protocol. And for years I've thought about a
Matt,
Zoom out. Think multi-agent, not single agent. Multi-agent internally and
externally. Evaluate this proposition not from a first-person narrative and it
begins to make sense.
Why is there no single general compression algorithm? Same reason as general
intelligence, thus, multi-agent, thus inter agent communication, thus protocol,
and thus consciousness.
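To spell out the compression half of that: a simple counting argument
shows why no single lossless compressor can shrink every input, so any
general-purpose compressor ends up being a collection of models, each
exploiting structure in some restricted class of data. A minimal sketch
in Python, just the counting, nothing from the thread:

# Pigeonhole: there are 2^n bit strings of length n, but only 2^n - 1
# strings of length < n, so no injective (lossless) encoder can map
# every n-bit input to a strictly shorter output.
def inputs(n: int) -> int:
    return 2 ** n

def shorter_outputs(n: int) -> int:
    return sum(2 ** k for k in range(n))  # = 2^n - 1

for n in (1, 8, 16, 32):
    assert inputs(n) > shorter_outputs(n)
    print(n, inputs(n) - shorter_outputs(n))  # at least 1 input left over

For every n the surplus is strictly positive, i.e. there is always at
least one input that cannot be shortened.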
Consciousness is what thinking and perception feels like. Feelings are
mental states that modify behavior. We can already model this crudely in
reinforcement learners. Thinking evolved to feel good because it motivates
behavior that increases reproductive fitness. You fear death, which would
cause
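To make the "model this crudely in reinforcement learners" point
concrete, here is a deliberately tiny sketch - only a two-armed bandit
where the "feeling" is a scalar reward whose sole effect is to modify
future behavior. The class and parameter names are made up for
illustration, not anyone's actual model:

import random

class TinyAgent:
    def __init__(self, n_actions=2, lr=0.1, epsilon=0.1):
        self.q = [0.0] * n_actions   # learned preference per action
        self.lr = lr
        self.epsilon = epsilon

    def act(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.q))                  # explore
        return max(range(len(self.q)), key=lambda a: self.q[a])   # exploit

    def feel(self, action, reward):
        # the "mental state that modifies behavior": reward nudges preference
        self.q[action] += self.lr * (reward - self.q[action])

random.seed(0)
agent = TinyAgent()
payout = [0.2, 0.8]   # hidden reward probability per action
for _ in range(500):
    a = agent.act()
    r = 1.0 if random.random() < payout[a] else 0.0
    agent.feel(a, r)
print("learned preferences:", [round(v, 2) for v in agent.q])
# the agent ends up preferring the action that "felt" better

Whether that deserves the word "feeling" is exactly the open question,
but the mechanics - a state updated by reward, behavior shifted by the
state - really are that simple.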
For starters, I'd like to see a collection of reputable, academic works
defining consciousness. Having said that, I'm enjoying the sharing of ideas and
cooking-class banter. In particular, the notion of a universal communication
protocol. To me it seems to have a definite ring of truth to it. P
Matt:
> AGI is the very hard engineering problem of making machines do all the things
> that people can do.
Artificial people might be a path to AGI, but not really AGI...
And I'm not the one originally saying consciousness is the magic ingredient.
Nature is :)
John
---
I'll take jargon salad over buzzword soup any day.
On Sun, Sep 9, 2018 at 3:26 PM Matt Mahoney via AGI wrote:
> Recipe for jargon salad.
>
> Two cups of computer science.
> One cup mathematics.
> One cup electrical engineering.
> One cup neuroscience.
> One half cup information theory.
> Four ta