[agi] Re: Proposed AI Tests

2019-09-23 Thread James Bowery
The paper is pointless only to those without comprehension.

Re: [agi] Re: Proposed AI Tests

2019-09-23 Thread Stefan Reich via AGI
That paper is pointless. We should implement the test case. By the end of tomorrow I will have an editor that solves the example.

[agi] Re: Proposed AI Tests

2019-09-23 Thread James Bowery
That's a good start, but the plural form of "example" needs to be extended to an actual plurality, and the "simple" needs to be made general. Fortunately, this was published in the early 60s: "A Formal Theory of Inductive Inference Part I" and "Part II
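
A toy sketch of the weighting those papers formalize (my illustration; Solomonoff's actual mixture is uncomputable): every hypothesis, treated here as a short program, gets prior weight 2**-(description length), and prediction mixes the hypotheses that remain consistent with the data seen so far. The "programs" below are illustrative stand-ins keyed by their source text.

def universal_mixture(hypotheses: dict[str, str], data: str) -> dict[str, float]:
    """Posterior over hypotheses: prior 2**-len(source), dropped entirely if
    the hypothesis's output is inconsistent with the observed data."""
    weights = {source: 2.0 ** -len(source)
               for source, output in hypotheses.items()
               if output.startswith(data)}
    total = sum(weights.values())
    return {source: w / total for source, w in weights.items()}

# Each "program" is keyed by its source text (shorter source = higher prior);
# its value is the string it would print, truncated to something finite.
hypotheses = {
    "print('01'*100)": "01" * 100,
    "print('010110'*34)": "010110" * 34,
    "print('0101010111')": "0101010111",
}
print(universal_mixture(hypotheses, "010101"))

The shortest consistent program dominates the mixture, which is the formal sense in which compression and induction coincide.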

[agi] Proposed AI Tests

2019-09-23 Thread Stefan Reich via AGI
I propose to test AI with simple examples. For example, code editing. User writes:

a = b
c = d
e = f

Then edits a = b into a := b. Then edits c = d into c := d. At this point, if you give the AI time to think, it has to propose editing the third line into e := f. There's your test.
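
A minimal sketch of what passing that test involves, assuming a single token-level rewrite is enough of a hypothesis space for this example (a toy of mine, not Stefan's planned editor):

def token_rewrites(before: str, after: str) -> set[tuple[str, str]]:
    """Collect (old_token, new_token) pairs where the edited line differs."""
    b, a = before.split(), after.split()
    assert len(b) == len(a), "sketch only handles edits that keep the token count"
    return {(x, y) for x, y in zip(b, a) if x != y}

def propose_edit(examples: list[tuple[str, str]], line: str) -> str:
    """Generalize the rewrite shared by all example edits and apply it to `line`."""
    shared = set.intersection(*(token_rewrites(b, a) for b, a in examples))
    assert shared, "no common rewrite found; a richer hypothesis space is needed"
    old, new = next(iter(shared))
    return " ".join(new if tok == old else tok for tok in line.split())

examples = [("a = b", "a := b"), ("c = d", "c := d")]
print(propose_edit(examples, "e = f"))  # -> "e := f"

Anything beyond toy cases needs a richer hypothesis space and some preference among competing rewrites, e.g. shortest-description-first.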

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread James Bowery
In what way is coming up with the most predictive model given a set of observations not the essence of "is" as opposed to "ought"?

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread Stefan Reich via AGI
> Compression is the "science" of AGI
Or maybe not.

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread James Bowery
Compression is the "science" of AGI and the value function parameterizing sequential decision theory is its "engineering".  So, yes, I did need to explicate the value function of this test, which is "minimize the size of the executable archive of *this* data set (as opposed to one of your own ch
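
A crude sketch of that value function, with zlib standing in for a real entrant and illustrative file names; as I understand the scoring, the quantity to minimize is roughly the compressed data plus the program that reproduces it:

import os
import zlib

def archive_score(data: bytes, decompressor_path: str) -> int:
    """The quantity to minimize: compressed size of the data plus the size of
    the program needed to reproduce the data from it."""
    compressed = zlib.compress(data, level=9)
    return len(compressed) + os.path.getsize(decompressor_path)

# Usage sketch (file names are illustrative, not the benchmark's exact rules):
# with open("enwik8", "rb") as f:
#     print(archive_score(f.read(), "decompress.py"))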

Re: [agi] The future of AGI

2019-09-23 Thread James Bowery
I didn't read your argument prior to posting so to that extent it was unintentional.  I'm responding more to the generality of language in epistemology, thence to the future of AGI.  As I stated in my Quora answer, it is VERY strange that Google adopted *perplexity* as the model selection crite

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread Stefan Reich via AGI
OK, so what they're looking for is a "good pattern finder" algorithm, one that is small enough that it carries its own weight in this benchmark. Yeah, it's nice and all; compression is a great thing. But maybe the algorithms grown here don't really translate to other AI problems all that well. Image

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread immortal . discoveries
Compression has to do with the neural model, representations.

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread immortal . discoveries
Haha Stefan, I was thinking that too, my bro. I agree compression = intelligence. But even if maximally compressed, it won't be AGI. For example, AGI has a reward system, and passing this feat won't involve rewards.

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread Stefan Reich via AGI
I don't quite understand that benchmark. When I have a compressor for text, how would that give me any kind of AI function? Like a machine that answers questions, recognizes things visually or what have you? Is this related to AI at all?
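
One hedged way to see the connection: under arithmetic coding, a model that assigns probability p to each next symbol can store the text in roughly the sum of -log2(p) bits, so anything that predicts text well is, by construction, a text compressor. A toy order-2 character model (nowhere near a benchmark entrant):

import math
from collections import Counter

def ideal_code_length_bits(text: str, order: int = 2) -> float:
    """Code length of `text` under a toy adaptive order-`order` character model
    with add-one smoothing (illustration only)."""
    context_counts = Counter()   # times each context has been seen
    pair_counts = Counter()      # times each (context, next char) has been seen
    alphabet = 256
    bits = 0.0
    for i, ch in enumerate(text):
        ctx = text[max(0, i - order):i]
        p = (pair_counts[(ctx, ch)] + 1) / (context_counts[ctx] + alphabet)
        bits += -math.log2(p)
        pair_counts[(ctx, ch)] += 1
        context_counts[ctx] += 1
    return bits

sample = "the cat sat on the mat " * 20
print(ideal_code_length_bits(sample) / (8 * len(sample)))  # ratio well below 1

Better models of the text drive that ratio down; that is the sense in which the benchmark measures prediction.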

Re: [agi] MindForth is the brain for an autonomous robot.

2019-09-23 Thread James Bowery
All anyone has to do to prove they've "solved AI" is best The Large Text Compression Benchmark. As a fan of Chuck Moore, I eagerly await A. T. Murray's submission to that contest (but I'm not holding my breath).

Re: [agi] The future of AGI

2019-09-23 Thread Rob Freeman
On Tue, Sep 24, 2019 at 9:34 AM James Bowery wrote: > The use of perplexity as a model selection criterion seems misguided to me. See my Quora answer to the question "What is the relationship between perplexity and Kolmogorov complexity?"

Re: [agi] The future of AGI

2019-09-23 Thread James Bowery
The use of perplexity as a model selection criterion seems misguided to me. See my Quora answer to the question "What is the relationship between perplexity and Kolmogorov complexity?"
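
For reference, a sketch of the textbook relationship, with illustrative numbers: perplexity is 2 to the cross-entropy in bits per token, i.e. a per-token data code length, while an MDL / Kolmogorov-style criterion also charges for the bits needed to describe the model itself:

import math

def perplexity(token_probs: list[float]) -> float:
    """Perplexity = 2 ** (average bits per token under the model)."""
    bits_per_token = sum(-math.log2(p) for p in token_probs) / len(token_probs)
    return 2 ** bits_per_token

def mdl_score_bits(token_probs: list[float], model_size_bits: int) -> float:
    """Two-part code: bits to describe the model plus bits to code the data."""
    return model_size_bits + sum(-math.log2(p) for p in token_probs)

probs = [0.5, 0.25, 0.125, 0.5]              # hypothetical per-token probabilities
print(perplexity(probs))                      # ~3.36, insensitive to model size
print(mdl_score_bits(probs, 1_000_000))       # a huge model is charged for here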

Re: [agi] Simulation

2019-09-23 Thread korrelan
From the reference/perspective point of a single intelligence/brain there are no other brains; we are each a closed system, and a different version of you exists in every other brain. We don’t receive any information from other brains; we receive patterns that our own brain interprets based so

Re: [agi] Simulation

2019-09-23 Thread John Rose
On Sunday, September 22, 2019, at 6:48 PM, rouncer81 wrote: > actually no! it is the power of time. doing it over time steps is an exponent worse. Are you thinking along the lines of Konrad Zuse's Rechnender Raum? I just had to go read some again after you mentioned this :) John

Re: [agi] Simulation

2019-09-23 Thread John Rose
On Sunday, September 22, 2019, at 8:42 AM, korrelan wrote: > Our consciousness is like… just the surface froth, reading between the lines, or the summation of interacting logical pattern recognition processes. That's a very good, clear single-brain description of it. Thanks for that. I don't thi

Re: [agi] Simulation

2019-09-23 Thread rouncer81
Yeah, that's what I'm talking about, thanks: all permutations of space and time, and I'm getting a bit confused...

Re: [agi] Simulation

2019-09-23 Thread immortal . discoveries
The butterfly effect is like a tree: there are more and more paths as time goes on. In that sense, time is the number of steps. Time powers Search Space. Time = Search Space/Tree. So even if you have a static tree, it is really time, frozen. Unlike real life, you can go forward/backward wherever wa
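
A small sketch of that point: with branching factor b, the tree of distinct paths after t time steps has b**t leaves, so the search space is exponential in the number of steps (b and t below are arbitrary illustrative values):

def path_count(branching_factor: int, steps: int) -> int:
    """Distinct root-to-leaf paths after `steps` time steps."""
    return branching_factor ** steps

for t in range(6):
    print(t, path_count(2, t))   # 1, 2, 4, 8, 16, 32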