Well, I see a problem. I cannot find a test to demonstrate the
mysterious part of conscious experience that most of us sense because,
from my point of view, the science to detect it has not been developed;
and Matt would not be able to find a scientific test to prove that it
does not exist because, from his point of view, there is nothing to
test for.
Jim Bromer


On Sun, Sep 23, 2018 at 4:09 PM Jim Bromer <jimbro...@gmail.com> wrote:

> I do not mean to come across as being unpleasant or as dismissing other
> people's crackpot ideas. It's a big pot and all are welcome to share.
> The constructivists argued that it was not enough to show that there was
> no evidence to support a proposition; you also had to find evidence to
> support the negation of the proposition. So I cannot come up with an
> experiment to gather evidence supporting the proposition that the
> mysterious part of conscious experience cannot be explained by
> contemporary science (because contemporary science does not have the
> tools to conduct such an experiment). But if Matt wants to prove that it
> is not real, then he would have to come up with experiments to derive
> evidence that it is not. At the least he has to show that contemporary
> science can be used to begin testing his counter-hypothesis, and that
> the evidence his experiments could produce would be compelling, if not
> overwhelming.
> Jim Bromer
>
>
> On Sun, Sep 23, 2018 at 3:29 PM Nanograte Knowledge Technologies via AGI <
> agi@agi.topicbox.com> wrote:
>
>> Matt
>>
>> One could easily argue that, since you cannot scientifically prove any
>> of the absolute assertions you just made, you cannot be correct. But to
>> what end? Let's unpack your perspective.
>>
>> 1) "Science doesn't explain everything." vs Science explains everything
>> it explains.
>> 2) Philosophy explains why the universe exists vs Can you explain
>> which universe within science you are referring to?
>> 3) "It exists as a necessary condition for the question to exist." vs It
>> only exists because humankind thought of it as an original thought.
>> 4) "Science can explain...[X]" vs That's what we already stated and
>> implied.
>> 5) "Science explains that your brain runs a program." vs So does
>> philosophy. Are they both equally correct, therefore philosophy = science?
>>
>> Inter alia, you still did not explain anything much, did you?
>>
>> Rob
>> ------------------------------
>> *From:* Matt Mahoney via AGI <agi@agi.topicbox.com>
>> *Sent:* Sunday, 23 September 2018 4:02 PM
>> *To:* AGI
>> *Subject:* Re: [agi] E=mc^2 Morphism Musings...
>> (Intelligence=math*consciousness^2 ?)
>>
>> Science doesn't explain everything. It just tries to. It doesn't explain
>> why the universe exists. Philosophy does. It exists as a necessary
>> condition for the question to exist.
>>
>> Chalmers is mystified by consciousness like most of us are. Science can
>> explain why we are mystified. It explains why it seems like a hard problem.
>> It explains why you keep asking the wrong question. Math explains why no
>> program can model itself. Science explains that your brain runs a program.
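>>
>> A minimal sketch of that diagonal argument, in Python, assuming a
>> hypothetical predictor models(prog, arg) that would return whatever
>> prog(arg) outputs:
>>
>>     def models(prog, arg):
>>         # Hypothetical, assumed only to set up the contradiction: a
>>         # total predictor that returns whatever prog(arg) would output.
>>         raise NotImplementedError
>>
>>     def contrary(prog):
>>         # Do the opposite of whatever the model says we will return.
>>         return not models(prog, prog)
>>
>>     # contrary(contrary) would have to return the negation of its own
>>     # predicted output, a contradiction, so no total models() can
>>     # exist: a program cannot contain a complete model of itself.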
>>
>> On Sat, Sep 22, 2018, 11:53 AM Nanograte Knowledge Technologies via AGI <
>> agi@agi.topicbox.com> wrote:
>>
>> Perhaps not explain, then, but we could access accepted theory on a
>> topic via a sound method of integration in order to construct enough of
>> an understanding to at least be sure what we are talking about. Should
>> we not at least be trying to do that?
>>
>> Maybe it's a case of no one really being curious and passionate enough to
>> take the time to do the semantic work. Or is it symbolic of another
>> problem? I think it's a very brave thing to talk publicly about a subject
>> we all agree we seemingly know almost nothing about. Yet, we should at
>> least try to do that as well.
>>
>> Therefore, to explain is to know?
>>
>> Rob
>> ------------------------------
>> *From:* Jim Bromer via AGI <agi@agi.topicbox.com>
>> *Sent:* Saturday, 22 September 2018 6:12 PM
>> *To:* AGI
>> *Subject:* Re: [agi] E=mc^2 Morphism Musings...
>> (Intelligence=math*consciousness^2 ?)
>>
>> The theory that contemporary science can explain everything requires a
>> fundamental denial of history and a kind of denial about the limits of
>> contemporary science. That sort of denial of common knowledge is ill
>> suited for adaptation. It will interfere with your ability to use the
>> scientific method.
>> Jim Bromer
>>
>>
>> On Sat, Sep 22, 2018 at 11:42 AM Jim Bromer <jimbro...@gmail.com> wrote:
>>
>> Qualia is what perceptions feel like, feelings are computable, and they
>> condition us to believe there is something magical and mysterious about
>> it? This is science fiction. So science has already explained Chalmers's
>> Hard Problem of Consciousness. He just got it wrong? Is that what you
>> are saying?
>> Jim Bromer
>>
>>
>> On Sat, Sep 22, 2018 at 11:07 AM Matt Mahoney via AGI <
>> agi@agi.topicbox.com> wrote:
>>
>> I was applying John's definition of qualia, not agreeing with it. My
>> definition is that qualia is what perception feels like. Perception and
>> feelings are both computable. But the feelings condition you to believe
>> there is something magical and mysterious about it.
>>
>> On Sat, Sep 22, 2018, 8:44 AM Jim Bromer via AGI <agi@agi.topicbox.com>
>> wrote:
>>
>> There is a distinction between the qualia of human experience and the
>> consciousness of what the mind is presenting. If you deny that you have
>> the kind of experience that Chalmers talks about, then there is a
>> question of why you are denying it. So your remarks are relevant to AGI,
>> but not in the way you are talking about them. If Matt says qualia is
>> not real, then he is saying that it is imaginary, because I am pretty
>> sure that he experiences things in ways similar to Chalmers and a lot of
>> other people I have talked to. There are people who have claimed that I
>> would not be able to create an artificial imagination. That is nonsense.
>> An artificial imagination is easy; doing it well is not. That does not
>> mean, however, that the hard problem of consciousness is just
>> complexity. One is doable in computer programming, in spite of any
>> skepticism; the other is not.
>> Jim Bromer
>>
>>
>> On Sat, Sep 22, 2018 at 10:03 AM John Rose <johnr...@polyplexic.com>
>> wrote:
>>
>> > -----Original Message-----
>> > From: Jim Bromer via AGI <agi@agi.topicbox.com>
>> >
>> > So John's attempt to create a definition of compression of something
>> > complicated so that it can be communicated might be the start of the
>> > development of something related to contemporary AI, but the attempt
>> > to claim that it defines qualia is so naïve that it is not really
>> > relevant to the subject of AGI.
>> 
>> 
>> Jim,
>> 
>> Engineers make a box with lines on it and label it "magic goes here".
>> 
>> Algorithmic information theory does the same and calls it compression.
>> 
>> The word "subjective" is essentially the same thing.
>> 
>> AGI requires sensory input.
>> 
>> Human beings are compression engines... each being unique.
>> 
>> The system of human beings is a general intelligence.
>> 
>> How do people communicate?
>> 
>> What are the components of AGI and how will they sense and communicate?
>> 
>> The guys that came up with the term "qualia" left it as a mystery, so
>> it's getting hijacked 😊 This is extremely common with scientific
>> terminology.
>> 
>> BTW I'm not trying to define it universally, just as something
>> computationally useful in AGI R&D... but it's starting to seem like it's
>> central. Thus IIT? Tononi or Tononi-like? Not sure... have to do some
>> reading, but I prefer coming up with something simple and practical
>> without getting too ethereal. TGD consciousness is pretty interesting,
>> though, and a good exercise in some mind-bending reading.
>> 
>> John
>> 
