I want to try to have a more positive attitude about other people's crackpot ideas. It has taken me a few days to understand what people are saying, or even why people are motivated to talk about the inexplicable experience of consciousness in an AI discussion group. But I will take some time out of my hectic schedule of inconsequential meandering, ineptitude, and procrastination to write down a few good ideas that I got from this and the other discussion going on in this group.

By some weird, inexplicable coincidence, it suddenly came to me that my own ideas deserve to be put in the lead. I said that I do not think the mystery of conscious experience can be explained by present-day material science, but that science may be better able to explain it some day. Matt said that we have to use philosophy to discuss it, and Nanogate said that it takes courage to talk about theories about stuff like this. John said that perhaps there is a kind of emergence of consciousness (one that would not occur in small-scale systems).

I really have a lot of work that I should be doing, but I just wanted to say one more thing. I feel that this discussion might be just as well served by imagining some other kind of material interrelations as by importing conjectures about other mysteries in physics and trying to smush them into this question.

Jim Bromer

On Tue, Sep 25, 2018 at 7:02 AM John Rose <[email protected]> wrote:
> > -----Original Message-----
> > From: Jim Bromer via AGI <[email protected]>
> >
> > But I still disagree with what you are saying. An artificial agent will not be
> > able to experience qualia because it will lack that mysterious aspect of
> > intelligence that allows us to sense certain things in the way we do. So it
> > would be able to distinguish blue from red (as long as there was some kind of
> > light to see the colors, of course), but it would not experience haptic sensory
> > input in a way that was fundamentally different from the way it experiences red
> > and blue. For a computer it is just data. It may be presented in different
> > formats, but there is no qualia (as I understand the concept). It may be
> > programmed to simulate the communication as if it were dealing with qualia, but
> > it would be pure simulation.
> >
> > So qualia stands in contrast to propositional attitudes about beliefs about
> > experience, but it also, as I understand it, stands in contrast to 'data' that
> > may be transmitted by sensors (or to the product of the computational analysis
> > of that data).
>
> Jim,
>
> The more complex the qualia, the more difficult it is to transmit the full
> thing. So a qualia, from the vantage point of another agent, needs a label,
> otherwise known as a symbol and/or compression, especially if the qualia is
> uncomputable from the other agent's perspective. The phrase "my sensation of
> the color blue" communicated to you is a label that you personally decompress
> and understand somewhat, since you are another human and feel something
> similar but not exact. We can put labels on uncomputable things for
> transmission. Data as well as instructions can be transmitted.
>
> So what does a transmitted symbol of a human qualia represent? It represents
> something that is important for the system as a whole. Why? Because each of
> us is an individual sensor, transmitting. How one feels is an individually
> processed sensory impression, compressed for transmission into the wider
> intelligence bandwidth of the agent group.
>
> Can a computer experience qualia exactly as a human does? TBD. Can it
> experience qualia with more complexity? Probably, IMO, at some point. Are
> less-than-human qualia still qualia? That's nomenclature, really. The
> emphasis I'm placing with respect to intelligence is not on the individual
> experience but on the system.
>
> John
------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T9c94dabb0436859d-M7f7d029634a5b448b2717e73
Delivery options: https://agi.topicbox.com/groups/agi/subscription
