On Sun, Jul 21, 2024, 10:04 PM John Rose <johnr...@polyplexic.com> wrote:

>
> You created the program in your mind, so it has already at least
> partially run. Then you transmit it across the wire, and we read it and
> run it partially in our minds. To know that the string is a program, we
> must model it, and it must have been created, possibly with tryptophan
> involved. Are we sure that consciousness is measured in crisp bits, and
> that the presence of consciousness is indicated by crisp booleans?
>

Let's not lose sight of the original question. In humans we distinguish
consciousness from unconsciousness by the ability to form memories and
respond to input. All programs do this. But what I think you are really
asking is how we test whether something has feelings, qualia, or free
will; whether it feels pain and pleasure; and whether it is morally wrong
to cause harm to it.

I think for tryptophan the answer is no. Pleasure comes from the nucleus
accumbens and suffering from the amygdala. All mammals, and I think all
vertebrates and some invertebrates, have these brain structures or
something equivalent that enables reinforcement learning. I think these
structures can be simulated, and that LLMs do simulate them, as far as we
can tell by asking them questions, because otherwise they would fail the
Turing test.

LLMs can model human emotions, meaning they can predict how a person
will feel and how those feelings affect behavior. They do this without
having feelings themselves. But if an AI were programmed to carry out
those predictions on itself in real time, then it would be
indistinguishable from having feelings.

We might think that the moral obligation not to harm conscious agents
has a rational basis. But really, our morals are a product of evolution,
upbringing, and culture. People disagree on whether animals, or even
some people, deserve protection.

When we talk about consciousness, qualia, and free will, we are talking
about how it feels to think, to perceive input, and to take action,
respectively. This continuous stream of positive reinforcement evolved
so that we would be motivated not to lose these capacities by dying,
which would mean producing fewer offspring.

But to answer your question: if you propose to measure consciousness in
bits, then no, the measure need not be crisp. Information is not a
discrete quantity. For example, a 3-state memory device holds
log2(3) = log 3 / log 2 ≈ 1.585 bits.
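
A minimal sketch of that arithmetic (Python is my choice here, nothing
from the thread): the capacity of an n-state device is log2(n) bits,
which is a whole number only when n is a power of 2.

    import math

    def capacity_bits(n_states: int) -> float:
        # Information capacity, in bits, of a device with n_states
        # distinguishable states.
        return math.log2(n_states)

    print(capacity_bits(2))  # 1.0 bit: a power of 2, so a whole number
    print(capacity_bits(3))  # ~1.585 bits: log 3 / log 2, fractional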

