On Fri, Jul 12, 2024, 7:51 PM John Rose <johnr...@polyplexic.com> wrote:

> Is your program conscious simply as a string without ever being run? And
> if it is, describe a calculation of its consciousness.
>

If we define consciousness as the ability to respond to input and form
memories, then a program is conscious only while it is running.  We measure
the amount of consciousness experienced over a time interval as the number
of bits needed to describe the state of the system at the end of the
interval given the state at the beginning. A fluorescent molecule like
tryptophan has 1 bit of consciousness because fluorescence is not instant.
The molecule absorbs a photon to go to a higher-energy state and releases a
lower-energy photon nanoseconds or minutes later. Thus it acts as a 1-bit
memory device.
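
To make the measurement concrete, here is a toy sketch (my own illustration,
not a standard formula): treat the measure as the description length of the
end state given the start state. For a system whose end state is one of n
equally likely possibilities, that is log2(n) bits.

    import math

    def consciousness_bits(num_end_states):
        # Bits needed to describe the end state given the start state,
        # assuming all reachable end states are equally likely.
        return math.log2(num_end_states)

    # Tryptophan over a fluorescence interval: excited or ground -> 1 bit.
    print(consciousness_bits(2))  # 1.0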

That's if you accept this broad definition of consciousness that applies to
almost every program. If you require that the system also experience pain
and pleasure, then tryptophan is not conscious because it is not a
reinforcement learning algorithm. But a thermostat is. A thermostat has one
bit of memory encoding "too hot" or "too cold" and acts to correct it.
Thus, deviation from the set temperature acts as negative reinforcement. Or it
could just as well be described as positive reinforcement, with the set
temperature as the reward.
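
Viewed that way, the whole agent fits in a few lines. A minimal sketch (the
function name and temperatures are mine, purely illustrative):

    def thermostat_step(temperature, set_point):
        # One bit of memory: "too hot" or "too cold", plus the action
        # that corrects it.
        if temperature > set_point:
            return "too hot", "cooling on"   # deviation from the set point is the penalty
        else:
            return "too cold", "heating on"  # the set point itself is the reward

    print(thermostat_step(24.0, 21.0))  # ('too hot', 'cooling on')
    print(thermostat_step(18.0, 21.0))  # ('too cold', 'heating on')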

Once again, we see the ambiguity between pain and pleasure, both measured
in unsigned bits of behavior change. How can this be? I have pointed out
before that a state of maximum utility is indistinguishable from death. It
is a static state in which no perception or thought can be pleasant, because
any of them would change the state, and any change from the maximum can only
lower utility. Happiness is not utility, but the rate of increase of utility.
Modern humans are less happy than serfs in the Dark Ages and less happy than
animals, as measured by suicide and depression rates.
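
A toy numerical illustration of that claim (my own numbers, nothing more): if
happiness is the per-step change in utility, then an agent whose utility
saturates at its maximum has zero happiness exactly where utility is highest.

    utility = [0, 3, 6, 8, 9, 10, 10, 10]  # utility over time, capped at its maximum
    happiness = [u1 - u0 for u0, u1 in zip(utility, utility[1:])]
    print(happiness)  # [3, 3, 2, 1, 1, 0, 0] -- zero once utility stops increasing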

On Mon, Jul 15, 2024 at 2:30 AM <immortal.discover...@gmail.com> wrote:
> First, no Matt actually ethics is rational in the sense that yes we are
> supposed to (at first glance, keep reading) save all ants, bugs, molecules
> and particles and help them be immortal.

Fear of suffering and death is not rational. It is a product of evolution.
Humans and other animals suffer because they have an amygdala, the part of
the brain responsible for fear, anxiety, and guilt. This is why fear of
being tortured is much worse (as measured in bits of behavior change) than
actually being tortured, and why negative utilitarians care more about
reducing suffering than increasing total happiness. But that goal can also be
achieved by brain surgery. About 1% of humans are psychopaths. They have a
defective amygdala and do not respond to negative reinforcement as a training
signal. They are not cold-blooded killers. They are simply rational. As children,
they might torture animals not out of cruelty, but out of curiosity to
understand this strange emotion. They can only be trained using reward, not
punishment. Psychopaths don't suffer, and neither do any agents without an
amygdala, like insects or most programs.

Ethics isn't rational and can't be made rational. I care more about a sick
dog than 150,000 people dying every day.
