The reason I introduced "A reasonably objective definition of 'zombiness'
is its vernacular use in ethology..." is precisely to *avoid* arguing over
the definition of "zombie" by introducing a different *sense* of "zombie"
than "philosophical zombie" -- one that is not only "reasonably objective"
but -- and this is important to "requirements" -- in the vernacular.
Clearly, there is *something* about the word "zombie" that people find
"unfriendly" even if they find that sense ineffable.  I'm effing
the "requirement".

On Fri, Nov 8, 2019 at 5:45 PM Matt Mahoney <[email protected]> wrote:

> Philosophy is arguing about the meanings of words. By "zombie" I mean a
> philosophical zombie. https://rationalwiki.org/wiki/Philosophical_zombie
>
> On the unrelated topic of friendly AI, we need to define what that means
> too. If you read my previous post on AGI requirements carefully, you will
> note that the extinction of humans and of all DNA based life falls within
> its scope without being explicitly unfriendly. I am open to discussion on
> whether this is a good idea or if the requirements need to be refined. (The
> issue is whether uploads are zombies. I contend there is no such thing).
>
> Avoid existential errors. Resist the temptation to skip too quickly to
> design.
>
> On Thu, Nov 7, 2019, 8:20 PM James Bowery <[email protected]> wrote:
>
>> See my LinkedIn post "A Leggian Approach to 'Friendly AI'
>> <https://www.linkedin.com/pulse/leggian-approach-friendly-ai-james-bowery>"
>> for background.
>>
>> My 2¢:
>>
>> This is related to the "consciousness" confusion in the following sense:
>>
>> Matt Mahoney's reductio ad absurdum of the sensibility of
>> "consciousness" relies on its standing in opposition to "zombiness" (is
>> that a word?).  A reasonably objective definition of "zombiness" is its
>> vernacular use in ethology to colorfully describe what, in "The Extended
>> Phenotype: The Long Reach of the Gene", Richard Dawkins describes as "Host
>> Phenotypes of Parasite Genes".
>>
>> Take, for example, this video of an altruistic cricket sacrificing his
>> life for his little friend <https://www.youtube.com/watch?v=Df_iGe_JSzI>.
>> Such "altruists" are frequently referred to as "zombies" in popular
>> scientific literature
>> <https://video.nationalgeographic.com/video/animals-source/0000016a-2c84-de00-a1fb-edd7b4940000>.
>>
>> In evolutionary medical circles such "virulence" evolves to its "optimal"
>> level (i.e., optimal virulence) through a battle between two opposing routes
>> into the next generation for replicators:
>>
>> Horizontal Transmission vs Vertical Transmission
>>
>> Vertical transmission occurs when a parasite's evolutionary fate is tied
>> to that of its host through reproduction -- in which case it becomes a
>> "mutualist".  An example we're all familiar with is the probable
>> coevolution of wolves with Cro-Magnons, resulting in the mutualistic
>> relationship between what we call "humans" and "dogs":
>>
>> Babies and Puppies
>>
>> Nothing could be more lovable, eh?
>>
>> Horizontal Transmission is, by contrast, illustrated by the
>> path-not-taken with those wolves who thought "Take the baby and run and
>> BREED!"
>>
>> The extreme form of "Take the baby and run and BREED!" is called
>> "parasitic castration" in which a parasite actually eats the genitals of
>> its host so as to divert resources to itself that would otherwise have gone
>> to the host's offspring.  The resulting evolutionary "zombie" is entirely
>> "conscious", at least in some sense.
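The vertical-vs-horizontal contrast above can be put in toy-model form. This is my own illustration with made-up functional forms (the linear reproduction penalty and the quadratic transmission trade-off are assumptions, not anything from the thread), meant only to show why the two routes select for different "optimal virulence":

```python
# Toy model of optimal virulence under two transmission routes.
# Assumption: virulence v in [0, 1] is the fraction of host resources
# the parasite diverts to itself; host reproduction scales as (1 - v).

def vertical_fitness(v):
    # Vertical transmission: the parasite rides host reproduction, so
    # its fitness IS the host's offspring output -- maximized at v = 0,
    # i.e., selection pushes toward mutualism ("babies and puppies").
    return 1.0 - v

def horizontal_fitness(v, mortality_cost=1.0):
    # Horizontal transmission: spread to unrelated hosts grows with
    # exploitation, but exploiting too hard kills hosts and truncates
    # the infectious period -- maximized at an intermediate v
    # ("take the baby and run and BREED!").
    return v * (1.0 - mortality_cost * v)

grid = [i / 100 for i in range(101)]
best_vertical = max(grid, key=vertical_fitness)
best_horizontal = max(grid, key=horizontal_fitness)
print(best_vertical)    # 0.0
print(best_horizontal)  # 0.5
```

Under these assumed trade-offs the vertically transmitted replicator's optimum is zero virulence, while the horizontally transmitted one settles at an intermediate level, which is the "battle between two opposing routes" in miniature.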
>>
>> However, if we think about "turning the universe into paperclips" as the
>> extreme of "unfriendliness", it becomes apparent that there is an
>> intermediate station in which an "unfriendly AI" would do something like Turk
>> humans without the slightest regard for their reproductive viability
>> <https://www.mturk.com/>, ending the human species -- at least that
>> portion the unfriendly AI found useful enough to Turk.  Are humans who have
>> been Turked "conscious"?  Perhaps a more meaningful question is:  "Are
>> they zombies?"
>>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T251f13454e6192d4-Mabd7e426f664b3e02b832202
Delivery options: https://agi.topicbox.com/groups/agi/subscription
