Nice, Curt!  Nice!

Nick Thompson
thompnicks...@gmail.com
https://wordpress.clarku.edu/nthompson/

From: Friam <friam-boun...@redfish.com> On Behalf Of Curt McNamara
Sent: Tuesday, August 24, 2021 1:11 PM
To: The Friday Morning Applied Complexity Coffee Group <friam@redfish.com>
Subject: Re: [FRIAM] Eternal questions

A sciencey view of emotions and where they are located in the body:

https://www.pnas.org/content/111/2/646

Search for "emotions map" to find visuals showing a ton of different emotions.

      Curt

On Tue, Aug 24, 2021 at 11:54 AM uǝlƃ ☤>$ <geprope...@gmail.com> wrote:

I suppose my problem is that I *do* think we can find "the sadness" inside the 
brain ... well, not inside the brain, exactly, but inside the *body* ... well, 
not *inside* the body but peri-body, localized *at* the body, but extending out 
into the body's context a bit.

Just like with one's thumb, sadness comprises a dynamic mesh of interlinked 
feedback loops. And that dynamic mesh of feedback is *part* of a larger mesh of 
such loops. (Re: "putting it in a robot" - cf types of attention: soft, hard, 
self, etc.) Some of those loops *look at* the sadness cluster, register that 
cluster as "other" in some sense ... just like how I can imagine chopping off 
my thumb and tossing that object into the woods.
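
To make that mesh-of-loops picture concrete, here is a toy sketch in Python (entirely an illustration added here, not anything from the thread; every name and constant is invented, and nothing in the discussion specifies real affective dynamics): a few mutually reinforcing loops whose activity grows until a meta-loop starts "looking at" the cluster and damps it, roughly the "wiggle out of its grip" move described next.

    import random

    # Toy model: a "sadness cluster" as a few mutually reinforcing feedback
    # loops, plus one meta-loop that observes the cluster's mean activity
    # and damps it once engaged. All names and constants are invented for
    # illustration; nothing in the thread specifies these dynamics.

    N_LOOPS = 3      # loops inside the cluster
    COUPLING = 0.4   # mutual reinforcement strength
    DAMPING = 0.3    # suppression applied by the reflective meta-loop

    def step(levels, meta_active):
        """One update: each loop feeds on the cluster mean; the meta-loop,
        when active, subtracts a term proportional to that mean."""
        mean = sum(levels) / len(levels)
        out = []
        for x in levels:
            x = x + COUPLING * (mean - 0.5 * x)  # reinforcement: mean grows 1.2x per step
            if meta_active:
                x -= DAMPING * mean              # reflection: mean decays to 0.9x per step
            out.append(max(0.0, x))
        return out

    levels = [random.uniform(0.5, 1.0) for _ in range(N_LOOPS)]
    for t in range(20):
        # the meta-loop only "looks at" the cluster from step 10 on, like
        # noticing the fear halfway up the stairs
        levels = step(levels, meta_active=(t >= 10))
        print(t, [round(x, 2) for x in levels])

Run it and the printed levels climb until step 10, then fall once the observer loop registers the cluster and pushes back.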

Because I do this all the time, it would blow my mind if others did not also do 
it. I particularly do it with fear. Wake up startled. Think maybe there's 
someone in the house. Grab the bat. As I'm walking up the stairs, I *reflect* 
on my registered, now objectified fear. And, in doing so, wiggle my way out of 
the grip of that fear and think more tactically about how to behave if there's 
a human-shaped shadow in the living room.

I do the same thing with pain, particularly my chronic back pain, but also with 
things like stubbed toes. It's fscking hilarious how much that hurts ... 
ridiculously over-emphasized pain for what has happened. To not step outside 
the pain and laugh out loud would be weird. It's just way too funny how much 
that hurts.

I welcome a lesson in how misguided I am, here.

On 8/24/21 8:36 AM, Eric Charles wrote:
> So.... This is JUST a question of whether we are having a casual conversation 
> or a technical one, right? Certainly, in a casual, English-language 
> conversation, talk of "having" emotions is well understood, and just fine, for 
> example "Nick is /having/ a fit, just let him be." (I can't speak for other 
> languages, but I assume there are many others where that would be true.) 
> 
> If we were, for some reason, having a technical conversation about how 
> the /Science of Psychology/ should use technical language, then we /might/ 
> also all come to agree that isn't the best way to talk about it. 
> 
> In any case, the risk with "have" is that it reifies whatever we are talking 
> about. To talk about someone /having/ sadness leads naturally --- 
> linguistically naturally, in English --- to thinking that sadness is /a 
> thing/ that I could find if I looked hard enough. It is why people used to 
> think (and many, many, still do) that if we just looked hard enough at 
> someone's brain, we would find /the sadness/ inside there, somewhere. That is 
> why "have" is dangerous in a technical conversation about psychology: the 
> implication is wrong-headed in a way that repeatedly leads large swaths of 
> the field down deep rabbit holes that they can't seem to get out of. 
> 
> On the one hand, I /have/ a large ice mocha waiting for me in the fridge. On 
> the other hand, this past summer I /had/ a two-week-long trip to California. 
> One is a straightforward object; the other was an extended activity I engaged 
> in. When the robot-designers assert that their robot "has" emotions, which do 
> they mean? Honestly, I think they don't mean either one; it is a marketing 
> tool, and not part of a conversation at all. As such, it doesn't really fit 
> into the dichotomy above, and is trying to play one off of the other. They 
> are using the terms "emotions and instincts" to mean something even less than 
> whatever Tesla means when they say they have an autodrive that for sure still 
> isn't good enough to autodrive.
> 
> What the robot-makers mean is simply to indicate that the robot will be a bit 
> more responsive to certain things than other models on the market, and 
> /hopefully/ that's what most consumers understand it to mean. But not all 
> will... at least some of the people being exposed to the marketing will take 
> it to mean that emotion has been successfully put somewhere inside the robot. 
> (The latter is a straightforward empirical claim, and if you think I'm wrong 
> about that, you have way too much faith in how savvy 100% of people are.) As 
> such, the marketing should be annoying to anti-dualist psychologists, who see 
> it buttressing /at least some/ people's tendency to jump down that rabbit 
> hole mentioned above.
> <echar...@american.edu>

-- 
☤>$ uǝlƃ

- .... . -..-. . -. -.. -..-. .. ... -..-. .... . .-. .
FRIAM Applied Complexity Group listserv
Zoom Fridays 9:30a-12p Mtn GMT-6  bit.ly/virtualfriam
un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/
archives: http://friam.471366.n2.nabble.com/

