Emotion and Understanding

In the traditional view, reason and passion cannot co-exist: where there is passion, there can be no reason.


Maybe…


We all know that our knowledge of something can be shaded by an emotional layer supplied by context. Learning something in an extremely fearful or sad context makes it more memorable than learning it in a positive one. Experiencing an emotion for the first time introduces an understanding that was absent before. Truly understanding almost anything comes richly laden with emotion.



AI doesn’t “have” emotion. It can understand emotion as a definitional concept, but the experience is beyond it. There is no real joy or schadenfreude in the silicon (or whatever substrate will be used). These things can be hard-coded in, but therein lies a great danger.


How can embedding emotions be dangerous? A benign example helps. A brilliant play in a sporting event is likely to elicit a wide range of emotions, depending on perspective. Someone can even experience several emotions at once while witnessing the same event. This brilliant play can elicit rage, anger, disappointment, admiration, appreciation, awe, happiness, joy, euphoria, ecstasy, or jubilation. It depends on the fan. Are they a rabid fan losing the World Cup? Winning a World Cup (57 years)? A casual fan watching with a friend? The World Series for an October fan? A coach of the winning team? Was it a play in the middle of a game? There are so many ways to look at the emotional reactions of people to the exact same event. Which one is the right one? In building an artificial model, which one do you include? What if the coder is a Yankees fan and not a Red Sox fan? Without an encoded response, how can we say that the AI agent understands the play?


And then there is the understanding that accompanies the experience. As a Doctor of Psychology, I have spent hundreds of hours studying sadness, depression, and despair. I have taught students about these emotions and the devastating effects they can have on people. However, an event took place that changed both my own perspective and the way I taught these emotions forever. Prior to this event, I thought I knew these emotions, but I didn’t. I had never really experienced them in any way.


In 2016 I had a benign tumor removed, very successfully, from my brain. The unsuccessful part had to do with where the tumor was located. The compressed areas included part of the right limbic system, the part of the emotional regulation system that happens to specialize in negative emotions (why couldn’t it have been an area for positive emotions?). I quickly discovered what sadness, depression, and despair felt like. The epistemic record was suddenly enriched by the emotions attached to these letter strings. Depression and despair quickly became uncomfortable acquaintances. Thankfully, the brain tissue largely decompressed, and their visits have become less frequent and intense. However, the emotions attached to the experiences gave me an understanding I only thought I had before.


How can we trust the output, decision-making, automatic responses, or regulation of critical aspects of our society to agents that will never experience emotion? Geoffrey Hinton believes AI will achieve, and possibly surpass, thinking abilities as powerful as human ability.


Without the ability to experience emotion, any understanding achieved by AI will be a hollow, definitional understanding based on artificial simulations.


Emotions set us apart and provide us with a richness of understanding that will never be surpassed. An understanding that drives us to know more. To feel more. To experience more.

I think that to say “We are what we think” is as wrong as to say “Memorization is learning.” Learning is remembering, and remembering is wrapped in emotion. Too bad educators don’t understand this fully.
