
When Alexa is Too Helpful: Children, A.I., and Empathy

I’m a proud grandmother, and one of the coolest aspects of my role is the ringside seat it gives me to watch three little brains learn, grow, and establish personalities! But, as a modern grandmother, I’m a little worried about the staggering developments in tech those little brains are going to have to deal with, developments that neither I nor even my kids had to.
 
One such development is the spread of personal digital assistants like Amazon’s Alexa. However creepy and invasive someone like me, born before her debut, might find her, I at least know how to relate to her: as a search engine with a human voice.
 
Not so with children, says MIT psychologist Sherry Turkle, who was recently interviewed by NBC’s Mach blog. Turkle says that when devices like Alexa (or Jibo, or Cozmo) present a simulacrum of emotions or personalities, they can affect children’s understanding of real human relationships.
 
“If children learn to respond to ‘as if’ empathy, we are not preparing them for the complexity, nuance, negotiations of true empathy, true listening. There are skills of listening, of putting oneself in the place of the other, that are required when two human beings try to deeply understand each other.

Not only can’t you practice relational skills by talking to machines, but you make negative progress. For example, a machine always has a response ready. You never have to wait, to attend to silences or to what one young woman I interviewed called the ‘boring bits’ in conversation. We can forget the kind of listening and the kind of talking about our feelings that real conversation requires.”
 
This effect can be far-reaching, even into adolescent development: Turkle cites a case involving Apple’s digital assistant Siri from her book Reclaiming Conversation: The Power of Talk in a Digital Age.
 
“[Stephanie, a] mother, 40, has a 10-year-old, Tara, who tends to be a perfectionist, always the ‘good girl.’ Tara expresses anger to Siri that she doesn’t show to her parents or friends.

Stephanie wonders if this is ‘perhaps a good thing, certainly a more honest conversation’ than Tara is having with the adults in her life. But what Tara is having with Siri is not a conversation at all. No one is listening. My worst fear: If Tara can ‘be herself’ only with a robot, she may grow up believing that only an object can tolerate her truth.”
 
Turkle’s kind of research is very important: we’re only just beginning to learn how the infiltration of technology is affecting all our brains. But the brains of kids are especially vulnerable. It’s up to us savvy old-timers, who remember life before personal computers, period, to help the young’uns through this terribly interesting time in our history.