
Source: 20 Minutes, 12-03-2020

In “Emotional” Robots, Laurence Devillers explores the human-machine relationship

Projecting emotion onto a robot corresponds to the need not to be alone.

By dint of simulating emotions and empathy, what place will machines take in our societies? This is one of the questions that Laurence Devillers, a researcher at LIMSI-CNRS and professor of computer science and artificial intelligence (AI) at Sorbonne University, addresses in her book “Emotional” Robots (Éditions de l'Observatoire), published on Wednesday.

What is an “emotional robot,” the title of your book?

“Emotional robot” is a bit like “artificial intelligence”: the oxymoron creates immediate ambiguity. With “robot” we think machine, and with “emotional” we think human. Of course, robots have no emotions. But we go further and further in how we simulate the living. Language, for example, which is specific to humans, is being simulated. But the word is strong; it is a way to love, to share… Perception and decision-making are the big difference in machines that embed AI.

What kinds of emotions are we capable of simulating today?

Robots are able to simulate many emotions, but they are less able to detect them. We synthesize a personality, or a voice, whereas in facial recognition [one of the main players in emotion recognition] we track all the variability. It is hard to know when to generate the right emotion in front of a person. You have to understand their expressive behavior. In front of a crying person, I can make a machine understand and interpret the signs and follow a statistical model that says: “In this case, you have to be empathetic.”

Will robots become the pets of the future?

You are interested in your pet because it is unpredictable. Yet even if you put a little unpredictability into the machine, it will be easy to detect. All these machines, by dint of being so precise and caring, are likely to become terribly boring in the long run.

Why do we project so much onto robots?

This was theorized by researchers Byron Reeves and Clifford Nass of Stanford. When a machine speaks to us, whether it is a computer or a phone, we project human capabilities onto it. This is our natural way of functioning. It meets the need to never be alone.
