Sunday, April 11, 2021

Be wary of robot emotions; "Simulated Love Is Never Love" – Cranbrook Daily Townsman

When a robot "dies," does it make you sad? For many people, the answer is "yes" – and that tells us something important, and potentially worrying, about our emotional responses to the social machines that are starting to move into our lives.

For Crystal White, a 42-year-old marketing and customer service director in Bedford, Texas, that moment came a few months ago with the cute, friendly Jibo robot in her office. After more than two years in her house, the foot-tall humanoid and its inviting, round-screen "face" had started to grate on her. Sure, it danced and played fun word games with her kids, but it also sometimes interrupted her during conference calls.

White and her husband, Peter, had already started talking about moving Jibo into the empty guest bedroom. Then they heard about the "death sentence" Jibo's maker had levied on the product as its business collapsed. The news arrived via Jibo itself, which announced that its servers would be shutting down, effectively lobotomizing it.

"My heart broke," she said. "It was like an annoying dog that you don't really like because it's your husband's dog. But then you realize you actually loved it all along."

The Whites are far from the first to experience this feeling. People took to social media this year to say teary goodbyes to the Opportunity rover when NASA lost contact with the 15-year-old Mars robot. A few years ago, scads of concerned commenters weighed in on a demonstration video from robotics company Boston Dynamics in which employees kicked a dog-like robot to prove its stability.

Smart robots like Jibo obviously aren't alive, but that doesn't stop us from acting as though they are. Studies have shown that people tend to project human traits onto robots, especially when the machines move or act in even vaguely human ways.

Designers acknowledge that such traits can be powerful tools for both connection and manipulation. That could be an especially acute issue as robots move into our homes – particularly if, like so many other home devices, they also become conduits for data collected on their owners.

"When we interact with another human, a dog or a machine, how we treat it is influenced by what kind of mind we think it has," said Jonathan Gratch, a professor at the University of Southern California who studies virtual human interactions. "When you feel something has emotions, it now merits protection from harm."

The way robots are designed can influence people's tendency to project narratives and feelings onto mechanical objects, said Julie Carpenter, a researcher who studies people's interactions with new technologies – especially if a robot has something resembling a face, has a body that resembles those of humans or animals, or just seems self-directed, like a Roomba robot vacuum.

"Even if you know a robot has very little autonomy, when something moves in your space and seems to have a sense of purpose, we associate that with it having an inner awareness or goals," she said.

Such design decisions are also practical, she said. Our homes are built for humans and pets, so robots that look and move like humans or pets will fit in more easily.

Some researchers, however, worry that designers are underestimating the dangers of attachment to increasingly lifelike robots.

Longtime AI researcher and MIT professor Sherry Turkle, for instance, is concerned that design cues can trick us into thinking some robots are expressing emotion back toward us. Some AI systems already present themselves as socially and emotionally aware, but those reactions are often scripted, making the machine seem "smarter" than it actually is.

"The performance of empathy is not empathy," she said. "Simulated thinking might be thinking, but simulated feeling is never feeling. Simulated love is never love."

Designers at robot startups insist that humanizing elements are critical as the use of robots spreads. "There's a need to appease the public, to show that you are not disrupting the public culture," said Gadi Amit, president of NewDealDesign in San Francisco.

His agency recently worked on designing a new delivery robot for Postmates – a four-wheeled object with a cute, if abstract, face; rounded edges; and lights that indicate which way it is going to turn.

It will take time for humans and robots to establish a common language as they move through the world together, Amit said. But he expects it to happen within the next few decades.

But what about robots that work with kids? In 2016, Dallas-based startup RoboKind introduced a robot called Milo designed specifically to help teach social behaviors to children with autism. The mechanism, which resembles a young boy, is now in about 400 schools and has worked with thousands of children.

It is meant to connect emotionally with kids on a certain level, but RoboKind co-founder Richard Margolin says the company is sensitive to the concern that children could become too attached to the robot, which features human-like speech and facial expressions.

So RoboKind suggests limits in its curriculum, both to keep Milo interesting and to make sure children can transfer the skills they learn to real life. Children are advised to meet with Milo only three to five times a week, for 30 minutes each time.

By Rachel Lerman, Associated Press

