How can robots make people care about their needs?

An emotive robot can create connections even without words.

The challenge

How can we make it easier and more natural for people and robots to work together?

The outcome

Mochi is a robot created to explore how robots can communicate expressively with people through simplified facial expressions and body movements. Initial tests showed that robots can be designed to inspire empathy and cooperation, even without words.

Artificial Intelligence and real interactions

It’s not your imagination – the technology you use every day is becoming smarter and more aware of you and your needs. From home thermostats that “learn” about your daily routines to voice-activated personal assistants, “smart” devices and artificial intelligence are being woven into the fabric of our everyday lives.

The next step for robotic AI is natural interaction and emotional connection with humans. This is important because making people “feel” for a robot is essential to fostering long-lasting human-robot cohabitation, communication, and collaboration. With this problem in mind, Yeliz Karadayi, an interaction designer at Sony Creative Center, started investigating ways that robots can connect with humans on an emotional level.

“The emotional element is incredibly valuable when developing robots that interact with humans. What would make a person feel like investing the time and effort to help a robot complete a task?” asked Yeliz. “How can we foster a relationship built around empathy and compassion, rather than pure subservience?”

Meet “Mochi”

Thus Mochi was born – a squat, rounded character that looks a lot like the Japanese rice cakes it’s named after. Mochi was designed to look mammalian, but not like any specific animal, to evoke the kind of bond people form with animals. The designers also created an “independent” robot: Mochi isn’t shy and is quite vocal about what it likes and doesn’t like.

The shape of the body was modeled in Rhino, a 3D modeling program, and molded with flexible expanding foam. Inside the foam body, Mochi’s “skeleton” is a simple rig that allows its hands to wave up and down and its head to turn.
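
The article doesn’t detail how the rig was driven, but a two-degree-of-freedom setup like this is often controlled with a short motion loop. The Python sketch below is a rough illustration, not Mochi’s actual control code; set_servo_angle is a hypothetical stand-in for whatever motor interface the real rig used.

```python
import math
import time

def set_servo_angle(channel: int, angle: float) -> None:
    """Hypothetical stand-in for the rig's real motor interface."""
    print(f"servo {channel} -> {angle:.1f} deg")

HAND_SERVO = 0  # waves the hands up and down
HEAD_SERVO = 1  # turns the head left and right

def wave_hands(duration_s: float = 3.0, hz: float = 1.0) -> None:
    """Oscillate the hand servo around its 90-degree midpoint."""
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        t = time.monotonic() - start
        angle = 90 + 30 * math.sin(2 * math.pi * hz * t)
        set_servo_angle(HAND_SERVO, angle)
        time.sleep(0.02)  # ~50 position updates per second

def turn_head(angle: float) -> None:
    """Point the head servo at an absolute angle, clamped to 0-180."""
    set_servo_angle(HEAD_SERVO, max(0.0, min(180.0, angle)))

if __name__ == "__main__":
    turn_head(120)  # look toward the person
    wave_hands()    # greet them
```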

Using Kinect, a motion-sensing device originally designed for Microsoft’s Xbox gaming console, together with Unreal Engine, a game development platform, and a Sony Pico Projector to project simple facial expressions onto Mochi’s face, Yeliz set the stage for a simple experiment.
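
The article doesn’t describe the software connecting these pieces. One plausible shape for it, sketched below in plain Python rather than Unreal Engine, is a small loop that maps what the depth sensor perceives to an expression for the projector. Both read_nearest_person_distance and project_expression are hypothetical stand-ins for the Kinect input and projector output.

```python
import time
from typing import Optional

def read_nearest_person_distance() -> Optional[float]:
    """Hypothetical stand-in for the Kinect depth input.
    Returns the distance in meters to the nearest person, or None."""
    return 1.2  # fixed value so the sketch runs without hardware

def project_expression(name: str) -> None:
    """Hypothetical stand-in for rendering a face via the projector."""
    print(f"projecting expression: {name}")

def choose_expression(distance_m: Optional[float]) -> str:
    """Map sensed proximity to one of a few simple expressions."""
    if distance_m is None:
        return "idle"       # nobody around: neutral resting face
    if distance_m < 0.5:
        return "surprised"  # someone very close
    if distance_m < 2.0:
        return "happy"      # someone within interaction range
    return "curious"        # someone visible but far away

if __name__ == "__main__":
    for _ in range(3):
        project_expression(choose_expression(read_nearest_person_distance()))
        time.sleep(0.5)
```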

People were asked to interact with Mochi without receiving any instructions. The goal was to determine whether they could complete a set of simple tasks by reacting to Mochi’s facial expressions and gestures.

Mochi makes a connection

While the test was simple, the results were promising. We learned that people were able to connect and empathize with Mochi, which created a desire to help it and to understand its gestural and physical language. Mochi’s one-of-a-kind personality further stimulated curiosity and engagement.

“People found Mochi relatable. They gave Mochi agency, and so they wanted to help it,” Yeliz observed. “When people can relate to robots, trust grows, allowing them to feel more comfortable about bringing robots into their lives.”

From one-way to two-way communication

Humans’ relationships with robots are evolving from the one-way communication that exists now, where people simply input information into robots, to two-way conversations in which people naturally understand, respect, and respond to robots’ requests.

“In the future, robots in the city, office, and home will not exist purely to fulfill tasks, but to also be emotionally available, reliable, and comforting to people,” said Yeliz. “Our experiment with Mochi opens up a lot of opportunities for us to fit this concept of ‘character/personality design’ and expressiveness into any kind of robot, to foster the kind of relationship that people want to have with their mechanical helpers.”