Can Robots Feel Emotions?

Dr. Suzanne Gildert is creating empathetic, human-like robots to live among us

By Breanna Mroczek

Dr. Suzanne Gildert has seen the dystopian movie Ex Machina, but she’s not afraid of a fate like that of its protagonist Caleb, who is deceived by Ava, an artificial intelligence with a human-like body. Gildert has been building robots professionally for over five years, but her latest venture has her making robots that are more human-like and empathetic than ever before.

In 2014, Gildert founded Kindred AI with the goal of making intelligent robots. She raised $50 million in financing for the company, grew it to over 60 employees, and opened offices in Toronto, Vancouver, and San Mateo. She was part of a team developing innovative artificial intelligence (AI) techniques for retail and e-commerce, helping companies sort customer orders faster. It was, by all accounts, successful, but it had become a company making industrial robots suited only for warehouses, and Gildert wanted to return to her long-term, big-picture plan: creating human-like machines, referred to as synths rather than robots, with human-like intelligence and emotions.

“I think we need to have more human intelligence in the machines in the world,” Gildert says. “I just truly believe that absolutely needs to exist. I don’t think you can make human-like intelligence without a human-like body. I think an AI learns and grows and matures based on the experiences it has in the physical world. If your AI doesn’t have a body, let alone a human-like one, it never has a chance of experiencing the world in a similar way to how humans experience it.”

Gildert amicably parted ways with Kindred AI, which still exists today, and in January 2018 co-founded the company Sanctuary with Olivia Norton and Dr. Geordie Rose. Now a team of 16, Gildert and her colleagues at Sanctuary spent their first year creating and training a human-like AI prototype: a female-presenting synth named Nadine. It wasn’t enough for their synths to simply occupy a body; Gildert and her team want their synths to look realistic and, ideally, indistinguishable from humans.

“A synth is never going to relate to other humans in the same way we do unless it looks like us. It has to be able to take part in society to understand what it means to be human-like,” Gildert says.

Now, they’re working on another prototype with a different appearance and personality, and are looking for suitable industries and companies to partner with so they can start to integrate the technology into the workforce and begin to market their product.

ROBOTS AT WORK

Gildert thinks that synths can be used in many different industries and markets, but initially Sanctuary is looking at applications in health care, education, retail, hospitality, and elder care where one-on-one interactions are important. Initially, clients would be able to have a human control the synth to teach it speech patterns and gestures, until the synth learned these patterns and could interact on its own. She gives the example of a retail store—in a situation that sees only one customer enter a store every hour or so, a human would control the synth while having it interact with a customer, but when the store was empty the synth could go into an idle mode. “You’d be able to switch between the human control and the AI control very seamlessly,” Gildert says.

For those who are afraid of robots taking their jobs, Gildert thinks that, especially in these industries, there are opportunities for synths to augment, rather than replace, the human workforce. For instance, doctors, nurses, and other health care workers are typically overworked, and Gildert thinks their skills, intelligence, and expertise can be directed to certain situations while synths perform menial, physical tasks like changing bed linens, preparing meals, and moving objects around.

“A lot of workers don’t have a lot of time to just sit there and have a really nice, calm, in-depth conversation with patients,” Gildert says. “I believe the health care system is a place where we can add a lot of value. You could have a social robot providing companionship to someone, whether they’re elderly, whether they’re convalescing in a hospital, or whether they just want to improve their mental health by having a robot to talk to. So that’s an example where you might be able to augment things.”

At the SingularityU Canada Summit, Gildert will discuss the applications of this kind of AI in the marketplace. “I think a lot of people from the business world are very interested in where and how this technology is going to end up in the market,” Gildert says. “Right now, we’re looking at things these early prototypes might excel at, given some of the reduced functionality that they have. They are not physically able to do everything a human can yet. They can do a lot of cool things involving object manipulation and gesturing, but they can’t pick up heavy loads and manipulate fine objects dexterously yet. On the AI side, they are pretty capable in terms of the kinds of things you might expect from a voice assistant like Siri or Alexa. They can answer questions, they can help you set up appointments. They can have a simple conversation, but they’re not as smart as people are yet.”

YES, THIS ROBOT CAN FEEL

Gildert wants to teach her synths empathy because Sanctuary is going after markets, like the aforementioned health care and hospitality industries, where social interaction is very important. She doesn’t think that synths can be properly accepted and integrated into society unless they are empathetic.

“A lot of industrial robots focus on very repetitive, physical tasks and there aren’t many robots that are focusing on social tasks, or that are even socially aware, and are interacting with people very closely. We want our robots to get to know people very well, understand social interactions, be very socially engaged, and be socially sensitive to other people. AIs won’t be human-like unless you focus on what it is that makes people tick. The only way an AI is going to be able to integrate into social interactions is by looking like a human, behaving like a human, and existing in a social setting in a very human-like way. Focusing on empathy just pushes us to develop a very human-like robot.”

While Sanctuary is still gathering data to inform Nadine’s AI, Gildert has noticed some semblance of a personality. “You might say she’s funny because some of the AI systems we are running on her are very entertaining. Some of the chat bot content is quite amusing, so I like to think of her as a funny, slightly sassy, engaging, social robot.”

But not all synths are created equal: Sanctuary’s next prototype will resemble an older male, and each one created after that will be distinctive. “Our product is a little different because we want to be able to mass produce these robots, but we also want to make each one customizable and unique and have its own personality, and its own look and things like that,” Gildert says. “Really, we want these things to be more like people than like products.”

PHILOSOPHICAL DISRUPTION

While these synths aren’t poised to take our jobs yet, Gildert doesn’t dismiss this fear completely. Though she sees the immediate application of her synths as supporting the existing workforce, she does think that in the distant future robots will take over many more jobs, and that this will force societies to think about the bigger picture and more philosophical ideas.

“The entire field of automation and developing technologies is driving this change,” Gildert says. “It’s not just human-like robots that are a threat. I see it as one of these forces that’s kind of hard to stop.”

“I think we need to start thinking about how people are going to have an income, and what people will do for a living in the future,” Gildert says. “Maybe we’ll break this strong tie we have in our mind between working for an income and having a purpose in life. These two things have become conflated in the last two centuries. The idea that work is the main purpose of our life and is what gives our life meaning is something that we have been brainwashed into thinking by the industrial revolution, and it doesn’t have to be that way. If we had something like universal basic income, I don’t think people would lose all meaning in life. I think they would find volunteer opportunities, start their own businesses, go after more creative pursuits, and learn new things. I think breaking the tie between what we do and where our income comes from is really important. I’m actually in favour of some automation helping with that mindset transition.”

Even so, Gildert thinks that intelligent synths and humans will be able to live and work together. The entire point of developing an empathetic robot is so that it has applications beyond automation in the workforce; Sanctuary’s synths are meant to become a “full-fledged, digital person.” “Ideally, it would be curious, it would learn, it would interact with other people, it would have its own dreams and desires,” Gildert says. “I don’t think these synths are going to just be working slaves. I think AI will start to become more and more curious, more creative. We’re teaching it to have better-structured reward functions that will give it a set of motivation and goals. I do think we are moving into an era where AI will begin to want its own things. It’s a different kind of technology than what has come online now.”

WHY, ROBOT?

Naturally, Gildert’s work raises a lot of questions around ethics, not all of which she has answers to at this stage. But, despite what sci-fi stories would have us believe, she doesn’t think that introducing robots into society necessarily means chaos. “I think sci-fi is a great tool. It shows this future that we could build, but that we don’t have to. We don’t have to build a world that’s dystopian. It’s not just robots that can make the world dystopian.” Still, one of the questions she is most frequently asked, when people see Nadine, is “What is she for?”

“We are programmed to think of any machine that we build as being for something—think about a computer, a toaster, a television, they’re all for something,” Gildert says. “So when you see a new piece of technology you immediately wonder what it’s for. But I think this is a slightly different piece of technology because it’s supposed to eventually be like a person. So then you have to ask the question, ‘what is a person for?’ And that’s one of those very deep philosophical questions that we have been trying to ask for thousands of years, and I find that kind of fun to talk about.”
