
Kate Darling: Social Robotics | Lex Fridman Podcast #98
Lex Fridman (host), Kate Darling (guest)
How Our Emotional Bonds With Robots Shape Ethics, Rights, And Society
Lex Fridman and MIT researcher Kate Darling discuss social robotics, focusing on how humans emotionally respond to robots and what that means ethically and legally. They explore labor and automation myths, the analogy between robots and animals rather than humans, and whether mistreating robots can affect human empathy. The conversation covers anthropomorphism as both a powerful design tool and a risk for manipulation, including issues around sex robots, military robots, and home assistants. They close by debating regulation, data privacy, IP, and the near‑term prospects for truly compelling social robots in the home.
Key Takeaways
Robots are more like animals and tools than human replacements.
Darling argues that framing robots as one‑to‑one human replacements (for jobs or relationships) is misleading; historically, society has used and emotionally bonded with animals as workers, weapons, and companions, and we’re starting to treat robots in similar, varied ways.
Anthropomorphism is powerful, easy to trigger, and double‑edged.
People readily project life onto simple machines (e. ...
How we treat robots may reveal and train our empathy or cruelty.
Even though today’s robots are “dumber than insects,” Darling’s work suggests that willingness to hurt or protect robots may correlate with broader empathy tendencies, implying that abuse of robots could either offer a harmless outlet or reinforce cruel behavior—we don’t yet know which.
Trolley problems show that our ethics are hard to encode, not easy to solve.
Autonomous‑vehicle debates expose that human moral intuitions are inconsistent and context‑dependent. ...
Social robots can fill emotional gaps without replacing human bonds.
Darling sees space for robots as companions—like pets or specialized confidants—especially for loneliness or needs humans can’t practically meet, but she resists the idea that their primary function should be as human romantic partners or “better spouses.”
Current social robots fail less on emotion than on usefulness and expectations.
Companies like Jibo and Anki showed that people can bond deeply with social robots, but they struggled to offer a compelling, everyday “killer app” that justified their cost, especially against inflated sci‑fi expectations of what robots should already be able to do.
Data‑driven AI and IP regimes are misaligned with societal needs.
Modern AI depends on massive data collection whose harms are diffuse and societal (e. ...
Notable Quotes
“The robots are not even as smart as insects right now.”
— Kate Darling
“We’ve always protected those things that we relate to the most.”
— Kate Darling
“I don’t think empathy is a zero-sum game; it’s a muscle you can train.”
— Kate Darling
“When you have to program ethics into machines, you realize our ethical rules are much less programmable than we thought.”
— Kate Darling
“I think it’s so boring to think about recreating things we already have when we could create something that’s different.”
— Kate Darling
Questions Answered in This Episode
If our empathy is easily extended to machines, what safeguards should we build to prevent that empathy from being commercially or politically exploited?
Should society grant robots any legal protections based on how humans feel about them, even if the robots themselves lack consciousness or sentience?
How should designers balance making robots emotionally compelling with ensuring users still treat them as tools where safety and clear judgment are crucial (e.g., in the military)?
What kinds of data or experiments would actually tell us whether abusing robots decreases empathy toward humans and animals—or simply provides a harmless outlet?
If we accept that ethics can’t be fully encoded into rules, what is a realistic, responsible approach for governing autonomous systems like self-driving cars?
Transcript Preview
The following is a conversation with Kate Darling, a researcher at MIT, interested in social robotics, robot ethics, and generally how technology intersects with society. She explores the emotional connection between human beings and life-like machines, which for me, is one of the most exciting topics in all of artificial intelligence. As she writes in her bio, she's a caretaker of several domestic robots, including her Pleo dinosaur robots named Yochai, Peter, and Mr. Spaghetti. She's one of the funniest and brightest minds I've ever had the fortune to talk to. This conversation was recorded recently, but before the outbreak of the pandemic. For everyone feeling the burden of this crisis, I'm sending love your way.

This is the Artificial Intelligence podcast. If you enjoy it, subscribe on YouTube, review it with five stars on Apple Podcasts, support it on Patreon, or simply connect with me on Twitter @lexfridman, spelled F-R-I-D-M-A-N. As usual, I'll do a few minutes of ads now and never any ads in the middle that can break the flow of the conversation. I hope that works for you and doesn't hurt the listening experience. Quick summary of the ads. Two sponsors: Masterclass and ExpressVPN. Please consider supporting the podcast by signing up to Masterclass at masterclass.com/lex and getting ExpressVPN at expressvpn.com/lexpod.

This show is sponsored by Masterclass. Sign up at masterclass.com/lex to get a discount and to support this podcast. When I first heard about Masterclass, I thought it was too good to be true. For $180 a year, you get an all-access pass to watch courses from, to list some of my favorites, Chris Hadfield on space exploration; Neil deGrasse Tyson on scientific thinking and communication; Will Wright, creator of SimCity and Sims, love those games, on game design; Carlos Santana on guitar; Garry Kasparov on chess; Daniel Negreanu on poker; and many more. Chris Hadfield explaining how rockets work and the experience of being launched into space alone is worth the money. By the way, you can watch it on basically any device. Once again, sign up on masterclass.com/lex to get a discount and to support this podcast.

This show is sponsored by ExpressVPN. Get it at expressvpn.com/lexpod to get a discount and to support this podcast. I've been using ExpressVPN for many years. I love it. It's easy to use. Press the big power-on button, and your privacy's protected. And, if you like, you can make it look like your location's anywhere else in the world. I might be in Boston now, but I can make it look like I'm in New York, London, Paris, or anywhere else. This has a large number of obvious benefits. Certainly, it allows you to access international versions of streaming websites like the Japanese Netflix or the UK Hulu. ExpressVPN works on any device you can imagine. I use it on Linux, shout-out to Ubuntu 20.04, Windows, Android. But it's available everywhere else too. Once again, get it at expressvpn.com/lexpod to get a discount and to support this podcast.

And now, here's my conversation with Kate Darling. You co-taught robot ethics at Harvard. What are some ethical issues that arise in the world with robots?