Lex Fridman Podcast

Kate Darling: Social Robotics | Lex Fridman Podcast #98

Kate Darling is a researcher at MIT interested in social robotics, robot ethics, and generally how technology intersects with society. She explores the emotional connection between human beings and life-like machines, which for me is one of the most exciting topics in all of artificial intelligence.

SPONSORS:
- ExpressVPN: https://www.expressvpn.com/lexpod
- MasterClass: https://masterclass.com/lex

EPISODE LINKS:
Kate's Website: http://www.katedarling.org/
Kate's Twitter: https://twitter.com/grok_

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
Full episodes playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
Clips playlist: https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41

OUTLINE:
0:00 - Introduction
3:31 - Robot ethics
4:36 - Universal Basic Income
6:31 - Mistreating robots
17:17 - Robots teaching us about ourselves
20:27 - Intimate connection with robots
24:29 - Trolley problem and making difficult moral decisions
31:59 - Anthropomorphism
38:09 - Favorite robot
41:19 - Sophia
42:46 - Designing robots for human connection
47:01 - Why is it so hard to build a personal robotics company?
50:03 - Is it possible to fall in love with a robot?
56:39 - Robots displaying consciousness and mortality
58:33 - Manipulation of emotion by companies
1:04:40 - Intellectual property
1:09:23 - Lessons for robotics from parenthood
1:10:41 - Hope for future of robotics

CONNECT:
- Subscribe to this YouTube channel
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/LexFridmanPage
- Instagram: https://www.instagram.com/lexfridman
- Medium: https://medium.com/@lexfridman
- Support on Patreon: https://www.patreon.com/lexfridman

Host: Lex Fridman · Guest: Kate Darling
May 22, 2020 · 1h 12m

At a glance

WHAT IT’S REALLY ABOUT

How Our Emotional Bonds With Robots Shape Ethics, Rights, And Society

  1. Lex Fridman and MIT researcher Kate Darling discuss social robotics, focusing on how humans respond emotionally to robots and what that means ethically and legally. They explore labor and automation myths, the analogy between robots and animals rather than humans, and whether mistreating robots can erode human empathy. The conversation covers anthropomorphism as both a powerful design tool and a vector for manipulation, including issues around sex robots, military robots, and home assistants. They close by debating regulation, data privacy, intellectual property, and the near-term prospects for truly compelling social robots in the home.

IDEAS WORTH REMEMBERING

5 ideas

Robots are more like animals and tools than human replacements.

Darling argues that framing robots as one‑to‑one human replacements (for jobs or relationships) is misleading; historically, society has used and emotionally bonded with animals as workers, weapons, and companions, and we’re starting to treat robots in similar, varied ways.

Anthropomorphism is powerful, easy to trigger, and double‑edged.

People readily project life onto simple machines (e.g., naming Roombas, mourning bomb‑disposal robots), and designers can amplify this. It can enrich caregiving and companionship, but also distort how we use tools or enable emotional and commercial exploitation.

How we treat robots may reveal and train our empathy or cruelty.

Even though today’s robots are “dumber than insects,” Darling’s work suggests that willingness to hurt or protect robots may correlate with broader empathy tendencies, implying that abuse of robots could either offer a harmless outlet or reinforce cruel behavior—we don’t yet know which.

Trolley problems show our ethics are hard to encode, not solve.

Autonomous‑vehicle debates expose that human moral intuitions are inconsistent and context‑dependent. Crowd‑sourced preferences (like the Moral Machine) are informative but not a sound basis for law, because what people choose in a scenario isn’t necessarily what they’d want as a universal rule.

Social robots can fill emotional gaps without replacing human bonds.

Darling sees space for robots as companions—like pets or specialized confidants—especially for loneliness or needs humans can’t practically meet, but she resists the idea that their primary function should be as human romantic partners or “better spouses.”

WORDS WORTH SAVING

5 quotes

The robots are not even as smart as insects right now.

Kate Darling

We’ve always protected those things that we relate to the most.

Kate Darling

I don’t think empathy is a zero-sum game; it’s a muscle you can train.

Kate Darling

When you have to program ethics into machines, you realize our ethical rules are much less programmable than we thought.

Kate Darling

I think it’s so boring to think about recreating things we already have when we could create something that’s different.

Kate Darling

- Ethical issues in robotics: responsibility, weapons, privacy, labor, and social impact
- Human–robot emotional relationships and anthropomorphism
- Parallels between animal rights, domestication, and future robot rights
- Violence toward robots, empathy, and the "practice" of compassion or cruelty
- Sex robots, loneliness, and robots as companions rather than replacements
- Trolley problems, the Moral Machine, and encoding ethics into autonomous systems
- Business, design, and IP challenges for social robots and AI platforms

High quality AI-generated summary created from speaker-labeled transcript.
