
Humanoids Cost as Much as an SUV Now | Nikhil Kamath x Brett Adcock | WTF Online Ep 2
Nikhil Kamath (host), Brett Adcock (guest)
Humanoid robots near consumer costs: data, safety, and autonomy race
Brett Adcock describes his path from software (Vettery) to electric aviation (Archer) and now humanoid robotics at Figure, arguing humanoids are the “ultimate general-purpose machine” because the world is built for the human form.
He breaks down Figure’s robot stack—electric actuators, torso battery + CPU/GPU compute, multi-camera vision, force/torque sensing—and explains why deep learning is essential given the enormous state/action space of a 40-joint body.
A core theme is data: Figure trains on real human demonstrations and fleet learning, believes real-world physical interaction is a larger untapped dataset than the internet, and is building a large-scale “robot pretraining set” (Project Go Big).
They discuss near-term commercialization (BMW pilot) vs. home use, safety around kids, competing approaches in the industry, China’s perceived lead, Figure’s decision to end the OpenAI partnership, and how ubiquitous humanoids could collapse costs of goods/services and reshape jobs and purpose.
Key Takeaways
Humanoids are a full-stack problem—hardware and AI must be co-designed.
Adcock argues you can’t win with “hardware-only” bots or “software-only” autonomy; general-purpose performance requires tight integration of actuators, sensing, embedded compute, and learning systems that work end-to-end on the robot.
Deep learning unlocked humanoids because the state space is un-codeable.
With ~40 joints and continuous motion, the number of possible configurations explodes (he frames it as more states than atoms in the universe), making hand-coded approaches infeasible and pushing control toward neural networks.
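That framing can be checked with a back-of-the-envelope calculation. The 40-joint count is from the episode; the discretization granularity below is an assumption chosen only for illustration.

```python
# Back-of-the-envelope check of the "more states than atoms" framing.
# 40 joints is from the episode; 100 positions per joint is an assumed,
# fairly coarse discretization of each continuous joint range.
JOINTS = 40
POSITIONS_PER_JOINT = 100

configurations = POSITIONS_PER_JOINT ** JOINTS
print(f"{configurations:.1e}")  # 1.0e+80
```

Even at this coarse resolution the count lands around 10^80, on the order of common estimates for the number of atoms in the observable universe, which is why enumerating configurations in hand-written code is infeasible.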
High-frequency control forces significant onboard compute today.
Figure runs control and many neural nets on-robot because stable motion/manipulation needs ~200 Hz closed-loop updates; cloud links are too slow for that layer, though higher-level planning can increasingly move offboard.
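A quick latency budget shows why the low-level control layer must stay onboard. The 200 Hz figure is from the episode; the round-trip number below is an assumed, typical WAN latency, not a measured one.

```python
# Why a ~200 Hz closed loop cannot run over a cloud link.
CONTROL_HZ = 200
cycle_budget_ms = 1000 / CONTROL_HZ   # 5 ms to sense, compute, and actuate
cloud_round_trip_ms = 60.0            # assumed typical WAN round trip

# Control cycles that elapse while a single cloud reply is in flight.
stale_cycles = cloud_round_trip_ms / cycle_budget_ms
print(cycle_budget_ms, stale_cycles)  # 5.0 12.0
```

One round trip would leave a dozen cycles acting on stale commands, which is why only slower, higher-level planning can move offboard.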
Vision is camera-first, similar to Tesla-style perception rather than lidar-heavy stacks.
Figure’s robots “reason in pixel space” using multiple cameras (including palms/hands and head), enabling egocentric perception and closed-loop manipulation without lidar, paired with joint-state inputs and action outputs.
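The described interface (camera frames and joint states in, joint actions out) can be sketched as a data structure. All field names and shapes here are assumptions based on the episode's description, not Figure's actual API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    """One control tick of inputs; field names are illustrative only."""
    camera_frames: List[bytes]    # head + palm/hand RGB frames ("pixel space")
    joint_positions: List[float]  # one reading per joint (~40 values)
    joint_torques: List[float]    # force/torque sensing

@dataclass
class Action:
    """Outputs consumed by the high-frequency controller."""
    joint_targets: List[float]    # next setpoint per joint

def policy(obs: Observation) -> Action:
    # Placeholder: a real system runs neural nets here; this just holds pose.
    return Action(joint_targets=list(obs.joint_positions))
```

The point of the sketch is the shape of the loop: egocentric pixels plus proprioception map directly to joint-space actions, with no lidar channel.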
Real-world physical data is the moat; synthetic and internet text aren’t enough for embodiment.
He claims the internet’s text tokens are effectively “tapped out,” while the physical world offers richer, larger data. ...
Transfer learning across tasks is surprisingly strong—and central to scaling.
Adcock highlights that training on one task (e. ...
Safety in homes—especially around kids—is not solved yet, even for proponents.
Despite testing a robot in his home, he would not let it roam freely with young children for “hours and weeks” yet; he frames home-grade autonomy as requiring a much higher safety track record.
Notable Quotes
“Would you leave your kids with your humanoid?”
— Nikhil Kamath
“We’re not there today. Like, I would not let my… robot roam free for hours and weeks right now with my young kids.”
— Brett Adcock
“You can’t solve this with code. You have to use advanced AI, like neural nets.”
— Brett Adcock
“The world was designed for humans… so our belief here is that the humanoid is… the ultimate general-purpose machine.”
— Brett Adcock
“I just don’t believe it.”
— Brett Adcock
Questions Answered in This Episode
Project Go Big: what exact modalities (video, depth proxies, torque/force, audio, language prompts) are you capturing, and how will you label/structure it to be “internet-scale” for robotics?
On BMW: what task is Figure 02 doing on the line, what percent is autonomous vs supervised intervention, and what’s been the top failure mode (hardware, perception, planning, grasping)?
You criticize teleoperation demos as “deceiving.” What autonomy benchmarks would you propose so the public can compare humanoid capabilities fairly across companies?
Hands are a major bet for Figure. Where is the line between “human-like dexterity” and “sufficient industrial dexterity,” and how does that affect cost and reliability?
Home-first is a reversal of your earlier view. What specific capability milestones made you flip—navigation, deformable manipulation, long-horizon planning, or safety instrumentation?
Transcript Preview
[upbeat music] Let me know when you guys can start. Ready? [upbeat music] Would you leave your kids with your humanoid?
We're not there today. Like, I would not let my, like, robot roam free for hours and weeks right now with my young kids.
Everybody talks about China is so far ahead in robotics. Why is that?
Yeah, I just don't believe it. [upbeat music]
Hi, Brett. Lovely to meet you. Uh, what's behind you? Maybe we can start there.
Yeah. Um, so I'm at, I'm at Figure offices. Uh, we design humanoid robots. So these are robots that, you know, like, kind of like move around the world like humans: legs, arms, hands. Um, behind me is our first-generation robot, second and third-generation, uh, humanoid robots here at Figure.
Okay. So tell us a bit about yourself, Brett. Where did you begin? Where were you born? How did you come to build a humanoid?
Yeah. Um, so I was born in the Midwest, here, uh, in, in the US, on a third-generation farm, actually. [chuckles] We, um, we farmed corn and soybeans. Um, and then b- basically, since, like, middle school and high school, I started working on technology businesses, in... mostly in software. Uh, and that led to basically, [clears throat] basically now twenty years building tech companies. Um, I spent about ten years in software. Uh, ended up starting a company in 2012 called Vettery, and it was an A-- it was an AI marketplace for recruiting. The goal was to basically help, uh, candidates find jobs and employers find great talent, and that, that whole process is extremely broken and hard. I ended up selling Vettery, uh, for a little over a hundred million in 2017 to the Adecco Group, which is, like, the largest recruiting company in the world. After that, I, I really wanted to get into hardware, so I started a company called Archer Aviation, uh, in 2018, and what we do is we build electric aircraft, electric vertical take-off and landing aircraft. So it's, it's kind of like a hel- it takes off kind of like a helicopter, [clears throat] so you can, like, put them in cities, and it flies a little bit like a, like an airplane, and it's full- fully electric powertrain. And the goal is to basically help move traffic that it's like, you know, in cities and move into the air, so folks can get around a lot easier. And yeah, ba- basically designed five generations of aircraft so far at Archer. Our current aircraft is Midnight. It's a piloted four-passenger aircraft that we're, we're flying now, trying to get that certified in the US airspace with the FAA. Ended up taking that company public about four years ago, five years ago, and then, yeah, three year- three years ago, I started Figure. So we, we design, we, we, we design and manufacture humanoid robots. And the goal, the goal here at Figure is to be able to do everything a human can in the physical world.