Anthropic

Who let the robot dogs out?

We asked two teams of Anthropic researchers to program a robot dog. Neither team had any robotics expertise, but we let only one team use Claude. In the past, we've run simulated studies in which Claude trained a robot dog; these helped us assess how Claude might contribute to AI research and development. Project Fetch was our attempt to try something similar in practice. The project suggests that we're not far from a world where frontier AI models can interact with previously unknown pieces of hardware, even with non-experts at the helm. For more, read on: https://www.anthropic.com/research/project-fetch-robot-dog

Chapters:
0:00 Introduction: Why robotics?
0:30 The experiment
1:02 Phase 1: Fetch, manually
1:51 Phase 2: Fetch, programmatically
5:08 Phase 3: Fetch, autonomously
6:24 Results

Nov 11, 2025 · 7m · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Anthropic tests Claude’s impact on robot-dog programming and autonomy

  1. Anthropic ran a one-day, three-phase robotics challenge to quantify whether Claude accelerates non-roboticists doing a sophisticated physical-world task.
  2. In Phase 1 (manual control), both teams could drive the robot dog to retrieve a ball, with only a small time advantage for the Claude team.
  3. In Phase 2 (programmatic control), Claude provided major leverage by helping the team find libraries, install dependencies, and connect a laptop to the robot—identified as the main bottleneck.
  4. In Phase 3 (autonomous fetch), the non-Claude team made partial progress on localization and ball detection but failed to integrate a full system, while the Claude team came close to an end-to-end autonomous solution.
  5. The results are framed as a near-term shift enabling more people to work effectively with robots and a long-term indicator that tasks requiring “human + model” today may become “model-only” tomorrow in the physical world.
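The "ball detection" the teams worked on in Phase 3 typically starts with something simple: mark the pixels that look like the ball, then take their centroid as the ball's image position. A minimal pure-Python sketch of that idea (the frame representation and the colour predicate are illustrative assumptions, not the teams' actual code):

```python
def find_ball(frame, is_ball):
    """Estimate the ball's image position as the centroid of matching pixels.

    frame   -- 2D list of pixel values (rows of a camera image)
    is_ball -- predicate deciding whether a single pixel looks like the ball
    Returns (x, y) in pixel coordinates, or None if no pixel matches.
    """
    hits = [(x, y)
            for y, row in enumerate(frame)
            for x, px in enumerate(row)
            if is_ball(px)]
    if not hits:
        return None  # the ball is not in view; caller should keep searching
    cx = sum(x for x, _ in hits) / len(hits)
    cy = sum(y for _, y in hits) / len(hits)
    return (cx, cy)
```

In practice this thresholding would run on HSV camera frames via a vision library, but the centroid-of-matching-pixels structure is the same.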

IDEAS WORTH REMEMBERING

5 ideas

The biggest AI uplift came from unblocking robotics “plumbing,” not novel control theory.

Claude most noticeably accelerated the messy integration work—finding the right libraries, installing packages, and getting laptops talking to the robot—tasks that often dominate real robotics timelines.
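Much of that "getting laptops talking to the robot" work reduces to a basic question: can we open a connection to the robot's SDK endpoint at all? A small reachability check like the one below is a common first debugging step (the idea of a TCP-accessible SDK port is an assumption for illustration; the actual robot's transport is not described in the source):

```python
import socket

def robot_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to the robot's SDK port succeeds.

    A quick sanity check before blaming libraries or drivers: if this
    fails, the problem is networking (Wi-Fi, IP address, firewall),
    not the SDK code.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical robot address):
# robot_reachable("192.168.12.1", 8082)
```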

Claude reduced the penalty of having little robotics experience.

Both teams were mostly software/research engineers new to robotics, but the Claude team could progress faster by delegating unfamiliar, detail-heavy setup and SDK exploration to the model.

Autonomy is the real frontier problem the experiment gestures toward.

Phase 3 required chaining perception, localization, planning/control, and execution; even with progress on individual components, the hard part was “knitting everything together” into a reliable loop.
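The "knitting everything together" step usually takes the shape of a sense-plan-act loop: each tick reads the latest perception and pose estimates and emits one high-level command. A toy sketch of one such tick (the state variables and command names are illustrative assumptions, not the teams' implementation):

```python
def fetch_step(pose, ball_pos, gripper_has_ball, home):
    """One tick of a toy sense-plan-act loop for autonomous fetch.

    pose             -- current robot position estimate
    ball_pos         -- detected ball position, or None if not in view
    gripper_has_ball -- whether the ball has been picked up
    home             -- position to return the ball to
    Returns a high-level command for the control layer to execute.
    """
    if gripper_has_ball:
        # Carrying the ball: head home, then drop it.
        return "return_home" if pose != home else "drop_ball"
    if ball_pos is None:
        return "search"  # perception found nothing; keep scanning
    return "approach_and_grasp"
```

The hard part the transcript points to is not any single branch but making every input to a loop like this (detection, localization, grasp feedback) reliable at the same time.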

Time savings were modest for simple tasks and large for complex tasks.

Manual joystick-style fetch showed only a small advantage, but as requirements shifted to programming and autonomy, the Claude team pulled ahead by hours, indicating benefits scale with complexity.

Hardware access and safe operation remain practical constraints.

The transcript shows near-collisions and an incident of the robot hitting someone, underscoring that faster iteration also raises the importance of safety checks, limits, and testing discipline.

WORDS WORTH SAVING

5 quotes

Robotics is sort of the clear entry point to how you have a mostly software system start having the ability to reach out into the real world.

Unknown

I've never really understood how reliant I am on Claude doing the menial work, finding all the nitty-gritty details I don't want to have to figure out.

Unknown

Probably the area where we saw the most uplift from Claude was just in the task of connecting to the robot.

Unknown

What today requires the combination of a person and an AI model, tomorrow is likely to just require the AI model.

Unknown

The effects of AI are not just going to be in software. They are going to be in hardware and in the physical world as well.

Unknown

Topics: Project Fetch experiment design · Robot dog fetch task progression · Manual vs. programmatic vs. autonomous control · Tooling friction: hardware–software connectivity · ROS 2 SDK, libraries, dependency installation · Perception and autonomy pipeline (detect, navigate, retrieve) · Implications for AI's move from software into robotics

