Y Combinator

Apple Vision Pro: Startup Platform Of The Future?

In this episode of the Lightcone Podcast, YC Group Partners discuss the launch of the Apple Vision Pro and its potential as a platform for new startups. This is a deep dive into the technical innovations Apple has made for this product, how its launch compares to that of the iPhone, and advice for founders interested in building in this space.

Chapters (Powered by https://bit.ly/chapterme-yc)

00:00 - Coming Up
01:26 - Diana's AR/VR Startup
02:50 - Why AR?
05:14 - The Eye's Ability to Focus
07:41 - Hardware vs. Software
09:24 - VR/AR and Self-Driving Cars
11:07 - Focus on Productivity
13:25 - Eye Tracking
15:26 - Meta SDK vs. Apple Vision Pro SDK
17:33 - Is Vision Pro an iPhone Moment?
20:19 - Pre-mortem Exercise
23:05 - Consumer Social Networks
24:12 - Should Founders Build on VR/AR Tech?
27:09 - YC Applications: VR
27:36 - Outro

Jared Friedman (host) · Diana Hu (guest) · Garry Tan (host) · Harj Taggar (host)
Feb 20, 2024 · 27m · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Apple Vision Pro: High-End AR Platform Poised For Startup Breakthroughs

  1. The episode explores Apple Vision Pro as a potential foundational platform for the next generation of startups, drawing parallels to the early iPhone days and previous AR/VR attempts. The group dissects why Apple's pass-through, camera-based AR approach is more tractable than optical AR and how it leverages a decade of investment in iPhone chips, sensors, and computer vision.
  2. They contrast Apple’s productivity-focused positioning with Meta’s gaming-centric strategy, arguing Vision Pro’s strength lies in high-information, professional use cases and replacing traditional screens. The conversation also highlights new UX primitives around eye tracking and spatial interfaces, suggesting major interaction breakthroughs are still to come from third‑party developers.
  3. From an investor and YC perspective, they debate whether this is an “iPhone moment” or “Newton moment,” emphasizing adoption curves, platform risk, and when it actually makes sense for founders to bet on Vision Pro. Ultimately, they advise backing founders who are deeply, irrationally committed to spatial computing and willing to build through the early, awkward phase of the platform.

IDEAS WORTH REMEMBERING

5 ideas

Pass-through AR lowers core technical barriers compared to optical AR.

By rendering the real world via high-resolution video (instead of complex light-field optics), Vision Pro turns many physics problems into software and compute problems, where Apple already has strong advantages in chips, sensors, and displays.

Vision Pro’s real power lies in understanding and augmenting the real world.

With LiDAR, multiple cameras, and eye tracking feeding a dedicated R1 coprocessor, the headset performs SLAM-style localization similar to self-driving cars, enabling precise spatial awareness that future apps can exploit.

Eye tracking is a new primary input primitive, analogous to multi-touch on iPhone.

Vision Pro uses foveated rendering and gaze-based selection, and Apple has codified early best practices in its Human Interface Guidelines—opening space for founders to invent entirely new interaction patterns beyond simple pinching and window dragging.
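As a rough illustration of how gaze-plus-pinch selection surfaces in Apple's developer stack, here is a minimal SwiftUI sketch for visionOS (the view name and counter are hypothetical, added for demonstration; `hoverEffect` and `glassBackgroundEffect` are standard visionOS modifiers):

```swift
import SwiftUI

// Hypothetical minimal visionOS view: gaze highlights the button via the
// system hover effect, and an indirect pinch gesture fires its action.
struct GazeSelectView: View {
    @State private var count = 0  // hypothetical counter for demonstration

    var body: some View {
        Button("Selected \(count) times") {
            count += 1  // triggered by look-and-pinch; no pointer needed
        }
        .hoverEffect(.highlight)  // gaze-driven feedback on visionOS
        .padding()
        .glassBackgroundEffect()  // standard spatial window material
    }
}
```

Notably, visionOS withholds raw gaze data from apps and renders the hover highlight outside the app's process for privacy, so third-party interaction patterns must be composed from system primitives like these rather than from direct eye-position streams.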

Initial killer apps are likely in high-value, high-information professional niches.

Use cases like trading desks, CAD, engineering, and complex monitoring—where people already use many screens—are more likely to pay for infinite spatial displays and dense visualization than casual consumers in the near term.

Major platform shifts usually take 5+ years to produce iconic startups.

Drawing on the iPhone timeline (Instacart, DoorDash, Uber arriving years after launch), the hosts argue we’re early in the Vision Pro cycle; mass adoption and truly native applications will lag the hardware by several years.

WORDS WORTH SAVING

5 quotes

The dream has always been to get to something like this… so the developers would write the code once and it would work across all devices.

Diana

You need to understand the real world in order to augment it… it’s starting to sound a lot like actually a technology of a self-driving car, but on a headset.

Diana

With the Vision Pro, they invested so much on eye tracking… I think it is the moment that we’re seeing with capacitive touch where Apple got it right for the iPhone.

Diana

If done well, this is going after the market cap of all screens that get sold.

Diana

We would never try and discourage founders from building stuff they just think is cool.

YC Partner (Garry/Jared paraphrased in discussion)

Technical architecture of Apple Vision Pro (pass-through AR, sensors, custom silicon)
Comparison between optical AR (HoloLens, Magic Leap) and video pass-through approaches
Productivity and high-information workflows versus gaming as core use cases
New interaction paradigms: eye tracking, spatial UX, and human interface guidelines
Historical analogies: iPhone adoption, app ecosystem evolution, and platform shifts
Startup and investor strategy: timing, platform risk, and founder mindset
Differences between Meta's SDK/game-engine DNA and Apple's visionOS developer stack

High quality AI-generated summary created from speaker-labeled transcript.
