Aakash Gupta
Designing With AI With Designers of Figma & Codex
At a glance
WHAT IT’S REALLY ABOUT
How Figma and Codex enable fluid AI-driven design workflows
- The guests argue “code vs canvas” is a false dichotomy and that modern teams should fluidly move between exploratory canvas work and high-fidelity coded prototypes based on the job to be done.
- Ed demonstrates a bidirectional workflow where Codex snapshots UI states into Figma for pixel-perfect iteration, then uses a Figma component link to push changes back into code.
- They explain current fidelity limits (e.g., shaders, complex transitions) and how reliability improves with better models, better component naming, and alignment between design tokens and code tokens.
- Both describe a rapid cultural shift inside OpenAI and Figma: more designers (and even non-design roles) are prototyping, working in staging, and shipping small PRs to production.
- They predict roles will blur in tooling and execution, but not disappear in purpose—design, engineering, and PM remain distinct lenses while becoming more “total football” cross-functional in practice.
IDEAS WORTH REMEMBERING
5 ideas
Choose code or canvas based on the problem stage, not ideology.
Use canvas for lateral exploration, collaboration, and storytelling artifacts; use code when you need real interactions, responsiveness, and production-adjacent truth. Expect to weave between both rather than follow a linear handoff.
Start higher-fidelity earlier because AI lowers the cost of “real.”
Historically teams started low-fi because high fidelity was expensive; now “low-fi” can be a functional wireframe in Codex that gets teams aligned faster and surfaces real constraints earlier.
Bidirectional interoperability reduces throwaway prototypes and handoff churn.
Ed’s workflow shows generating Figma frames from coded UI states for pixel-level refinement, then sending changes back into the codebase using a Figma link—turning design iteration into shippable work rather than documentation.
Fidelity gaps are real, but narrowing; use annotations and token alignment to mitigate.
Certain web-specific effects (shaders, advanced transitions) won’t fully translate to a static canvas, but teams can encode intent via annotations and improve consistency by mapping design tokens to CSS/code tokens.
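The token-alignment idea above can be made concrete with a small sketch: a design-token map (as a designer might export it from Figma) rendered as CSS custom properties so canvas and code share one vocabulary. The token names and values here are illustrative assumptions, not from the episode.

```python
# Minimal sketch of design-token -> CSS-token alignment.
# Token names and values are hypothetical examples.
design_tokens = {
    "color-surface-primary": "#1A1A2E",
    "color-text-default": "#EAEAEA",
    "spacing-md": "16px",
    "radius-card": "12px",
}

def tokens_to_css(tokens: dict[str, str], selector: str = ":root") -> str:
    """Render a token map as CSS custom properties, so the same names
    a designer uses on the canvas resolve to the same values in code."""
    lines = [f"{selector} {{"]
    for name, value in tokens.items():
        lines.append(f"  --{name}: {value};")
    lines.append("}")
    return "\n".join(lines)

print(tokens_to_css(design_tokens))
```

When names match one-to-one like this, an agent (or a teammate) can infer which coded style corresponds to which canvas style without guessing, which is the consistency gain the guests describe.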
Tool reliability depends heavily on “hygiene,” just like onboarding a human teammate.
Well-named components, clean libraries, and consistent token systems make agents far more effective, reducing misinference and rework. Treat the agent like a colleague who needs clear structure to perform.
WORDS WORTH SAVING
5 quotes
If developers have been accelerated, say, like 10X... designers have maybe been accelerated like 1.5 or 2X, so design can become the bottleneck.
— Ed Bayes
Finally we don't really have to choose... navigating seamlessly between both [code and canvas].
— Gui Seiz
It's like a really fun time to be a designer because your imagination really is the only upper limit.
— Ed Bayes
You're able to do stuff... and you have to think about, like, 'Okay, this is cool that I can, but then what should I?'
— Gui Seiz
Curiosity is gonna be the defining skill of people that succeed in this new era.
— Gui Seiz
High-quality AI-generated summary created from a speaker-labeled transcript.