The secret to better AI prototypes: Why Tinder's CPO starts with JSON, not design | Ravi Mehta

How I AI · Sep 29, 2025 · 54m

Claire Vo (host), Ravi Mehta (guest)

Topics: Vibe prototyping vs. structured prototyping · Spec-driven vs. data-driven prototyping · JSON-first workflow and schema thinking · MCP servers for tool access (Unsplash MCP) · Reducing hallucinated media and broken links · Iterating prototypes by editing/swapping datasets · Midjourney prompting: subject–setting–style · Using film stock and camera metadata for image quality · Taste vs. craft in the AI era · Consumer AI: delight, personalization, psychology

In this episode of How I AI, host Claire Vo talks with guest Ravi Mehta about building better AI prototypes by starting with JSON data, not design.

Better AI prototypes by starting with JSON data, not design

The conversation contrasts common AI prototyping habits—writing one big prompt or uploading designs—with a “data-driven prototyping” approach that begins by generating a realistic, schema-shaped JSON dataset for the feature.

Ravi shows how separating concerns (data generation vs. UI/code generation) yields prototypes that look more authentic, break less (fewer hallucinated URLs), and are easier to iterate by swapping datasets rather than rewriting prompts.
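The "schema-shaped JSON dataset" this approach starts from might look like the following sketch. The field names and nesting here are illustrative assumptions; the episode demos an itinerary for Paris but does not publish an exact schema.

```python
import json

# Hypothetical itinerary dataset in the spirit of the episode's Paris demo.
# Field names and structure are illustrative, not the episode's exact schema.
itinerary = {
    "trip": {
        "title": "Long Weekend in Paris",
        "destination": "Paris, France",
        "travelers": ["Alex", "Sam"],
        "cover_image_url": "https://images.unsplash.com/photo-paris-example",
    },
    "days": [
        {
            "day": 1,
            "activities": [
                {
                    "time": "09:00",
                    "name": "Musée d'Orsay",
                    "category": "museum",
                    "image_url": "https://images.unsplash.com/photo-orsay-example",
                }
            ],
        }
    ],
}

# Serializing the dataset gives the prototyping tool unambiguous,
# structured input instead of a free-form UX description.
print(json.dumps(itinerary, indent=2))
```

Because the media URLs come from a real source (the episode uses an Unsplash MCP server), the prototype renders actual images instead of hallucinated links.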

They demo using Claude to generate itinerary JSON enriched with real Unsplash images via an MCP server, then pasting that JSON into Reforge Build to generate a cleaner, more accurate trip-planning prototype.

In the second half, Ravi introduces a subject–setting–style prompting framework for Midjourney, emphasizing photographic vocabulary (film stocks, camera/lens metadata, lighting via setting) to consistently achieve more “usable,” less uncanny images.

Key Takeaways

Separate UI generation from data generation to raise prototype quality.

When the prototyping tool must design UX, invent data, fetch media, and write code at once, outputs tend to be "average across tasks." ...

Start prototypes with a realistic data model, not just UX descriptions.

Ravi argues engineering naturally begins by defining schemas that remove ambiguity; applying the same discipline to prototyping produces more functional and flexible prototypes—especially for established products with real constraints.
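Applied in code, "define the schema first" might look like a lightweight type definition the dataset must conform to. This is a sketch under assumed field names; the episode does not prescribe a particular typing approach.

```python
from dataclasses import dataclass, field

# Hypothetical schema for the trip-itinerary feature; names are illustrative.
@dataclass
class Activity:
    time: str        # e.g. "09:00"
    name: str        # e.g. "Musée d'Orsay"
    category: str    # e.g. "museum", "restaurant"
    image_url: str   # a real media URL, not a hallucinated one

@dataclass
class Day:
    day: int
    activities: list[Activity] = field(default_factory=list)

@dataclass
class Itinerary:
    destination: str
    travelers: list[str]
    days: list[Day] = field(default_factory=list)

# The schema removes ambiguity: a prototype wired to Itinerary cannot
# silently drift in field names the way free-form prompt output can.
trip = Itinerary(destination="Paris, France", travelers=["Alex", "Sam"])
trip.days.append(Day(day=1, activities=[
    Activity("09:00", "Musée d'Orsay", "museum",
             "https://images.unsplash.com/photo-orsay-example"),
]))
```

The same discipline engineering applies to production front ends (agree on the schema, then build against it) carries over to prototyping.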

Use JSON as the contract that makes iteration fast and safe.

Once the prototype is wired to a sample data file, you can rename a traveler, replace a cover image URL, or regenerate an entire destination (Paris → Thailand) by swapping JSON—without reworking the UX prompt or code structure.
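That contract can be sketched as follows (field names assumed): the rendering code reads only from the dataset, so swapping the JSON retargets the whole prototype without touching the UI logic.

```python
def render_header(data: dict) -> str:
    """UI code depends only on the JSON contract, not on any one dataset."""
    trip = data["trip"]
    return f"{trip['title']} - {', '.join(trip['travelers'])}"

# Two interchangeable datasets that satisfy the same contract.
paris = {"trip": {"title": "Paris Getaway", "travelers": ["Alex", "Sam"]}}
thailand = {"trip": {"title": "Two Weeks in Thailand", "travelers": ["Alex", "Sam"]}}

# Swapping the dataset changes the prototype; render_header is untouched.
print(render_header(paris))     # Paris Getaway - Alex, Sam
print(render_header(thailand))  # Two Weeks in Thailand - Alex, Sam
```

Renaming a traveler or replacing a cover image is likewise a one-field edit in the data file, not a prompt rewrite.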

Real media sources reduce “broken prototype” credibility gaps.

The spec-driven demo shows typical failures: hallucinated image URLs and mismatched photos. ...

Stress-test UX with production-like data (especially UGC).

Claire highlights how real-world content (odd crops, long text, messy user input) exposes edge cases that polished Figma mocks hide. ...

Agentic workflows work because context is staged, not dumped.

They connect the method to “agents”: one step/tool generates authentic structured data (and media), the next generates the app around it. ...
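One way to picture that staging is as a two-step pipeline. The episode's actual tools are Claude, an Unsplash MCP server, and Reforge Build; they are stubbed here as plain functions, so this is a shape sketch, not an implementation.

```python
# Stage 1: a data-generation step produces structured, authentic data.
def generate_dataset(destination: str) -> dict:
    # Stub standing in for "ask an LLM + media MCP server for itinerary JSON".
    return {"destination": destination, "days": [{"day": 1, "activities": []}]}

# Stage 2: a UI-generation step receives that dataset as staged context.
def generate_prototype(dataset: dict) -> str:
    # Stub standing in for "paste the JSON into a prototyping tool".
    return f"Prototype for {dataset['destination']} with {len(dataset['days'])} day(s)"

# Each stage gets exactly the context it needs, instead of one giant prompt
# that must design UX, invent data, and fetch media all at once.
dataset = generate_dataset("Paris")
print(generate_prototype(dataset))
```

Staging context this way is why the method resembles an agentic workflow: each tool does one job well, and its output becomes the next tool's input.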

For Midjourney, prompt like a photographer: subject–setting–style.

Instead of generic descriptors, specify the subject, define setting including lighting via scenario (e. ...
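The framework can be read as a simple template. The exact token order and vocabulary below are illustrative assumptions, drawn from the photographic terms the episode mentions (film stocks, lens metadata, lighting defined through the setting).

```python
def midjourney_prompt(subject: str, setting: str, style: str) -> str:
    """Compose a subject-setting-style prompt, per the episode's framework."""
    return f"{subject}, {setting}, {style}"

# Lighting is specified through the scenario (golden hour, window light)
# rather than abstract adjectives; style leans on photographic vocabulary.
prompt = midjourney_prompt(
    subject="portrait of a chef plating a dessert",
    setting="small Paris bistro kitchen at golden hour, warm window light",
    style="shot on Kodak Tri-X, 50mm prime lens, shallow depth of field",
)
print(prompt)
```

The three slots keep prompts consistent across a set of images, which is what makes the results usable in a prototype rather than one-off curiosities.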

Camera/film metadata is a reliable ‘quality lever’ for images.

Referencing items like “Kodak Tri-X” or “Leica 50mm f/1. ...

Notable Quotes

Design systems and UX descriptions are not the foundation of great prototyping. In fact, JSON and data models should be.

Claire Vo (intro framing of Ravi’s thesis)

One of the first things that [engineering] do is they say, 'Here's the data schema that's actually gonna drive the front end.'

Ravi Mehta

When you provide data in this way, the AI doesn't get fuzzy with it. Actually, we'll just take the data and use it as is.

Ravi Mehta

If we cut all our nice to haves, our product is not gonna be nice to have.

Ravi Mehta

The two fundamental inputs into creating something are taste and craft.

Ravi Mehta

Questions Answered in This Episode

In your workflow, what’s the minimum viable JSON schema you’d generate first (fields and nesting) before you ever ask a tool like Reforge Build to create UI?

How do you decide when to use real production data vs. synthetic data—especially with privacy/PII constraints—and what’s your recommended ‘safe proxy’ approach?

What failure modes have you seen when teams paste large JSON blobs into prototyping tools (context limits, incorrect parsing, UI overfitting), and how do you mitigate them?

Can you show an example where editing the data model (adding a new entity/relationship) cleanly cascades into the UI—versus breaking the prototype—and what patterns enable that?

For media: beyond Unsplash, which other MCP-style tools/services are most valuable for prototyping (e.g., maps, reviews, translation), and how do you compose them?

Transcript Preview

Claire Vo

PMs and designers are prompting prototyping systems that they don't quite understand how to get the best outcomes from. I'm always impressed that a prototype gets generated, but sometimes it's just, like, not quite what I need for the product I'm building or the experience I'm trying to craft. And so I know you have come up with a system called data-driven prototyping, which you're gonna show us.

Ravi Mehta

The thing that we can do is we can help the LLM by starting to separate out the idea of not just generating the UI, but also by helping it with the data. So I've got a prompt here. It says, "Using JSON," because we want it to be structured data, "generate a sample itinerary that I can use to prototype a shared trip itinerary feature. The destination is Paris."

Claire Vo

I just think about the human parallel to this, which is searching through stock photos, trying to find which one is representative. It just takes so much time, and because an MCP now can, like, programmatically go through the tasks to be done using these external tools, it just makes it a lot faster to get higher quality media into your prototypes.

Ravi Mehta

So this is the finished prototype based on that prompt. We can see it generated 22 different files. It's a really nice componentization. It's got a little bit of sample data in there, and it generated mock data, so we can see what day one looks like. We've got some photos in there. We can see what day two looks like.

Claire Vo

[chuckles] This will be you teaching me how to actually bring some data and structure to my vibe designing and prototyping. This is genius. I'm really excited. [upbeat music] Welcome back to How I AI. I'm Claire Vo, product leader and AI obsessive, here on a mission to help you build better with these new tools. Today, I am giving you elite prompting strategies from Ravi Mehta, who is CPO at Tinder and a product leader at places like Facebook and TripAdvisor. Ravi's gonna show us how design systems and UX descriptions are not the foundation of great prototyping. In fact, JSON and data models should be. He'll also walk us through how to use structured prompting in Midjourney to get high-quality photos and images for your prototypes. Let's get to it.

Speaker

This podcast is supported by Google. Hey, everyone, Shresta here from Google DeepMind. The Gemini 2.5 family of models is now generally available. 2.5 Pro, our most advanced model, is great for reasoning over complex tasks. 2.5 Flash finds the sweet spot between performance and price, and 2.5 Flash Lite is ideal for low-latency, high-volume tasks. Start building in Google AI Studio at ai.dev.

Claire Vo

Hey, Ravi, thanks for coming on How I AI. I'm excited to see some of these workflows that are gonna be really useful for me.

Ravi Mehta

Thanks so much for having me. Uh, I'm excited to go through it, too. I've been having a ton of fun playing with these things.
