Aakash Gupta

Give me 60 minutes, I'll make your AI Designing 81% Better

Xinran Ma breaks down the complete AI design workflow: a mind map of AI design, live demos in Google AI Studio and Lovable, plus the exact tools top designers use.

Full Writeup: https://www.news.aakashg.com/p/xinran-ma-podcast
Transcript: https://www.aakashg.com/designing-ai-products-the-right-way-google-stitch-custom-gpts-and-prototyping-workflows-with-xinran/

---

Timestamps:

0:00 - Intro
3:54 - What Is Designing with AI?
8:02 - The AI Design Mind Map
12:29 - Ads
14:00 - Custom GPT to PRD Workflow
26:10 - Building Custom GPT Live Demo
30:02 - Ads
32:25 - Generating PRD for Prototyping
36:07 - Comparing Lovable vs V0 vs Bolt
41:23 - Stitch to AI Studio Workflow
43:31 - Google Stitch Live Demo
49:53 - YOLO Mode for Divergent Solutions
56:06 - Advanced Google AI Studio Tips
57:32 - How Cursor Stacks Up
59:37 - Final Takeaways
1:01:00 - Outro

---

🏆 Thanks to our sponsors:

1. NayaOne: The fastest way to test AI and fintech solutions - https://nayaone.com/
2. Pendo: The #1 software experience management platform - http://www.pendo.io/aakash
3. Maven: Get 15% off Xinran's course with my link - https://bit.ly/3Y2FUZn
4. Bolt: Ship AI-powered products 10x faster - https://bolt.new/solutions/product-manager?utm_source=Promoted&utm_medium=email&utm_campaign=aakash-product-growth
5. Gamma: Turn customer feedback into product decisions with AI - https://gamma.app/?utm_campaign=prompt&utm_content=Aakash+Gupta&utm_source=LinkedIn

---

Key Takeaways:

1. AI design covers five areas, not just prompts - Prompting, ideation, design/prototyping, workflows, and staying conscious. Most people think better prompts equal better design. That's just 20% of the skill.
2. Use Google AI Studio for quick design variations - Upload 2-3 visual references. Describe what you want. Generate three different design directions in 5 minutes. What used to take 3-4 hours now takes 15 minutes.
3. Lovable builds functional prototypes in seconds - Describe the experience you want to build. Lovable generates a working prototype in 60 seconds. Not mockups: actual clickable experiences you can test with users.
4. Match tools to specific use cases - Custom GPT for effective prompts. Lovable for high-quality prototypes. Magic Patterns for design variations. Google AI Studio for free exploration. Cursor for full-stack experiences. Claude Code as the all-purpose best.
5. Good design passes four layers, not just visual - Visual representation, problem-solving, design principles, and implementation feasibility. Most people stop at layer one. Great design works at all four layers.
6. Context matters more than prompt length - Don't say "design a button." Say "design a primary CTA button for B2B SaaS onboarding where users connect their calendar. Professional brand." Specificity drives quality.
7. Visual references anchor AI output - Upload 2-4 screenshots showing the aesthetic you want. These show AI what "modern and minimal" means to you. The quality difference is massive versus text-only prompts.
8. Iteration speed determines final quality - The magic isn't in the first output. It's in the 10th iteration after you've refined and tweaked. Review, identify issues, tell AI how to fix it, repeat.
9. Always validate with real users - AI tools make generating designs easy. Only users tell you if those designs actually help. Show prototypes to 3-5 users. Watch them try to use it.
10. Workflows changed from linear to parallel - Before AI: sequential steps taking weeks. After AI: describe, generate, iterate freely. This is how top 1% designers work now.

---

👨‍💻 Where to find Xinran Ma:
LinkedIn: https://www.linkedin.com/in/davidmaxinran/
Newsletter: https://www.designwithai.co/

👨‍💻 Where to find Aakash:
Twitter: https://www.x.com/aakashg0
LinkedIn: https://www.linkedin.com/in/aagupta/
Newsletter: https://www.news.aakashg.com

#aidesign #productmanagement

---

🧠 About Product Growth: The world's largest podcast focused solely on product + growth, with over 200K listeners.
🔔 Subscribe and turn on notifications to get more videos like this.

Aakash Gupta (host) · Xinran Ma (guest)
Feb 20, 2026 · 1h 1m · Watch on YouTube ↗

EVERY SPOKEN WORD

  1. 0:00–3:54

    Intro

    1. AG

      PMs are designing AI products completely wrong

    2. XM

      Designing with AI isn't about prompting. It's about understanding the entire workflow, the system, the constraints, and the behaviors

    3. AG

      Xinran is using Google AI Studio and Stitch to create full-fledged prototypes that are actually designed well

    4. XM

      I can make it very refined or YOLO, which is going crazy. You know, everyone can rely on AI to generate something, but again, AI is not human, right? It does not really have enough empathy, and that's why this GPT is meant for the very first prompt to AI prototyping

    5. AG

      And you didn't use Claude for the prompt. You preferred ChatGPT for the prompt

    6. XM

      I like to shift it to ChatGPT in order to save tokens for Claude

    7. AG

      [laughs]

    8. XM

      There are also other tools like, uh, Subframe, Magic Patterns, but they are more specific. I know many people are always facing the blank canvas question. I wanted to design something out of AI, but I'm not really clear about what I wanted to design for

    9. AG

      Wait, show me what that really means. Before we go any further, do me a favor and check that you are subscribed on YouTube and following on Apple and Spotify podcasts. And if you wanna get access to amazing AI tools, check out my bundle, where if you become an annual subscriber to my newsletter, you get a full year free of the paid plans of Maven, Arise, Relay app, Dovetail, Linear, Magic Patterns, DeepSky, Reforge Build, Descript, and Speechify. So be sure to check that out at bundle.aakashg.com, and now into today's episode. [fire crackling] Xinran, welcome to the podcast

    10. XM

      Thank you so much for having me

    11. AG

      My pleasure. I've been following you online, and I think you are one of the world's leading experts on Design with AI, the title of your newsletter, and I'm really excited for you to bring this knowledge to everybody. And I think one of the first things we need to do is we need to build the neural circuitry to understand what is Design with AI. We need to build the neuronal connections. We need to build a mind map. So can you explain to us what is Design with AI? What is the universe of Design with AI that people should know?

    12. XM

      Design with AI covers a lot of things. You know, design and AI are very broad topics. So to make things easier to understand, I actually did a mind map. Let me show you. First of all, one part of Design with AI is about prompting: how can we prompt better in order to get better results? There are also technical things, like how can we ideate better with AI, using it as an assistant to generate ideas that are hard for humans to think of. Not to mention there are other design areas, such as design and prototyping with the help of AI, and there are some workflows that I can explain more. Last but not least is a part of the landscape that fewer people talk about. That is, aside from those tactics about prompting, designing, ideating, and prototyping, there's also a part where we need to stay conscious while designing with AI. And staying conscious means: how can we bring more intention and thoughtfulness into the things we're designing? That also involves awareness of different kinds of risks and how to mitigate those risks. So that's just a broad overview of the items, and I can walk through each section really quickly, and if you have any questions, Aakash, feel free to ask me

    13. AG

      Yeah, let's start at the top, prompting with AI. I guess it's kind of a universal skill. A lot of people might kind of roll their eyes, like I've seen so many influencers post prompting frameworks. What do they need to know about prompting with AI to use these AI design tools well?

    14. XM

      Cool. Let me zoom in a little bit, because it's easy to get distracted when you see a lot of things. So let's start with prompting with AI. I've gotten to see many prompting frameworks, and I believe, Aakash, that you've seen them over the years as well. There are so many frameworks out there. But in my opinion, at its core, prompting with AI is just how humans interact with AI in order to get better results. And I try to make things as

  2. 3:54–8:02

    What Is Designing with AI?

    1. XM

      simple as possible from a design perspective, as opposed to, you know, introducing so many frameworks for people to choose from. From that regard, I jotted down some of the areas that I think are essential to prompting with AI for better design results. The very first one is clarifying the ask. Before we even engage with AI, it's important to know what we want to get out of it. So you can see clarifying the ask as the request. What do you want to get out of AI? What do you want to include? What do you want to avoid? And if you don't know those things, which is okay, because oftentimes, depending on the scenario, we just don't know what we want, and that is okay. We can leverage people for help. We can even ask AI for help to gain some clarity. So those are the small things around clarifying the ask. Context is also very important, and I believe, Aakash, you've seen this trend of context engineering becoming a trendier term than prompt engineering. It's because people have realized how important context can be, and it can amplify the effectiveness of a prompt. And regarding context, first of all, I wanted to call out that more context does not always mean better; it's the necessary context related to the goal that really matters. That's why, as you can see, in the very first part I mentioned about providing context, it's really about clarifying the outcome, so that clarity can help inform what to provide for context. Only provide the necessary context: what role are you in, what are the unmet user needs, what timeline are you facing, basically constraints, right? Time constraints, technical constraints. Are there any existing ideas that AI needs to know about? Because if you have many ideas already, you might want AI to know them, right? It really depends on your perspective.
Some people want to hide some ideas from AI so it can come up with fresh ideas, but some people want AI to have something to base off of, as opposed to, you know, drifting too far. So it's really about that. What are your prioritization criteria? AI struggles with priority because it does not really know what you want. So if you have something to prioritize and focus on, that information can be really helpful. Who is the audience? Audience matters when you're trying to frame an experience for a particular set of users, and it really helps if you can provide the audience information. What are the design principles and what are the brand guidelines? Those are all important constraints that you can provide to AI, depending on the scenario. There are some miscellaneous points, like being aware of existing memory and refreshing it by opening up a new chat, but those are just technical things. Providing references also helps, because we have learned that when you provide references, whether text references, output formats, visual references, or code references, AI tends to give you better design outcomes. There are some extra tips around prompting, such as structuring the output, simplicity, it's okay to be imperfect, reverse prompting, and so on and so forth
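The context checklist Xinran walks through here (request, role, unmet needs, constraints, audience, priorities, brand, references) could be sketched as a small helper that assembles only the fields you actually fill in. This is an illustrative sketch; the field names are assumptions, not from any specific tool.

```python
# Sketch of the prompting-context checklist from the conversation,
# assembled into a single design prompt. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class DesignPromptContext:
    ask: str                      # the clarified request: what you want out of AI
    role: str = ""                # what role you are in
    user_needs: str = ""          # unmet user needs
    constraints: str = ""         # time / technical constraints
    audience: str = ""            # who the experience is for
    priorities: str = ""          # prioritization criteria
    brand_guidelines: str = ""    # design principles / brand guidelines
    references: list[str] = field(default_factory=list)  # visual/text/code refs

    def to_prompt(self) -> str:
        lines = [f"Request: {self.ask}"]
        for label, value in [
            ("Role", self.role),
            ("Unmet user needs", self.user_needs),
            ("Constraints", self.constraints),
            ("Audience", self.audience),
            ("Prioritization criteria", self.priorities),
            ("Design principles / brand", self.brand_guidelines),
        ]:
            if value:  # only the necessary context, not everything
                lines.append(f"{label}: {value}")
        if self.references:
            lines.append("References: " + ", ".join(self.references))
        return "\n".join(lines)
```

Empty fields are simply omitted, which mirrors the point that necessary context beats maximal context.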

    2. AG

      How much should you define the exact design you want? Because of course, AI is gonna do a much better job at giving you what you want if you're really specific. But at the same time, then you kind of lose that exploratory lens and magic. And part of the magic is you can get three to four divergent designs. So I think we'll have to explore that as part of one of the workflows. But when you think about prompting, do you feel like you need to prompt these design tools with a very structured mega prompt, or can you just do shorter, to-the-point prompts?

    3. XM

      I think it's an art in some ways. By saying art, meaning sometimes we want more control of the outcome, and sometimes we want to give AI more space to brainstorm, and it's really a fine balance. It's as if you are hiring someone, right? Sometimes you want to be very specific, and sometimes you want to give her or him a vast amount of creativity or space to brainstorm. So

  3. 8:02–12:29

    The AI Design Mind Map

    1. XM

      in my... based on my experience, it really depends. Regardless, I think some structure helps. If you can introduce it relatively early, that often helps, as opposed to, you know, starting everything from scratch. It is okay to do things that way, but it also requires a lot of time and tokens for sure.

    2. AG

      Awesome. So maybe we can go into the details of it as we think about ideation with AI, which I think was one of your next points.

    3. XM

      Sure, yeah. Ideation, ideating with AI. I actually have two parts, which are ideating with AI and design and prototyping with AI. They really go hand in hand, as opposed to, you know, this is ideation, this is design, this is prototyping, right? AI has brought some convergence to those things. But I just wanted to give some ideas around early-stage ideation, such as brainstorming ideas. Let's say, Aakash, we are thinking of a fresh product idea. We need some early-stage brainstorming, such as, you know, providing the necessary business goals and problems, user goals and problems, just to provide certain context or guardrails, and, you know, relevant user insights, relevant constraints. Those are the things we can provide for some early product idea brainstorming. And convergent thinking is also important, because AI can get really, really wide and extensive, and that's why we also need things to converge a bit. For example, ask for a ranking of ideas, or ask for examples for better evaluation. Like, you know, sometimes you ask AI and it gives you a lot of ideas, but you don't know where those ideas come from. So it's basically asking, "Can you keep providing me some examples, and where did the examples come from? What are the sources?" You can use those to do your due diligence on whether AI is just making things up, or whether the sources it provides end up being credible, right? Just for your own sake, for better evaluation. So that's that. Regarding design and prototyping with AI, it's more like an extension of ideating with AI. That's why I said they go hand in hand. There are some common best practices, right? Specifying instructions, specifying the context, brainstorming design variations, keeping track of design variations, navigating between options.
So Aakash, as you can see, those areas, like the second one and third one, can be reflected in prompting, right? The very first one can be indirectly reflected in prompting. But the fourth, fifth, and sixth ones become more ambiguous. It's more about how you operate within the design landscape versus mere prompting. And those were a challenge years ago. But as more and more AI tools realized this pain point, they started to ship new features that addressed those pain points, making it easier for designers or non-designers to design, I would say. So those are some principles and common best practices, I would say, around design with AI. And of course, there are workflows. Workflows I put into two big buckets. One is more like from an idea to design, and oftentimes it leans more heavily on text or ideas versus actual existing designs. As I included here, you can clarify the ideas through different AI tools such as ChatGPT, Claude, and Gemini, or, as I will mention in a bit, a custom GPT that I've built. It's essentially a type of ChatGPT; it's just that I provided some extra thoughts to it to save people's time and to save my time. Regarding AI tools, I can explain further about the nuances between different AI tools; they are not treated equally, and it really depends on what you want to get out of them. Regarding from an existing experience to design, well, that actually requires more guardrails around what you want to do, right? Maybe you have an existing website that you can take snapshots from, or you have an existing Figma file, because Figma is the go-to design tool for many designers. Right, so there are different workflows I can talk about in this area as well.

  4. 12:29–14:00

    Ads

    1. AG

      Today's episode is brought to you by NayaOne. In tech buying, speed is survival. How fast you can get a product in front of customers decides if you will win. If it takes you nine months to buy one piece of tech, you're dead in the water. Right now, financial services are under pressure to get AI live. But in a regulated industry, the roadblocks are real. NayaOne changes that. Their air-gapped, cloud-agnostic sandbox lets you find, test, and validate new AI tools much faster, from months to weeks, from stuck to shift. If you're ready to accelerate AI adoption, check out NayaOne at nayaone.com/aakash. That's N-A-Y-A-O-N-E.com/A-A-K-A-S-H.

      Today's podcast is brought to you by Pendo, the leading software experience management platform. McKinsey found that 78% of companies are using gen AI, but just as many have reported no bottom line improvements. So how do you know if your AI agents are actually working? Are they giving users the wrong answers, creating more work instead of less, improving retention or hurting it? When your software data and AI data are disconnected, you can't answer these questions. But when you bring all your usage data together in one place, you can see what users do before, during, and after they use AI, showing you when agents work, how they help you grow, and what to prioritize on your roadmap. Pendo Agent Analytics is the only solution built to do this for product teams. Start measuring your AI's performance with Agent Analytics at pendo.io/aakash. That's P-E-N-D-O.I-O/A-A-K-A-S-H.

  5. 14:00–26:10

    Custom GPT to PRD Workflow

    1. AG

      How many designers are using the Figma Make to Figma workflow?

    2. XM

      Yeah, great question. So I've talked with many designers. The product designers in the States are using Figma Make quite a bit, just because of the fact that they have Figma paid plans, so they have access to Figma Make already. But when I talk with people outside of the US, for example, I talked with another designer who's based in Asia, and he told me that not a lot of people there are using Figma Make. And I was surprised, because they do have access to Figma. I don't know why. And in Europe, some of them, and I know some of them, just don't have access to Figma Make, for whatever reasons.

    3. AG

      So it seems like if designers get access, they are using it quite a bit. So that's really interesting, the pickup on these AI prototyping tools, not just for PMs, but clearly for designers as well. What's that fourth area you had about being conscious with AI? Can you give us the next layer deeper on it?

    4. XM

      Sure. Being conscious with AI is about something aside from all the tactics of how we can use AI to design better, smarter, and faster. I do want to bring in another layer of things. You know, I may not have a lot of time to cover this today, but I believe that during some of the workflows, this is already indirectly interjected in some of my demos. It's basically how to bring more intention and thought around what you're generating, as opposed to being fully 100% driven by AI. I still think some control as a human, some intentionality, can go a long way. So for example, some awareness about risks. You know, like I just mentioned about early brainstorming of AI product ideas, right? We need to be aware that there can be hallucinations, you know, biased insights, outdated insights, irrelevant insights, or low-quality generic insights that we subconsciously bring in. For example, let's say I'm designing for some experience, like, you know, a restaurant booking experience, right? If I were just to ask ChatGPT, "What's the main pain point of the users?", it can give me very general insights, or even biased insights tailored around a specific type of user, right? So that could be dangerous, or it just makes things up, or it provides generic insights that are not really helpful. So that's that. The recommendations are just ways to mitigate those risks. Like, for example, you know, keep a human in the loop during the research and ideation process. Double-check the sources. Empathy with the people you're designing for becomes even more important, because, you know, everyone can rely on AI to generate something. But again, AI is not human, right? It does not really have enough empathy with the people; they don't really care, to a certain extent. Including diverse perspectives in the process becomes very important.
Again, this is also about design intentionality, and this is a huge deal for product designers. Giving users control over automated decisions and important data, I think, is pretty self-evident. Audit AI outputs; nuanced information matters more than mere transcripts. This is another interesting point I want to call out, just because of the fact that I've noticed a lot of PMs, including designers, would feed information into different AI tools, get certain insights, and then just build and design on top of those insights. But again, there is so much more to it. For example, the behavioral insights, or sarcasm, or hesitation, right? Those kinds of behavioral cues are really hard for AI to understand from mere transcripts. So this is just a way to call out that, you know, before we actually generate something out of AI, we should make sure the insights are high quality.

    5. AG

      Makes sense. So if you had to pick two workflows to demo for folks, what are the things that people need to walk away and learn from this video?

    6. XM

      There are two things. I can demo one of the workflows, which is around the area that goes from a blue-sky idea, more or less, to design, because some people are working in this area. I can also give another workflow, which is about going from an existing experience to design. I can give a demo of something. Both of the workflows that I will show today are not very commonly used by people, and I thought maybe that can also bring some fresh perspectives.

    7. AG

      All right, let's do it. And promise me you're gonna show me all of the details, right? Not hide anything from us.

    8. XM

      Of course. Of course. Uh, actually, as a matter of fact, I would be surprised if no bugs occurred during a live AI demo. That's just the nature, or reality, of things, and I wanted to show it in case that happens to you all as well.

    9. AG

      All right, let's do workflow one.

    10. XM

      So the workflow I'm going to explain is more about custom GPTs to a PRD for prototyping across a lot of AI tools. I know this slide can be confusing about what it really means. Let me explain a little bit. So first of all, I have built a custom GPT. A custom GPT, uh, you can consider as ChatGPT but with my personal instructions, and I will show you how I follow the steps within that custom GPT to generate an effective, quote-unquote, "PRD for prototyping." So it's not a broad PRD for everything; it's only for AI prototyping. Then I can copy-paste that PRD for prototyping into any of the AI tools that you want. Although, as I mentioned earlier, there are some differences between different AI tools, for now you don't have to worry too much about that area. We can just focus on this area first, and during the demo I'll also, you know, paste it into a couple of tools to give you a sense of how it works. Does that sound good to you, Aakash?

    11. AG

      Yeah. Maybe you can show us, like, if somebody was building this custom GPT from scratch, how they would build it.

    12. XM

      Cool. I think that will also touch upon some areas, um... Do you want me to show you how to build a custom GPT from scratch or more about the frameworks, how I can better build a custom-

    13. AG

      Well, how to build this one right, because I guess this custom GPT is about PRDs for prototyping tools. So maybe we can just kind of show people, like, these are the inputs I would put into the custom GPT generator.

    14. XM

      Sure, sure. Um, I can give you some general guidelines. As I mentioned earlier, right, an effective prompt is about the request (what you want to get out of AI), the context, and references. Those are very broad items, so I can be a little bit more specific here. So in this particular scenario, I don't know about you, Aakash, but I know many people are always facing the blank canvas question. I wanted to design something out of AI, but I'm not really clear about what I wanted to design for. That's why I have to vibe-code, quote-unquote, right? One sentence at a time. But sometimes it can easily go into loops, which is frustrating because you're not clear in the first place. That's why I started to build this custom GPT: because I wanted to bring myself some clarity around what I need to put into the prompt.

    15. AG

      Mm.

    16. XM

      And that is tied to the context. For example, who am I designing for, right? That is important, and sometimes when we're writing things in AI tools, this is something that can be neglected. What are their needs? This is also a question that I put in the custom instructions of the custom GPT, just to give me some things to think about. What are they trying to achieve? This is about user goals, right? And which experience do you want to focus on? This is an interesting one, because oftentimes we focus on too many things, and AI struggles if we just want to build a full-stack app from scratch. That's why I want to bring myself, or bring others, some clarity around which specific area, whether it's the platform or the user flow, you want to prioritize building first. Once that gets figured out, it's much easier for you to scale up. For example, if the mobile web solution is good, then it's much easier for you to scale it up or apply it to another type of experience. And let's say the core user flow is figured out; then the login and logout experience becomes so easy, because it's not addressing the core user needs. It's pretty standard elsewhere. That's why I try to be hyper-focused for the very first prompt, and the very first prompt matters. And I believe, Aakash, you have had similar experiences: the very first prompt will set the stage for a lot of things, and that's why this GPT is meant for the very first prompt to AI prototyping.

    17. AG

      Got it. So you load up a... Go to custom GPT generator, you load it up with this information. This is what an effective versus a bad prompt is, and this is the context questions I want you to answer. Makes sense.

    18. XM

      Yes. And again, this is just one type of custom GPT. You can build your own custom GPTs based on your needs. So this is more of a simplified version for quickly generating an effective prototype that matches your idea, and you can build more, you know, more complex things on top of it.
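For readers who want to try this themselves, the kind of system instruction Xinran describes, asking one small question at a time and explicitly excluding login/logout from the recommended flows, might look something like the sketch below. The wording is a hypothetical reconstruction, not the actual instructions from his custom GPT.

```python
# Hypothetical system instruction for a "PRD for prototyping" custom GPT,
# reconstructed from the behavior described in the conversation.
SYSTEM_INSTRUCTION = """\
You help users write the very first prompt for an AI prototyping tool.
Ask these questions ONE AT A TIME, each with 2-3 suggested answers:
1. What is the main goal of your product?
2. Who are the intended users?
3. What platform is it for? (offer a numbered list the user can answer by number)
4. Which key user flows should the prototype focus on?

Rules:
- Never recommend sign-up, sign-in, or logout as a key user flow;
  they do not address the core user problem.
- After the questions, show a preview of the spec and ask the user
  to confirm or revise before producing the final output.
- Output the final spec as markdown inside a code block, covering
  front-end components and interactions only (no back-end logic).
"""
```

In ChatGPT's GPT builder, this text would go in the Instructions field; the same idea works as a system prompt in any chat API.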

    19. AG

      Yep.

    20. XM

      Cool. All right, so I can give a demo. Let me drag this over here, and I can click here to start. Again, this is what I built into the custom GPT as well, just to make it easier for people to onboard. All right. Great, let's get started. As you can see, I broke the questions into smaller chunks that are easier for the user to control. So, what is the main goal of your product? For the sake of the demo, I would just go with whatever shows here by default. You know, if we have time towards the end, I'm happy to go crazy; we can come up with something from scratch. But for the sake of the demo, let's just go easy with whatever's suggested here. So let's say "help users track expenses," right? That's the project goal, or what the user wants to do. Who are the intended users of the product? So let's say, um, freelancers.

    21. AG

      Okay, so you've built this custom GPT in sort of a way where it's extracting the information from you before it gives you the prompt.

    22. XM

      Right. It's like asking me some questions in order for it to better perform the task.

    23. AG

      Yeah.

    24. XM

      Cool. Uh, third question: what platform is your product for? And this is also something special about the custom GPT: I made it really specific, as you can tell here. So in this case, I can just type a number, let's say one. All right. A responsive... Okay, so those are the key user flows that help users achieve their goal. As you can see here, Aakash, it does not recommend or provide login and logout as an option, and I did that on purpose. It is specifically called out in the system instructions that I do not want that to be part of the recommendations. The reason is that if you're just using some other custom GPT, or the general ChatGPT, chances are the first thing it will recommend is to include the sign-up, sign-in experience. But to me, that is not the key user flow we need to focus on first, right? It's not directly solving the user problem. So, see, it generated some ideas for you. I can say something like, you know, one and, um, maybe part of two. It really depends. I can say one, I can say two, I can say three, I can say all of them. Again, I try to keep things simple. All right, so here's a preview of your design spec draft. This is a preview, and I can always check, you know, if there's any issue or something like that. But

  6. 26:10–30:02

    Building Custom GPT Live Demo

    1. XM

as you can tell, this is a very lightweight PRD, and I don't really call it a PRD, because it's more like a spec, a spec for AI tools to generate a prototype. So the focus is really on front-end presentation, such as components and interactions, right? Let me scroll down. Do you see any kind of back-end logic or technology? You don't, just because I specifically asked it not to include those things, so that, you know, AI can be more focused on generating the front-end experience first. So that's that, and I always encourage people to make some revisions and double-check that it makes sense, as opposed to just blindly relying on what AI has generated. All right. "Does this preview look good to you?" This is also a checkpoint that I built into the system instructions, right? Check whether it looks good to you. If not, make revisions. If yes, it will generate the final markdown. So let's say yes. This is the markdown format in a code block, and I specifically asked for this in the back end because it just makes things easier for me: first, to copy-paste into any AI prototyping tool, and also because the markdown format can better retain the hierarchy of the spec, making it easier for AI to understand in order to get better results.
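The final markdown spec described here, hierarchical, front-end only, no back-end logic, could be sketched roughly as follows. The section names are assumptions based on the conversation, not the actual output of Xinran's GPT.

```python
# A minimal sketch of the markdown "PRD for prototyping" shape described
# in the conversation: hierarchy via headings, front-end scope only.
def build_prototype_spec(goal: str, users: str, platform: str,
                         key_flows: list[str], screens: list[str]) -> str:
    flows = "\n".join(f"- {f}" for f in key_flows)
    screen_list = "\n".join(f"- {s}" for s in screens)
    return f"""# Prototype Spec
## Goal
{goal}
## Target users
{users}
## Platform
{platform}
## Key user flows (no sign-up/sign-in)
{flows}
## Screens
{screen_list}
## Scope
Front-end components and interactions only; no back-end logic.
"""
```

Pasting a spec like this into a prototyping tool preserves the hierarchy the markdown headings encode, which is the point Xinran makes about why he asks for markdown in a code block.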

    2. AG

      Yeah. I see you're kind of doing two things here, right? You're not making this too long so that you experience context rot, but at the same time, you are being very, very specific about these are the four screens I wanna see, this is the key user flow. So you're defining the front end quite specifically, uh, but nothing else.

    3. XM

      Right. I try to find a balance. I know some people prefer a bit more detail and some prefer less, but this is the sweet spot that I found for a lot of my use cases. Again, you know, other people can customize from there.

    4. AG

      Cool.

    5. XM

      All right. So let's say we're happy with this, right? I'm going to paste it into any of the popular AI prototyping tools. Let me start with Claude, or we can call it Claude.ai, or Claude Artifacts, right? What I like about Claude is not that it generates the best-quality design. It's more that it's a well-rounded tool that can give me a quick mock run of what I can expect from this prompt. If I were to ask you, Aakash, whether this makes sense, it's hard for you to visualize, right? It's hard to tell, just because there is so much information. So I like to use Claude.ai as a lightweight mock-run tool to double-check that the prompt makes sense-

    6. AG

      Mm

    7. XM

      ... or if it's running into any errors or something I'm not aware of. If so, I can always go back to the drawing board to revise.

    8. AG

      Got it. And you didn't use Claude for the prompt. You preferred ChatGPT for the prompt?

    9. XM

      Yes. Uh, great question. The reason I do this is that I know Claude can be very powerful for code-related tasks. That's why I like to offload things that are not code-related to ChatGPT or Gemini. For example-

    10. AG

      Yeah

    11. XM

      ... clarity-related stuff, I like to shift it to ChatGPT in order to save tokens for Claude.

    12. AG

      [laughs] Yes, the Claude tokens. I'm on the, uh, 20X Max plan right now.

    13. XM

      Oh, it's amazing.

    14. AG

      I feel your pain.

    15. XM

      That's incredible. So that's not a problem for you then.

    16. AG

      Well, I still hit the limits. I push Claude Code to the limit every day. [chuckles]

    17. XM

      Uh, it's funny because, yes, I hit the limit as well a couple of times, but I'm not on the Max plan though.

    18. AG

      If you've

  7. 30:02 – 32:25

    Ads

    1. AG

      been enjoying today's episode, you are going to love Xinran's course, AI for Product Designers. It is on Maven. It has 4.6 stars with 116 reviews. It is a two-week intensive live boot camp where you're not just gonna learn what we showed in this episode, but the next layer deeper. You're gonna learn from all of his expertise in trying out all these different tools. Hundreds of other students have taken his course so that they can get better at product design. We're all feeling that anxiety about AI. If you want to become better at product design, whether you're a PM, product designer, or engineer, check out his course, AI for Product Designers, on Maven. You get a discount by using my code AAKASHXMAVEN. It's $100. We might even update it to more, so go get that code, use my link, check out his course, and now back to today's episode. Here's the dirty secret about prototyping. You spend two weeks building a prototype. You validate your assumptions. Engineering loves the direction. Then what happens? You throw the whole thing away. Bolt changes this completely. When you prototype in Bolt, you're not building a throwaway mockup. You're building real front-end code that integrates with your existing design system. So when you hand it to engineering, they don't throw it away, they ship on top of what you've built. I use Bolt every single day. I host my LAN PM job cohort on it, and honestly, I'm up till 2:00 AM some days just vibing in the tool, having fun, and building. That's when you know a product is good, when you're using it past midnight, not because you need to, but because you want to. Check out Bolt at bolt.new/aakash. That's B-O-L-T.N-E-W/A-A-K-A-S-H. Link in the show notes. You know what kills momentum? Spending 30 minutes on a brainstorm with Claude and then copy-pasting it into a slide tool to create the slides. That loop is dead.
Gamma now works inside Claude through Claude Connectors, so you can go from thinking to a finished presentation without ever leaving the conversation. Here's what that actually looks like. You're brainstorming a product strategy in Claude. Mid-conversation, you say, "Turn this into a deck." Gamma generates it right there. You keep refining your thinking, and the presentation updates with you. No copying, no pasting, no starting over in another tool. And it gets better. Connect Gmail, Notion, or GitHub alongside Gamma, and Claude will use those connections to pull live data straight into your slides. Weekly business reviews, feature PRDs, all generated from the data you're already using. Your deck finally keeps up with your ideas. Try it now. Connect Gamma to Claude at gamma.app. Go from thinking to presenting in one flow. Claude helps you think, Gamma helps you show it.

  8. 32:25 – 36:07

    Generating PRD for Prototyping

    1. AG

      Okay. [laughs] Yeah, it's, uh, it's an addiction at this point, Claude tokens.

    2. XM

      Yeah, indeed.

    3. AG

      So I guess theoretically you could, just like you architected the custom GPT, build a Claude skill or a Claude project to generate the prompt if somebody wanted to. The keys when building that prompt generator, I think we saw, were: make sure you get absolute clarity on the front end, make sure the user agrees with it, and make sure it generates a very simple prompt that's not overstuffed with random details like success metrics or user groups. It's really focused on defining the front end.

    4. XM

      100%, 'cause it's very easy to put a lot of irrelevant details into the prompt. And as I mentioned earlier, it's very important to only provide the necessary context that is relevant to your goal. So I try to be very careful here. And back to what you said, yes, you can build a Claude project, something like that. You can also use a Gem in Gemini. But to me, Gems in Gemini are more like a simplified version of custom GPTs, based on what you can do in the back end.
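Pulling together the behaviors described in this episode, the system instructions for such a prompt generator (whether a custom GPT, a Claude project, or a Gem) might be sketched like this. This is an illustrative reconstruction from the conversation, not Xinran's actual instructions:

```markdown
You help users produce a front-end-only prototype spec.

1. Ask short questions one at a time (product idea, target platform,
   key user flows). Offer numbered options so the user can reply
   with a single number.
2. Never recommend login/signup/logout flows; they do not directly
   solve the user problem.
3. Exclude back-end logic, technology choices, success metrics,
   and other details irrelevant to the front end.
4. Show a preview of the spec and ask whether it looks good.
   Revise until the user confirms.
5. On confirmation, output the final spec as markdown inside a
   code block, so the hierarchy is retained when pasted into an
   AI prototyping tool.
```

The checkpoint in step 4 is what keeps the human in the loop before anything gets pasted downstream.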

    5. AG

      Mm.

    6. XM

      But it's also free, which is good for Gem.

    7. AG

      Oh, [laughs] there you go.

    8. XM

      All right, so let's double-check what it can do, you know. Because as I mentioned earlier, with Claude I don't expect something really great. You know, it's really, really simple.

    9. AG

      Yeah, one thing that immediately sticks out to me is the, the purple, [laughs] which is kind of the infamous mark of designing with AI.

    10. XM

      Oh, that's funny. Yeah. Like, you know, this is off the screen. It has some... Let's play with it. Date, let's say December 19th with, uh, whatever, you know, Thai food, which I had this noon.

    11. AG

      Mm, jealous. [laughs]

    12. XM

      [laughs] Well, it's very heavy, so I'm still digesting. [laughs]

    13. AG

      [laughs] You got a curry.

    14. XM

      I, yeah, I got a tofu pad Thai.

    15. AG

      Ah, nice.

    16. XM

      But it's al- also a little bit overpriced.

    17. AG

      [laughs]

    18. XM

      Save expense. View summary. Yeah.

    19. AG

      Wow.

    20. XM

      Thai.

    21. AG

      I mean, that's the coolest thing about these tools is you got three or four screens in a product in minutes.

    22. XM

      Yeah. So excuse me, let me change the screen size a little bit so you get a better preview of what it looks like. I have not checked the spec, ironically, 'cause usually I check it. But again, this is a visual representation of what I provided to it, you know, the key flows: adding a new expense, seeing the updated summary, something like that. So at least it gives me a visual representation of what this prompt will entail. In other words, if I were to feed this into other AI prototyping tools that are more robust than Claude.ai, chances are I might get a similar experience. And if I see something dramatically off here, maybe it's because of the prompt, so I can go back to the drawing board to make revisions.

    23. AG

      Okay, so assume the prompt is correct. What's the next step?

    24. XM

      By the way, usually this is already the final step if you are comfortable prototyping in Claude.ai, which I believe is not the case for most people. So I like to do a mock run, as I mentioned, to double-check that the prompt makes sense. And for the next step, right, I don't think most people want to end at Claude.ai. So we can go to any of the AI prototyping tools that are out there. Let me open up Lovable. I know some PMs like to use Lovable. So let me copy-paste the same prompt, and, uh, let me paste it in Lovable.

  9. 36:07 – 41:23

    Comparing Lovable vs V0 vs Bolt

    1. AG

      And across all the tools that you've tried, like, how would you describe the pros and cons in your favorite one?

    2. XM

      So, wow, there's a lot. Let's start with Lovable, V0, and Bolt, because those three were established relatively early around AI prototyping, and in my opinion, they are similar. I know Lovable has built a lot of things over the past six months. It has a particular vibe around its design styles; some people like it, some people don't. But if you're looking for a well-rounded tool that can deliver good design results, and you don't mind paying extra for the well-rounded experience and the extra features, then Lovable is the one. For V0, I think quality-wise it's similar to Lovable. I also like the fact that you can edit the code without upgrading to the paid plan, which makes V0 more accessible in a lot of ways. But again, I'm under the impression that Lovable has been updating a lot of things, and some people prefer V0's design aesthetics over Lovable's. It's really a personal choice. Lovable's design vibe is more glamorous, very vibrant. So that's the difference. But overall, they're quite similar; Lovable may have some other bells and whistles. Regarding Bolt, I no longer use it as much, just because V0 and Lovable are doing quite well. But if you're interested in a full-stack prototype, then I do find Bolt has better integrations because of the team behind Bolt. So that's Lovable, V0, and Bolt. There are some other tools as well, like Google AI Studio. It's relatively new in terms of its prototyping capabilities. Google AI Studio has been around for a long time, but it recently shifted its focus from a developer-based tool to a vibe coding tool that everyone can use. And I found it interesting; it's shipping some new features.
      It's free, which makes it more accessible for many people. But regarding design quality, I feel like it's still catching up with tools like Lovable and V0. There are also other tools like Subframe and Magic Patterns, but they are a little more specific. Magic Patterns is similar to Lovable and V0, but it has some additional features centered around product design. And Magic Patterns explicitly does not have a back end. As you can see, that's a difference, a deliberate business decision they've made. There are pros and cons depending on what you want to get out of it. So this is the, um-

    3. AG

      That's the Lovable one. At least we don't get the purple. Lovable has put into their system prompt, "Avoid that purple gradient," 'cause Opus 4.5 overuses that. Or maybe it's not using Opus 4.5 on the back end. We don't know. This is a simple enough app.

    4. XM

      Yeah. This is also a new trend I've seen in AI tools such as Lovable or Cursor: they're not dependent on one specific AI model. They leverage more than one model from more than one company, so that they can get a little bit smarter and less dependent on those AI model companies.

    5. AG

      Yeah. You mentioned, by the way, Magic Patterns. Magic Patterns is totally free for paid subscribers of my newsletter. They get a year free of that, as well as Reforge Build, which is another one. I think both of those are really specific for, like, front end PM design prototypes. And like you said, like a Bolt or a Replit is much more powerful if you want to build a full stack.

    6. XM

      Right. Yeah, we didn't mention Replit. Exactly. Especially if you want to build a full-stack prototype, I found Replit has a lot of things that can get you hooked up to the back end. Again, that also comes at a cost: you have to pay extra for that holistic experience in Replit.

    7. AG

      Mm-hmm. So what do we take away from this Lovable prototype?

    8. XM

      The Lovable prototype, as you can see, from a detail standpoint, has small areas that are more polished compared to the Claude.ai experience, right? Um, you know, just random things: save expense. There's an error state, as you can tell. Um, the dropdown, the color, it's more refined in a lot of ways. And save expense: there's, you know, a confirmation page.

    9. AG

      It's a lot better than Claude. Yeah.

    10. XM

      As you can see, yeah, they've spent a lot of work on, how to say, what I refer to as sub-AI agents between the LLM model layer and the actual presentation layer. So there's something unique about different AI tools' system prompts, in other words.

    11. AG

      Yeah. It's not just a Claude wrapper anymore.

    12. XM

      Exactly. Those extra layers actually help.

    13. AG

      So I think another thing that designers probably care a lot about is Lovable just shipped visual editing. Do you use that a lot?

    14. XM

      Not that much, in the sense that it is helpful to a certain degree, but in my opinion it's still catching up with the similar feature in V0 and Figma Make. So it's not that robust, in my opinion, but it's getting better and better.

  10. 41:23 – 43:31

    Stitch to AI Studio Workflow

    1. AG

      All right. So shall we move into workflow two? What's the second workflow people need to understand to really understand design with AI?

    2. XM

      Sure. Um, so the second workflow is more about a new combination of tools: Google Stitch and Google AI Studio. This is very new, by the way. I know some people know about Google AI Studio, but about two months ago, Google AI Studio announced its new vibe coding engine, where you get to build prototypes within Google AI Studio, which is exciting news. However, at this point, Google AI Studio still feels like Lovable: you get to see the code, you get to use a prompt to generate a prototype, right? But it's lacking something, which is early-stage design exploration, and not a lot of AI tools out there can empower that. Stitch is a good example. Stitch is a Google product; around July, Google acquired Galileo AI and rebranded it as Stitch. And over the past two months, or one month specifically, Google Stitch has been actively updating its features at a very fast pace. Like, every week there are new updates. Stitch is a tool that can help everyone easily generate design ideas. So you can see the Stitch plus Google AI Studio combo as getting the best of both worlds: Google AI Studio is more for prototyping interactions, whereas Stitch is more for early-stage ideation. And I found there's some interesting synergy between the two, and that's why I wanted to showcase it to you all. By the way, you can do similar things in Magic Patterns, which I'm more familiar with than Stitch. But just given how new things are, I wanted to give you all an overview and hope that can be helpful.

    3. AG

      Cool. Yeah, I haven't seen much about Google Stitch yet. Excited.

    4. XM

      It's very

  11. 43:31 – 49:53

    Google Stitch Live Demo

    1. XM

      new. All right. So let me pull up Google Stitch. First of all, as you can see right here, we can directly copy-paste the prompt from my ChatGPT to here. You know, I can copy-paste it here. This is one use case of Stitch: you generate a set of designs from an idea, similar to what we did for Lovable, right? But for this case, I wanted to try something special. So let me show you something. If I go to redfin.com, if I go to any of the sections... I'm trying to find an example that is easier for people to get. Trying to see if there is an Ask Redfin. So let's say we are finding a house in Seattle. I wanted to show you a particular example that we can leverage Stitch to play with. Oh, we're actually in an A/B test right here. Never mind. Um, I can grab another snapshot; I think it's easier for people to understand. So right here, let me grab something. Do you see this? This is a snapshot of a home detail page for a real estate platform, and there's one section called Ask Redfin, and this is the experience of a so-called AI chat for people to understand more about the home. Let's say we're designing the experience for this specific area: what could be the other design variants that we can get out of this existing experience? I wanted to bring this up because it sounds like a totally different design challenge than the one we talked about earlier, right? So that brings some, uh, diversity to our workflows. So let me go back to Stitch. Let's say I copy-paste the existing scenario here, and I can describe the design. For the sake of time, right before this call I actually wrote down some things as an initial prompt so I can save time typing here. As you can see, I put in some guardrails around what I want it to generate. Number one is context.
      You are reviewing the AI chat section on a home detail page of a real estate website. When potential home buyers browse a listing and interact with the section, an AI chat window will pop up. Those are just context. The business goal is this. The user goal is that. And this is my ask: evaluate the existing experience, identify actionable improvements, and generate better design ideas accordingly for the Ask Redfin section. I don't usually do things like that; here I'm combining several asks into one. I could have broken it down into steps, but for the sake of the demo, let me do this for simplicity. Something interesting about Stitch is that you can toggle between app and web. In this case, let's start with web. And under here are different models. It's funny, because a week ago there were only three models here, and now there's a Thinking option with 3 Pro. How about we try this out? This is very new, by the way. And what I'm doing now, as a recap, is generating design ideas based off a snapshot of an existing design, along with some of my context and requests.
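The prompt structure described here — context, business goal, user goal, then the ask — can be sketched as a reusable template. The wording below is illustrative, not the exact prompt from the demo; the goal lines in angle brackets are placeholders you would fill in:

```markdown
## Context
You are reviewing the AI chat section ("Ask Redfin") on the home
detail page of a real estate website. When potential home buyers
browse a listing and interact with this section, an AI chat window
pops up.

## Business goal
<e.g., keep buyers engaged on the listing page>

## User goal
<e.g., quickly get answers about the home without leaving the page>

## Ask
Evaluate the existing experience, identify actionable improvements,
and generate better design ideas for the Ask Redfin section only.
Ignore the rest of the page.
```

Scoping the ask to one section is what keeps the generated variants focused, since the tool otherwise redraws the whole page with some randomness.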

    2. AG

      So this can be your... On the divergent path when you're exploring divergent solutions, this can be your thought partner.

    3. XM

      Correct. If I were to go to, um, the diagram, again, you can use prompts one idea at a time, or you can generate multiple ideas. And I can explain this further, Aakash. There is a feature in Stitch where you can generate multiple options. So you can stay with this mode, or you can switch to a more divergent mode for design brainstorming.

    4. AG

      Mm.

    5. XM

      Cool. Let me close out Lovable, and probably I can close out those two as well and, uh, just wait a bit for things to finish. All right. So as you can see, by default it generates two options. I can actually specify more; I can generate three options here. So let's take a look at what it generated based on what I asked. There are a lot of things. As you can see, there is still some randomness in the outcome in the area outside of the Ask Redfin section, but I don't care about those sections as much, so we can ignore them. So let's look at this section, Ask Redfin. Compared to the existing design, it actually saved us some space for the prompt box, which is arguably clearer this way, and there's also more context around this area. So that's just something, you know, like an idea to think about. What about the other idea? This one, if we were to compare with the original side by side... let me make that larger if that's easier for you to see. I think I'm gonna do this big. All right. So if you were to compare with that one, it's closer, right? It's similar; it's just a matter of spacing and layout. All right, so we have something very preliminary, and those two options help. But let's say we are... Oh, before I go further, I'm gonna take a pause. Any questions you have here?

    6. AG

      I guess they're somewhat divergent. Should we try to ask it to give even more?

    7. XM

      Yeah, sure. So let's talk about this option. Let's say we are interested in exploring this area more. You know, I could have taken another snapshot, but let's just start from there. I can click here, drop down, and there is a Regenerate feature where I can rerun it again if I'm not happy with what I see, which is always a little trick for generating more ideas. But I can also do variations, which I use more often. So first of all, I can define how many options to generate. I can make it four, I can make it two, but let's just keep three as the default for now. Creative range: I can make it very refined, or YOLO, which is going crazy. How about YOLO? Let's be more divergent at this point.

  12. 49:53 – 56:06

    YOLO Mode for Divergent Solutions

    1. AG

      Yeah.

    2. XM

      So for the Ask Redfin section, based on the business and user goals that I provided earlier, generate more design options. Aspects to vary: you know, let's say layout, color schemes. Yeah, why not? Images, I don't need. Text font, I don't need. Text content, yes. So generate variations. It can probably go pretty crazy because we are in this YOLO mode right now, but we'll see.

    3. AG

      YOLO means you only live once, if somebody's wondering. [chuckles] It's funny that they've put that directly into their product.

    4. XM

      I know. It's funny that you can... I have to Google it to look it up every time.

    5. AG

      Yeah, it's from a Drake song in like 2014, I wanna say. So shout out to the rap, Canadian rapper Drake making it into a Google product.

    6. XM

      Great. They should have, you know, a GIF or image in some sort of success message.

    7. AG

      Oh, man. I think like the song he personally published on this was pretty explicit, and so Google probably wouldn't allow it. [chuckles]

    8. XM

      Oh. Cool, cool, cool. That, that makes sense. Otherwise, I was thinking introducing the MV here-

    9. AG

      [chuckles]

    10. XM

      ... while we're waiting. All right. Wow.

    11. AG

      Way more divergent. I like this. So u- so the hack here is use YOLO mode in Stitch to get divergent solutions, and then we're gonna take these from Stitch to Studio?

    12. XM

      Yes. Um, as I mentioned, you can keep ideating from here, right? If I were to go back to my mind map, if you remember, we can just keep diverging from here. And you can ask a prompt here as well, right? And there are different models that you can play with. By the way, only the fast model lets you export to Figma, which is odd; it should allow you to export to Figma and Google AI Studio for any of the models here. However, for some reason, it only allows you to export to Figma if you're in fast mode.

    13. AG

      Mm.

    14. XM

      Just a heads-up. Another thing, Aakash: its redesign mode is very interesting. It also gives you a lot of creative ideas by doing a redesign. So-

    15. AG

      Mm-hmm

    16. XM

      ... you wanted to check that out?

    17. AG

      Let's see it.

    18. XM

      More ideas. For the sake of time, let me just give you a... Oh, I think I... Yeah, I think I selected the right one.

    19. AG

      You need to make sure to select the right frame so it knows what to diverge off of.

    20. XM

      Exactly, yeah. And this is a perfect time to introduce any kind of ads or MVs.

    21. AG

      Mm-hmm.

    22. XM

      So back to what I was saying right here: it really depends on your needs. Let's say we are happy with the idea and we think there is some potential; we can export to Google AI Studio for further... Wow. No, it's getting, it's getting-

    23. AG

      Now it's going wild on the design system, but at least it's something different.

    24. XM

      Yeah. That's true. So let's say we still like this one. I know it's getting... It's not really what we asked for, to be honest, Aakash, but let's go with this one. So let's say we want to export to Google AI Studio. By default, this is the one. And, um, I can say, "Turn this into an interactive prototype." By the way, we don't have to provide anything here; we can leave this blank, because I'll show you in a bit that it essentially copy-pastes the same thing into Google AI Studio. That's that.

    25. AG

      Mm.

    26. XM

      For some developers, I know they like these features: if you have a GitHub repo, you can download the code and copy the code as well. But those are features that I don't use much-

    27. AG

      Okay

    28. XM

      ... even if they help with it.

    29. AG

      So that's like Google's coding agent, right?

    30. XM

      Mm-hmm. Yep. That's the nature of Google products. In other words, Stitch and Google AI Studio have so many different experiments or products, and they are not very well connected, in my opinion.

  13. 56:06 – 57:32

    Advanced Google AI Studio Tips

    1. AG

      become experts at using Google AI Studio?

    2. XM

      I think... If I had to give only one tip, I would say you have the ability to type in a system instruction on the main page. So if you click here, it's sort of hidden, but you can type a system instruction here. For example, you could be very specific about the style that you want it to generate, or you can be specific about other things. This can provide some extra context. Similar to Lovable, which has the knowledge base, a similar idea, I would say. So that's Google AI Studio. Another thing I wanted to call out is that if you go to the interface, there is one thing called Annotate App. It's the counterpart of Visual Edit in Google AI Studio, which is quite interesting. So if you go to add a comment, you can drag things and make comments; it's just like a built-in tool for you to add comments. Whatever those comments are, you can add them to the chat all together to make revisions.
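A system instruction of the kind described here might look like the sketch below. This is purely illustrative — the style rules and palette line are hypothetical placeholders, not anything shown in the demo:

```markdown
You are generating front-end prototypes.

- Style: clean, minimal, light theme.
- Use this brand palette: <your colors here>.
- Layouts are responsive and mobile-first.
- Do not generate login/signup screens unless explicitly asked.
```

Because it persists across prompts, this plays the same role as a knowledge base: shared context you set once instead of repeating in every message.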

    3. AG

      Nice. So you can kind of more simulate the multiplayer way you would normally iterate on a design in Figma here with AI.

    4. XM

      Uh, right. And by the way, I have a tool which allowed me to annotate anyway before this feature was introduced. But again, it's an interesting feature in my opinion, which is different from many other

  14. 57:32 – 59:37

    How Cursor Stacks Up

    1. XM

      tools.

    2. AG

      Yeah. We're talking about these visual editors, and I think one other sort of divergent tool we need to cover for people is Cursor, 'cause they just released their visual editor. So how would you say that stacks up against these tools that we've seen today?

    3. XM

      Yes. I've seen some exciting updates about Cursor over the past two months, you know, the browser tool, which is exciting, and it's launched a new feature tailored more around visual editing, which is also exciting. But, you know, I did a quick table just to explain my point. This may change depending on how fast tools move, but this is the current state. In my opinion, I feel like Cursor still has a slow learning curve, even with all those little updates. It is still not very friendly for non-technical folks or people with minimal technical experience to work in. So that's Cursor. Regarding quality, it can really go from okay to great, depending on your skill set, or how specifically you do things with Cursor. In terms of speed, Claude Artifacts, which is, you know, the artifact in claude.ai, makes things really easy for people, and I consider it much faster than Cursor. Same with Stitch and V0: I consider them faster than Cursor, just because they work in a browser-based environment; you don't have to set up different things. In terms of use cases, if you really want to seriously build something, then Cursor is a well-rounded, flexible tool compared to those browser-based tools like Stitch, V0, and Claude Artifacts. So there are really pros and cons here, right? Some people want the freedom to do things the way they want, but a lot of people out there are still not comfortable using Cursor, and that is okay. Then those tools are, you know, better for those

  15. 59:37 – 1:01:00

    Final Takeaways

    1. XM

      needs.

    2. AG

      Okay. Amazing. Wow, Xinran. This was a masterclass. If I had to sum it up for everybody who listened, we covered the four key areas of what it means to design with AI, and we deep dove into two really important workflows. Workflow number one, build a custom GPT, a Claude project, a Claude skill, or a Gemini gem to help you prompt your prototyping tool well, because that prompt sets the context. Workflow number two, use a tool like Google Stitch to dream up divergent solutions. What we saw today was when you use the YOLO mode, the redesign mode, features that they've built in, you can generate 15, 16 different ideas, and that's how you get alpha from using AI. You don't want to use AI to just do a worse job of what you would've done before. You want it to enable you with new superpowers, because the designers and PMs who have new superpowers from AI, they will replace the designers and PMs who are just using AI generically. It's not like design and PM are suddenly gonna get replaced overnight. It's that those people using AI well, like some of the workflows we demoed for you today, will replace others. So if you wanna use Xinran's custom GPT, the link is in the description below. It's also in my newsletter issue, where you can find all of the details of the workflows and the mind map that he shared today. Xinran, thanks for this masterclass.

    3. XM

      Thank you. Thank you for having me.

  16. 1:01:00 – 1:01:45

    Outro

    1. AG

      I hope you enjoyed that episode. If you could take a moment to double-check that you have followed on Apple and Spotify Podcasts, subscribed on YouTube, left a rating or review on Apple or Spotify, and commented on YouTube, all these things will help the algorithm distribute the show to more and more people. As we distribute the show to more people, we can grow the show, improve the quality of the content and the production to get you better insights to stay ahead in your career. Finally, do check out my bundle at bundle.aakashg.com to get access to nine AI products for an entire year for free. This includes Dovetail, Maven, Linear, Reforge Build, Descript, and many other amazing tools that will help you as an AI product manager or builder succeed. I'll see you in the next episode.

Episode duration: 1:01:56


Transcript of episode IUvi2YHayS0
