How I AI: How this CEO turned 25,000 hours of sales calls into a self-learning go-to-market engine
CHAPTERS
Sales team can’t find answers—25,000 hours of calls become the source of truth
Matt Britton explains the core pain: sales and CS teams lacked easy access to customer insights and patterns. The breakthrough was realizing the company had amassed ~25,000 hours of recorded Gong calls, creating a high-signal dataset for a self-learning go-to-market engine.
Why Zapier became the automation backbone (and the AI tipping point)
Matt shares why Zapier works for him as a non-coder who likes stitching tools together. Once Zapier integrated AI/LLMs, it expanded from simple glue automation into an orchestration layer for sophisticated GTM workflows.
Start with the business problem, then choose tools and data
They emphasize avoiding “tool wandering” and instead identifying the single constraint holding growth back. Once the problem is clear, the right data sources and automations become obvious—calls were the missing link.
CEO hands-on building: the leadership skill shift in the AI era
Matt and Claire argue leaders must build and understand workflows themselves, not just delegate to engineering. Hands-on automation teaches practical AI fluency, improves judgment, and reduces dependency on “black box” estimates.
Triggering on a new Gong call: hacking the call ID feed with Browse AI
Matt walks through the hardest early step: getting a reliable trigger that captures each new call and its unique ID. Because Gong didn’t expose an easy path, he used URL patterns plus Browse.ai scraping to pull transcripts programmatically.
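The reliable part of that trigger is diffing freshly scraped call links against IDs already processed. A minimal sketch of that step, assuming Gong call URLs embed a numeric `id` query parameter (the exact URL shape is an assumption; the scraping itself is done by Browse AI, not shown):

```python
import re

# Assumed URL pattern for a Gong call link, e.g.
# https://app.gong.io/call?id=1234567890
CALL_URL = re.compile(r"gong\.io/call\?id=(\d+)")

def extract_call_ids(scraped_html: str) -> set[str]:
    """Return every unique Gong call ID found in a scraped page of links."""
    return set(CALL_URL.findall(scraped_html))

def new_calls(scraped_html: str, seen: set[str]) -> set[str]:
    """IDs present in the latest scrape but not yet processed."""
    return extract_call_ids(scraped_html) - seen
```

Each run, the automation would process only what `new_calls` returns and add those IDs to the `seen` set.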
Cleaning and enriching transcripts: delays, formatting, and lookups
After scrape completion, the automation buffers with a short delay to prevent errors, then cleans HTML and normalizes text. Matt enriches the record by pulling additional context from Google Sheets and other systems to “round out” missing data.
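The cleaning itself is ordinary text plumbing. A sketch using only the standard library (the buffering delay is just a Zapier "Delay" step, so it is not shown):

```python
import html
import re

def clean_transcript(raw: str) -> str:
    """Strip HTML tags, unescape entities, and collapse whitespace."""
    text = re.sub(r"<[^>]+>", " ", raw)   # drop tags left over from scraping
    text = html.unescape(text)            # &amp; -> &, &nbsp; -> a space char
    return re.sub(r"\s+", " ", text).strip()
```

The enrichment lookups (Google Sheets, CRM fields) would then attach account context to this cleaned text before it reaches any LLM step.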
LLM selection and operations: model choice, stability, and cost tradeoffs
Matt explains how he chooses models pragmatically: don’t change what works, but test outputs in ChatGPT across models for speed and quality. He also flags the operational challenge of maintaining many automations without perfect handoffs and documentation.
Core Summary Generator: call overview, sentiment scoring, and next steps
A central LLM prompt transforms each transcript into an actionable brief: participants, objectives, outcomes, sentiment, what went well, improvement areas, and next steps. The sentiment score becomes a measurable signal that can be benchmarked against churn and expansion outcomes.
Real-time Slack visibility: company-wide call intelligence + churn early warnings
The workflow posts summaries into Slack to create a live feed of customer reality—useful even for a 300-person company. Low sentiment scores route into a churn early-warning channel so leadership can intervene before problems become churn events.
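The routing is just a threshold check on the parsed score. A sketch, with channel names and the cutoff as assumptions (the actual Slack post would go through a webhook or Zapier's Slack action):

```python
LOW_SENTIMENT_CUTOFF = 4  # assumed threshold; tune against churn outcomes

def route_channels(sentiment: int) -> list[str]:
    """Every summary hits the live feed; low scores also alert leadership."""
    channels = ["#call-summaries"]          # hypothetical channel names
    if sentiment <= LOW_SENTIMENT_CUTOFF:
        channels.append("#churn-early-warning")
    return channels
```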
From customer language to demand gen: extracting keywords for Google Ads
Matt shows how each call also fuels marketing: an LLM extracts high-intent terms customers actually use. Those terms are automatically added to Google campaigns, tightening the loop between voice-of-customer and paid acquisition.
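Before extracted phrases land in a campaign they need normalizing and deduping against keywords already running. A sketch of that step (the Google Ads upload itself would go through the Ads API or a Zapier action, omitted here):

```python
def keywords_to_add(extracted: list[str], existing: set[str]) -> list[str]:
    """Normalize LLM-extracted phrases and keep only genuinely new ones."""
    out: list[str] = []
    for phrase in extracted:
        norm = " ".join(phrase.lower().split())  # trim + collapse spaces
        if norm and norm not in existing and norm not in out:
            out.append(norm)
    return out
```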
AI coaching for Sales/CS: individualized feedback and trend tracking
The automation generates immediate coaching notes for the rep: strengths, weaknesses, and improvement suggestions. This feedback is also stored so managers can spot behavior patterns and make performance reviews more objective and continuous.
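Per-call feedback becomes managerial signal only once it is stored and aggregated. A sketch of the trend-tracking side, with field names that are illustrative rather than from the episode:

```python
from collections import defaultdict
from statistics import mean

def coaching_trends(records: list[dict]) -> dict[str, float]:
    """Average coaching score per rep across all stored calls."""
    by_rep: dict[str, list[int]] = defaultdict(list)
    for r in records:
        by_rep[r["rep"]].append(r["score"])
    return {rep: mean(scores) for rep, scores in by_rep.items()}
```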
Human-in-the-loop follow-up email writer: faster, better post-call execution
To reduce the burden of post-call admin, the system drafts a high-quality follow-up email the rep can copy, edit, and send. Matt keeps a human approval step to prevent misfires and allow contextual judgment (timing, recipients, tone).
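The approval step can be modeled as a hard gate: the automation drafts, but only a human can flip the switch that allows sending. A minimal sketch of that gate:

```python
from dataclasses import dataclass

@dataclass
class FollowUpDraft:
    recipient: str
    body: str
    approved: bool = False  # flipped only by the rep, never by the automation

def send_followup(draft: FollowUpDraft, send_fn) -> None:
    """Hard stop: nothing goes out without human sign-off."""
    if not draft.approved:
        raise PermissionError("draft requires human approval before sending")
    send_fn(draft.recipient, draft.body)
```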
Building an aggregate customer profile database for RAG and prep
Beyond per-call actions, the workflow structures each call into a database: roles, product interests, trends, and use cases. Sales can then query aggregated patterns (e.g., what automotive brand managers care about) to prep smarter and standardize playbooks.
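Once calls are stored as structured records, a question like "what do automotive brand managers care about" becomes a filter plus a frequency count. A sketch with illustrative field names and dummy records:

```python
from collections import Counter

def top_interests(records: list[dict], n: int = 3, **filters) -> list[str]:
    """Most common product interests among calls matching the filters."""
    counts = Counter()
    for r in records:
        if all(r.get(k) == v for k, v in filters.items()):
            counts.update(r.get("interests", []))
    return [interest for interest, _ in counts.most_common(n)]
```

The same structured records can also serve as the retrieval corpus for a RAG setup during call prep.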
Redacted content engine: auto-generating SEO blog posts and ads from calls
The most ambitious workflow converts calls into anonymized, SEO-optimized blog posts that remove all identifying details. Posts publish after a delay (e.g., 21 days), scale into thousands of pages, and can feed dynamic search ads—turning conversations into reusable market-facing assets.
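Two mechanical pieces of this are easy to sketch: a redaction pass over known identifying terms (per the chapter, the heavy lifting of anonymization is done by the LLM; a term list is a belt-and-braces assumption) and the embargoed publish date:

```python
import re
from datetime import date, timedelta

def redact(text: str, sensitive_terms: list[str]) -> str:
    """Replace known identifying terms; the LLM handles subtler cases."""
    for term in sensitive_terms:
        text = re.sub(re.escape(term), "[redacted]", text, flags=re.IGNORECASE)
    return text

def publish_date(call_date: date, delay_days: int = 21) -> date:
    """Posts go live only after the delay window from the chapter."""
    return call_date + timedelta(days=delay_days)
```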
Org design implications and prompting approach: super ICs + guardrails
Matt closes on how automation changes hiring and ownership: fewer “order takers,” more proactive builders and operators, plus a small group of GTM automation orchestrators. For prompting, he uses a guardrail-first method—clearly stating goals and what the model must not do, then iterating toward the desired output.
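The guardrail-first prompting style described above (state the goal, then spell out what the model must not do, then iterate) can be templated. A sketch with illustrative wording:

```python
def guardrail_prompt(goal: str, must_not: list[str], payload: str) -> str:
    """Goal first, hard 'do not' rules second, then the input."""
    rules = "\n".join(f"- Do NOT {rule}" for rule in must_not)
    return f"Goal: {goal}\n\nHard constraints:\n{rules}\n\nInput:\n{payload}"
```

Stating prohibitions explicitly up front tends to constrain the output space before iteration begins.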