No Priors

No Priors Ep. 21 | With Datadog Co-founder/CEO Olivier Pomel

Elad Gil talks with Datadog co-founder and CEO Olivier Pomel about AI, observability, security, and disciplined hypergrowth strategy.

Elad Gil (host) · Olivier Pomel (guest) · Sarah Guo (host)
Jun 15, 2023 · 44m
Founding story of Datadog and dev–ops collaboration origins
Building an infrastructure company from New York and its advantages
Datadog’s unified observability and security platform and product philosophy
Impact of generative AI and LLMs on workloads, tooling, and observability
Datadog’s approach to LLM/AI monitoring, MLOps/LLMOps, and automation
Expansion into security and rethinking how security outcomes are delivered
M&A strategy, platform integration, and leadership/operating principles for durable growth

In this episode of No Priors, Elad Gil talks with Olivier Pomel, Datadog’s co-founder and CEO, who traces the company’s origins from bridging dev–ops culture gaps to becoming a unified observability and security platform at massive scale. He explains how building from New York, staying close to customer reality, and designing a single integrated platform underpins Datadog’s broad product expansion and efficient, near-profitable growth. A significant portion of the discussion focuses on generative AI: its impact on software workloads, developer productivity, observability, and the emerging LLM tooling stack, as well as Datadog’s cautious, outcome‑driven use of AI in its products. Pomel also details Datadog’s approach to acquisitions, security, customer segmentation, and leadership practices that sustain execution through changing macro and technological environments.

At a glance

WHAT IT’S REALLY ABOUT

Datadog CEO on AI, observability, security, and disciplined hypergrowth strategy

IDEAS WORTH REMEMBERING

7 ideas

Dev–ops collaboration, not metrics, was Datadog’s original core problem to solve.

Datadog began as a way to get development and operations teams seeing the same reality and working together, and only later evolved into the full observability platform most people recognize today.

Operating from New York forced capital efficiency and closer alignment with real customer needs.

Skepticism from Bay Area investors and a smaller local deep-tech pool led Datadog to run near-profitable from early on, focus obsessively on product–market fit, and benefit from higher employee retention versus the Bay Area.

A unified platform is Datadog’s main strategic moat but requires heavy ongoing investment.

Roughly half the company works on the core platform, and every acquisition is re-platformed in year one, which is costly but critical to delivering deeply integrated, end‑to‑end workflows across many product areas.

Generative AI shifts value from writing code to understanding, operating, and securing it.

As developers become far more productive and write more software they understand less deeply, demand grows for tools that help observe, debug, secure, and manage increasingly complex, AI‑augmented systems—exactly the layer Datadog serves.

LLMs open new observability use cases but don’t replace precise numerical methods.

Datadog still relies on classical statistical and numerical models for anomaly detection and alerting, while using LLMs to combine heterogeneous data (metrics, logs, state, docs) into richer insights and explanations where fuzziness is acceptable.

Datadog’s security strategy bets on ubiquity and developer‑first workflows, not just CISO sales.

Pomel argues there’s no shortage of security technology, but poor outcomes; Datadog aims to embed security everywhere via its existing agent footprint and developer adoption, like delivering security as an “IV” rather than organ‑by‑organ injections.

Disciplined profitability mindset and fast iteration on new tech underlie Datadog’s resilience.

The company has always optimized for efficient growth rather than growth at all costs, while now asking teams to accept more experimentation and noise around generative AI to keep pace with a rapidly moving ecosystem.

WORDS WORTH SAVING

5 quotes

The starting point for Datadog was not monitoring or even the cloud; it was, “Let’s get dev and ops on the same page.”

Olivier Pomel

If one person is ten times more productive, they’ll write ten times more stuff—but they’ll understand what they write ten times less.

Olivier Pomel

Everybody’s buying security software. Nobody is more secure as a result.

Olivier Pomel

There are great medicines today for security, but for them to work you need to inject them in every single one of your organs every day. We want to deliver it to you in an IV.

Olivier Pomel

With LLMs we clearly have ignition. We might have liftoff soon. The question is whether we need a second stage.

Olivier Pomel

QUESTIONS ANSWERED IN THIS EPISODE

5 questions

How might Datadog’s approach to AI change if open-source LLMs fully match or surpass proprietary frontier models for most enterprise use cases?

What concrete steps can organizations take today to prepare their observability and security practices for a world where most code is AI‑assisted or AI‑generated?

Where is the line between helpful AI‑driven automation in operations and risky over‑reliance on systems that still have non‑trivial error rates?

How will Datadog balance developer‑first, bottom‑up security adoption with the top‑down purchasing and compliance realities of large enterprises?

If unified observability eventually collapses multiple categories into one, how should buyers rethink their tooling strategies and vendor selection over the next five years?
