
E147: TED goes woke, Canada's Nazi blunder, AI adds vision, plus: who owns OpenAI?
Jason Calacanis (host), Chamath Palihapitiya (host), David Sacks (host), David Friedberg (host), Coleman Hughes (guest), Narrator
All-In crew blasts TED’s ‘wokeness’, Canada’s Nazi gaffe, AI future
The episode centers on Coleman Hughes’ controversial TED talk advocating colorblindness, detailing how TED staff tried to suppress and condition its release, which the hosts frame as an example of institutional capture and the decline of open discourse. The discussion broadens into how organizations like TED and major corporations are being driven by internal activist employees, and whether such institutions can be reformed. They then dissect Canada’s embarrassing parliamentary ovation for a former Nazi, tying it to incompetence, performative politics, and Western entanglement with far‑right elements in Ukraine.
The conversation shifts to OpenAI’s structure, Sam Altman’s incentives, and the possibility that its nonprofit/“capped return” setup allows long‑term control and value capture while deflecting public backlash. Finally, they explore rapid advances in AI—especially multimodal models and agent-like operating systems—and predict a wholesale rewrite of human‑computer interaction, including AI-driven phone interfaces and ambient assistants.
Key Takeaways
Institutions can be quietly ‘captured’ from the bottom up by activist staff.
TED’s handling of Hughes’ talk shows how a small but vocal internal group can override an organization’s stated mission (ideas worth spreading), pressuring leadership to censor or condition content that merely offends, rather than rebutting it with argument.
Colorblindness still resonates widely, including among many people of color.
Despite being unfashionable in progressive circles, Hughes reports strong positive feedback from Black and immigrant attendees; the hosts argue that treating race as primary identity often inflames tensions and undermines individual merit and shared humanity.
Leaders who won’t enforce core values enable ideological drift.
Chris Anderson is portrayed as intellectually aware of free‑speech principles but unwilling to confront staff, illustrating how leaders can preside over mission drift when they prioritize internal comfort over institutional purpose.
Surface-level ‘anti-fascist’ signaling can coexist with dangerous sloppiness.
Canada’s Nazi ovation is framed as not just a vetting failure but the byproduct of a political culture obsessed with the “current thing” (supporting Ukraine) to the point of ignoring obvious historical context about who fought Russia in WWII.
OpenAI’s capped-return and nonprofit structure likely masks long-term control.
Sacks argues that investors and employees are capped at finite multiples, implying any trillion‑dollar upside reverts to the controlling nonprofit—giving Altman effective long‑run control and value capture while allowing him to say he owns no equity.
LLMs are evolving from chatbots into a new kind of operating system.
Friedberg stresses that multimodal AI (text, vision, audio, code) can sit at the center of decision-making—interpreting inputs, calling tools, and rendering outputs—potentially replacing app grids on phones with agentic, conversational interfaces.
AI will likely reshape memory, productivity, and daily interaction far beyond hype.
From externalized note‑taking and life‑logging to AI‑assisted driving and personalized agents, the hosts see current tools as early proofs that understate how completely AI could restructure work, communication, and personal organization.
Notable Quotes
“Color blindness is the idea that you want to treat people without regard to race, both in your personal lives and in our public policy.”
— Coleman Hughes
“As soon as you concede that there can be some sort of physical harm from engaging with ideas, you give the equivalent of a heckler’s veto… it’s almost like a crybaby’s veto.”
— David Sacks
“TED used to be a place for discourse, and it’s lost that, as have so many other forums for conversation in the society and country today.”
— David Friedberg
“If I had had your talk when I was 20 years old, I could have done so much more… I would have spared myself a lot of self-sabotage.”
— Chamath Palihapitiya
“We assume that if something is offensive by some group, it needs to be suppressed, and obviously as you extend that concept to its extreme, you end up losing many ideas that challenge the current kind of main concept that everyone believes.”
— David Friedberg
Questions Answered in This Episode
How should institutions like TED balance their stated mission of spreading challenging ideas with staff claims of feeling ‘unsafe’ or ‘hurt’ by certain viewpoints?
Is colorblindness a realistic and just policy framework in societies with entrenched racial inequalities, or does it risk ignoring structural disadvantage?
At what point does internal employee activism become ‘institutional capture,’ and what practical steps can leaders take before the organization passes the point of no return?
Does the OpenAI capped-return model represent a more ethical form of value sharing, or is it primarily a sophisticated mechanism for centralized long-term control?
If LLMs evolve into core operating systems for phones and other devices, who should own and govern these AI ‘kernels’ that will mediate so much of human experience?
Transcript Preview
Hey, Coleman. How's it going?
Hey, Coleman. Welcome to the show.
You guys doing?
Hey. How's it going?
It's a real pleasure.
Have you ever heard of this show? (laughs)
Yeah, I have. I- I'm actually a fan. My girlfriend introduced me to the show, like, two years ago, and I've been-
Oh, nice.
... a fan ever since.
Great to meet you.
And, uh, apparently, like many women, she has, like, a... she has a legit concerning obsession with Sacks, uh, but also trumps.
No! Don't say it!
(laughs)
(laughs)
(laughs)
Gosh.
Oh my god. What on earth? Those Sacks fans- Me and Guevara are tilted. Those Sacks fans are crazy. End the episode. End the episode. Shout out, Letty. Shout out, Edith.
Oh my god. Oh my god.
(laughs)
Jesus.
Horrible.
Way to go, Coleman.
(laughs)
You would fit right in here. All right, here we go.
All right. (laughs)
Let me... Let me just... This here is your cold open, folks.
I'm sorry. I- I need... Let me just psychologically explore this before we get into the real substance of it. W- why does she like him so much? I don't understand this. (laughs)
By the way, I think you guys-
What is-
... missed the second half of my statement. I said Sacks and Tremont.
All right, y'all. Let's go.
Oh, shit. Okay, great.
Okay, great. All right. Well, now we're going. Okay.
Let's get to the, let's get to the nitty-gritty issue. Thank you.
(laughs) Thank God.
(laughs)
Okay, here we go.
Oh, thank God.
Three, two...
Let your winners ride. Rain Man, David Sacks. I'm going all in. And I said... We open sourced it to the fans and they've just gone crazy with it. Love you guys. Queen of Quinoa. I'm going all in.
All right, everybody. Welcome back to the All-In Podcast. We have a very full docket today. I thought we'd start with something pretty crazy. There was a really weird, uh, moment last week. TED threw one of its speakers under the bus, so we decided to have him on to talk about the experience. This is the second time they've done it, at least. They did it to Sarah Silverman for doing comedy at TED, because people at TED are a bunch of virtue signaling lunatics, including some of my friends (laughs) who go. But Coleman Hughes, uh, if you don't know him, is a writer and podcaster. He has a, a, a pretty popular podcast called Conversations with Coleman, and he did a talk, which I encourage everybody to watch, at TED, and it's titled A Case for Color Blindness. Uh, we all watched it. It's a very powerful talk, and something weird happened. Coleman, welcome to the program, and, um, maybe you could just share with the audience how you wound up speaking at TED, what the content, uh, of your talk was, briefly, and then the bizarre reaction when they tried to ban and kill your talk post you giving it.

Yeah, so first, really glad to be on, guys. I'm a fan of the pod. So I'll give the short version here. If you want the long version, you can go to the Free Press, where I wrote a big, uh, summary of, of what happened there. Basically, what happened is Chris Anderson invited me to give a TED talk, and, uh, I chose the subject of my upcoming book, which is coming out in February, called The End of Race Politics, and the argument is just essentially color blindness. This is the idea that you want to treat people without regard to race, both in your personal lives and in our public policy, and wherever we have policies that are meant to collect and help the most disadvantaged, we s- we should preferentially use class as a variable rather than race. That's, that's my talk in a nutshell. So, I prepared the talk with the TED team. I got their feedback, edited, curated, et cetera.
Got up there in April, gave the talk. 95% of the people in the audience, it was quite re- well received. Whether or not they agreed with every point, it was... they, uh, well within the bounds of acceptable discourse. There was a very small minority on stage, I could see, that was physically upset by my talk.

On stage?

Now, I, I could see this on stage, yeah, in the moment, but, um, I mean, I'm talking five people in a crowd of almost 2,000. So, I expected that because, you know, color blindness is not in vogue today on, on the left, uh, uh, on, amongst progressives. It's really the idea non grata. And so I was expecting to field some pushback, and I, I talked to some critics and so forth, but what happened is what began as just a few people upset began to spiral into a kind of internal staff meltdown at TED. So, this group called Black@TED asked to speak with me. I agreed, and then they said, "Actually, we don't want to talk to you," and they are an employee group at TED. After the conference, Chris emailed me and said, "Look, I'm getting, um, I'm getting a lot of blowback here internally. There are people saying we shouldn't release your talk at all." And then, over the course of the next month, they came up with a variety of sort of creative solutions about how to release my talk in a way that would appease the woke staffers that really didn't want it to be released at all. And at this point, I had to start kinda sticking up for myself. So first, they wanted to attach, uh, like, a, a debate to the end of my talk and release it as one video, which I felt would really send the wrong message.

Hmm.

It would send the message that, like, this idea can't be heard without the opposing perspective.

Did they tell you what was problematic about your talk?

No.