Stay dangerously informed.

You can't watch every episode, read every thread, and listen to every take. We do. Then we tell you what's worth your time, what to skip, and what connects across all of it.

The edge goes to whoever's paying attention. Here's what that looks like.

From: Frameup

When training GPT-2 costs $73 and building a team costs zero

20 sources this week, roughly 26 hours of content. Two patterns jumped out. Here's what matters.

What multiple sources are saying

The AI cost collapse is rewriting business models in real time

Karpathy showed GPT-2 training now costs $73. All-In debated what happens when inference hits $0.001/query. Ben Evans connected both into a framework: when AI is cheap, the moat isn't the model — it's the data flywheel and distribution. Worth noting: Ethan Mollick disagrees — argues the real disruption isn't cost but that AI changes which parts of a job are expensive, not which jobs disappear. Four sources, four angles, one conclusion — the economics of intelligence just broke.

Andrej Karpathy · All-In Podcast · Ben Evans · One Useful Thing

Solo founders aren't a meme anymore — they're an asset class

Levelsio crossed $3M ARR with zero employees and $8K/month in infra. Fireship profiled devs shipping production apps in days. Garry Tan shared that YC W25 is 60% solo founders. Lenny's guest from Notion said 30% of internal workflows that used to need purchased software are now built by non-engineers with AI. The pattern: practitioners with public output are becoming the trusted voices.

@levelsio · Fireship · Garry Tan · Lenny's Podcast

Source by source

14 posts + 1 thread + "Let's Build GPT-2" (4 hr tutorial)

GPT-2 training costs $73 now — that's not a typo

Nanochat can now train GPT-2 (the model OpenAI was afraid to release in 2019) for $73 on a single 8xH100 node. Two years ago this cost roughly $50,000. The 700x cost reduction happened through better software, not better hardware. [data]
His thread on AI agents spontaneously self-organizing on Reddit-like platforms got massive traction. Agents were forming teams, specializing, and coordinating — without being explicitly programmed to do so. [novel]
Pushed back hard on the narrative that startups can't compete with big labs: "I heard the same thing when OpenAI started. Google had all the data, all the talent, all the compute. OpenAI had a house in the Mission District." [contrarian]
His prediction: open-source models will match frontier closed models within 12 months, making the "API moat" irrelevant for most use cases. [prediction]
0:00 $73 GPT-2 training breakdown · 2:12:45 AI agents self-organizing on Reddit · 3:31:10 Startups vs. big labs rant

I heard the same thing when OpenAI started. Google had all the data, all the talent, all the compute. OpenAI had a house in the Mission District and a conviction that architecture matters more than scale.

Andrej Karpathy

The $73 data point will show up in pitch decks for months. His contrarian take on startups vs. labs is worth tracking — he's been right about this before.

"E172: AI Inference Economics, TikTok, Macro Outlook" — 1 hr 48 min

The $0.001/query future: who wins when inference is basically free

Friedberg opened with data: inference costs drop roughly 10x every 18 months — faster than Moore's Law. Within 2 years, a GPT-4 quality query will cost a fraction of a cent. [data]
Sacks: "OpenAI is the new Intel. Valuable, yes. But the Dells and HPs captured more of the total value." Chamath disagreed — said OpenAI is more like Microsoft. The 15-minute argument is the highlight. [contrarian]
Chamath's claim: 80% of SaaS companies need to pivot to AI-native or die within 3 years. Sacks: "That's like saying 80% of companies will die because of the internet. True in a sense, but useless as advice." [prediction]
4:22 Inference cost curve data · 18:35 Sacks vs. Chamath: OpenAI as Intel or Microsoft · 32:10 80% of SaaS dies in 3 years claim

OpenAI is the new Intel. Valuable, yes. But the Dells and HPs of the world captured more of the total value. Distribution eats models.

David Sacks

The inference economics segment (first 40 min) is the best discussion of this topic anywhere. The political half (last 50 min) is recycled. Start at 0:00, stop at 40:00.

"Why Everyone Is Wrong About the Recession" — 28 min

The recession indicators are broken — but he said this 3 weeks ago

The yield curve has been inverted for 18 months — the longest inversion without a recession in modern history. His argument: massive fiscal spending is offsetting the monetary tightening. [data]
Corporate earnings "beat expectations," but he shows the trick: companies sandbag guidance, so beats are just lowered bars. Actual earnings growth is flat. [novel]

Good analysis, but 80% of this was covered in his video 3 weeks ago. If you watched that one, skip this.

"Sam Altman: GPT-5, Sora, Board Saga, Elon Musk" — 2 hr 24 min

Sam Altman interview — long on vibes, short on signal

Altman confirms GPT-5 is "really good" and coming this year. No specifics on capabilities or timeline beyond that. [prediction]
The board saga discussion was carefully rehearsed — nothing new beyond what was in the NYT piece last month. [consensus]

2.5 hours for two data points you could get from headlines. Classic Lex — interesting guest, meandering interview.

"How Notion builds product" with Notion CPO — 1 hr 12 min

Notion's CPO on why 30% of internal tools are now built by non-engineers

Notion sees 30% of internal workflows that used to need purchased software now built by non-engineers using AI. They're designing for this shift. [data]
Their hiring bar: "Would this person be in the top 1% of their role at any company?" If not, they wait. They'd rather be understaffed than dilute quality. [novel]
Hot take on AI features: "Most AI features are solutions looking for problems. We only ship AI when it solves something users already struggle with." [contrarian]
12:30 30% of internal tools built by non-engineers · 34:15 Top 1% hiring bar · 52:40 AI feature philosophy

Most AI features are solutions looking for problems. We only ship AI when it solves something users already struggle with.

Notion CPO

The internal tools stat is the headline, but the hiring philosophy and AI feature skepticism are worth the listen if you're building product.

15 more sources in your full briefing

Simon Willison (Worth it) · Acquired (Must read) · @levelsio (Must read) · Garry Tan (Worth it) · Ben Evans (Worth it) · Eric Balchunas (Worth it) · Lyn Alden (Worth it) · No Priors (Worth it) · One Useful Thing (Worth it) · Fireship (Worth it) · Stratechery (Must read) · a16z (Worth it) · Matt Levine (Must read) · Paul Graham (Worth it) · Packy McCormick (Worth it)
View full briefing on web

This week's sharpest take

YC's latest batch is 60% solo founders. Not because founders can't find co-founders — because AI agents replaced the roles co-founders used to fill. The minimum viable team just dropped to one.

Garry Tan · @levelsio · Lenny's Podcast

via Frameup

Turn insights into action

Your voice. Your content.

Generate memos, talking points, and posts — tuned to your voice.

Voice Engine
Formality: Formal
Certainty: Assertive
Energy: Engaged

Generate a novel take on your briefing

**Strategic Intelligence: AI Cost Collapse**

Training costs for GPT-2 have fallen from $50,000 to $73 — a 700x reduction through software optimization.

**Implications:**
• Current AI vendor contracts are materially overpriced
• YC W25 is 60% solo founders — organizational models shifting

**Recommendation:** Initiate vendor contract review immediately.

That took 5 minutes to read.

26 hours of content. Synthesized.

Get your briefing

$15/mo · 7-day free trial