The Illusion of Apple's AI Research

The Illusion of Thinking… or Just a PR Stunt?

Welcome to Brain Farts — where critical thinking meets critical gas.

This week, we dive into one of the spiciest tech showdowns of 2025: Apple’s AI research paper *“The Illusion of Thinking.”* The claim? That AI reasoning models can’t actually reason. The reality? It might be Apple’s own reasoning that’s... a little off.

💥 Here’s what we’re unpacking in this episode:

- Did Apple just *accidentally* gaslight the entire AI field?
- Why the timing of the paper — right before WWDC — has conspiracy theorists (and academics) raising eyebrows.
- The role of influencers who hijacked the narrative before anyone had actually read the paper (looking at you, LinkedIn).
- Why Gary Marcus popped champagne — and why researchers like Alex Lawsen immediately put it back in the fridge.
- The *actual* technical flaws in the paper — token limits, impossible puzzles, and unfair grading rubrics.
- What happens when corporate strategy collides with scientific integrity — and who ends up cleaning up the mess.

🤖 Spoiler alert: the real illusion might be thinking Apple was doing this for science.

Whether you're a researcher, a strategist, or just someone who yells at Siri for not setting timers correctly, this episode will make you question how tech narratives are shaped — and who benefits.

🧠 Plus: Our favorite critiques, sharpest clapbacks, and a winged brain cameo that (somehow) still makes more sense than Apple’s evaluation scripts.

---

🎧 Listen now. Think critically. And maybe don’t take every white paper at face value.

Have thoughts? Complaints? Brain farts of your own? Yell into your HomePod mini. We’re listening.