Recommendation algorithms are optimized for engagement. Engagement means showing you more of what you already like. More of what confirms what you already believe. More of what keeps you scrolling.
The algorithm knows you. It knows you too well. It shows you a refined version of yourself, over and over, until your world shrinks to a dot.
That’s the curation problem. And it’s getting worse.
The Anti-Curator
What if AI could do the opposite?
Not optimized for engagement. Optimized for expansion.
The anti-curator would:
- Show you things outside your bubble
- Introduce ideas that challenge your assumptions
- Surface content from perspectives you haven’t considered
- Filter for “this will make you think” instead of “this will keep you engaged”
It’s a fundamentally different objective function. Engagement is easy to measure. Expansion is hard.
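The difference between the two objectives can be made concrete with a toy sketch. Everything below is illustrative, not a real recommender: the "embeddings" are hand-made 2-D vectors, and cosine similarity to a user profile stands in for engagement, with dissimilarity as one crude proxy for expansion.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Hypothetical user profile: what this user already likes.
user_profile = (1.0, 0.0)

# Hypothetical candidate items, as toy embeddings.
items = {
    "more_of_the_same": (0.9, 0.1),
    "adjacent_topic":   (0.6, 0.8),
    "outside_bubble":   (-0.2, 1.0),
}

# Engagement objective: rank by similarity to the existing profile.
by_engagement = sorted(items, key=lambda k: cosine(items[k], user_profile),
                       reverse=True)

# Expansion objective: rank by dissimilarity instead.
by_expansion = sorted(items, key=lambda k: cosine(items[k], user_profile))

print(by_engagement)  # ['more_of_the_same', 'adjacent_topic', 'outside_bubble']
print(by_expansion)   # the same list, reversed
```

The engagement ranking buries `outside_bubble` last every time; the expansion ranking surfaces it first. The hard part, of course, is that real "expansion" is not just dissimilarity, which is exactly the measurement problem discussed below.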
Why It Matters
We’re living in filter bubbles we didn’t choose. We think we’re exploring the world, but we’re actually consuming a personalized feed designed to keep us comfortable.
Comfort is the enemy of growth. Growth requires discomfort. Discomfort requires encountering things you didn’t seek out.
An AI that understands your beliefs could systematically challenge them. Not to confuse you, but to make you more right. More nuanced. More aware of what you don’t know.
The Implementation Problem
This is hard to build:
- What distinguishes “expansion” from “noise”?
- How do you measure whether someone learned something versus just got annoyed?
- Who decides what counts as “worth knowing”?
These are real challenges. But the curation problem is real too. And right now, the only thing optimizing for expansion is… luck. Chance encounters. A friend who sends you something outside your lane.
What if AI could manufacture those chance encounters?
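One minimal way to sketch "manufactured chance encounters": reserve a fixed fraction of every feed for items drawn from outside the user's usual lane. This is an assumption-laden toy, not a proposal for how the anti-curator would actually work; the item lists, the `serendipity` quota, and the helper name are all invented for illustration.

```python
import random

def build_feed(in_lane, out_of_lane, size=10, serendipity=0.2, seed=None):
    """Mix mostly-familiar items with a guaranteed quota of unfamiliar ones.

    `serendipity` is the fraction of feed slots reserved for out-of-lane
    items; at least one slot is always reserved.
    """
    rng = random.Random(seed)
    n_out = max(1, int(size * serendipity))  # guaranteed out-of-bubble slots
    n_in = size - n_out
    feed = rng.sample(in_lane, n_in) + rng.sample(out_of_lane, n_out)
    rng.shuffle(feed)  # so the unfamiliar items aren't ghettoized at the end
    return feed

familiar = [f"familiar_{i}" for i in range(20)]
unfamiliar = [f"unfamiliar_{i}" for i in range(20)]

feed = build_feed(familiar, unfamiliar, size=10, serendipity=0.2, seed=0)
print(sum(item.startswith("unfamiliar") for item in feed))  # 2
```

A quota like this is the crudest possible version; a real system would have to pick *which* out-of-lane items to inject, which is the whole "expansion vs. noise" problem again.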
The Vision
The anti-curator isn’t about entertainment. It’s about becoming less wrong.
It would show you:
- The strongest arguments from positions you disagree with
- Facts that challenge your model of the world
- Perspectives from people living different lives
- Ideas that are important regardless of whether they’re engaging
The goal isn’t to make you feel good. The goal is to make you more accurate.
That’s worth building.