
Build the Truth Stack
In many ways we’ve lost it and now need to build it back.
Don’t allow anyone to monopolize it
Truth can’t survive if one powerful person or group has a monopoly on what we think is true.
Guard it with everything
We lose what we’re not willing to fight for.
Truth Infrastructure: How Societies Stay Sane
Civilizations don’t collapse because they run out of opinions. They collapse because they run out of error correction. Wisdom isn’t a vibe; it’s plumbing. When the pipes that carry truth get clogged—by vanity, censorship, or pure noise—the whole building starts to smell, then burn.
We talk about “the marketplace of ideas” as if truth just naturally emerges from chatter. Markets only work with infrastructure: roads, ledgers, audits, penalties. Culture is no different. If we want a society that finds reality faster than it breaks things, we have to build the truth stack on purpose.
What actually keeps a culture from lying to itself?
Not slogans. Systems. The places that handle danger best—aviation, nuclear power—don’t hire super-geniuses and hope. They build layers that catch human error. Pilots use checklists. Engineers log near-misses. Everyone writes postmortems with the same solemnity a priest reserves for confession. When something goes wrong, the ritual is not “find a scapegoat,” it’s “find the mechanism.”
Good cultures make being wrong cheap and early. Bad cultures make it unthinkable until it’s fatal.
The epistemic stack (a design spec you can implement)
Layer 1: Virtues.
Humility, curiosity, courage. Boring? Only until you need them. Humility admits uncertainty; curiosity seeks disconfirming evidence; courage says “I was wrong” in public. This isn’t moral frosting—it’s operational necessity.
Layer 2: Norms.
Talk like you might be mistaken. Steelman before you critique. Show your sources. Separate people from proposals. If you can’t pass the other side’s Ideological Turing Test, you haven’t heard them; you’ve heard yourself in a different hat.
Layer 3: Mechanisms.
This is the plumbing. Build it, or your values are theater.
- Open data by default. If policy is based on numbers, the numbers are public. No PDF cryptograms; raw tables, code, and assumptions on display. Sunlight kills cargo cults.
- Prediction with skin in the game. Track forecasts for leaders, pundits, agencies. No more cost-free certainty. Even small-stakes prediction markets or scorecards force honesty: if you can’t risk points, don’t risk lives.
- Adversarial collaboration. When experts disagree, make them co-author the experiment or review. You extract truth from the tension instead of extracting tweets.
- Replication bounties. Pay people to replicate high-stakes findings; celebrate corrections as wins, not scandals. Incentives are ethics with receipts.
- Checklists & pre-mortems. Before a big decision, ask, “If this fails embarrassingly in six months, what will the headline say?” Then fix the headline now.
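The "prediction with skin in the game" mechanism above can be sketched as a forecast scorecard. This is a minimal illustration using the Brier score (a standard accuracy measure for probability forecasts, lower is better); the forecaster names and records are invented:

```python
def brier_score(forecast: float, outcome: bool) -> float:
    """Squared error between a probability forecast and what actually happened."""
    return (forecast - (1.0 if outcome else 0.0)) ** 2

def scorecard(records):
    """records: list of (name, forecast_probability, outcome_bool).
    Returns (average_brier, name) pairs, best forecaster (lowest score) first."""
    totals = {}
    for name, p, outcome in records:
        s, n = totals.get(name, (0.0, 0))
        totals[name] = (s + brier_score(p, outcome), n + 1)
    return sorted((s / n, name) for name, (s, n) in totals.items())

# Illustrative data: a confident pundit vs. a well-calibrated hedger.
records = [
    ("pundit_a", 0.90, True),   # confident and right
    ("pundit_a", 0.80, False),  # confident and wrong
    ("pundit_b", 0.60, True),   # hedged and right
    ("pundit_b", 0.40, False),  # hedged and right again
]
for avg, name in scorecard(records):
    print(f"{name}: {avg:.3f}")
```

Note the design point: the confident whiff costs far more than the hedged call, which is exactly the "cost-free certainty" the scorecard is meant to eliminate.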
Layer 4: Institutions.
Plural, competing, and memory-rich. Monopolies on truth rot. Encourage parallel guilds (universities, think tanks, independent auditors) and force regular postmortems after policy faceplants. Amnesty for early whistleblowers; penalties for data hoarders.
Layer 5: Rhythm.
Not everything should move at the speed of the timeline. Create “slow lanes” for high-stakes choices where delay is safety—ethics boards with teeth, cooling-off periods, multi-pass reviews. Speed is intoxicating; rigor is life support.
Status vs. signal
A society obsessed with status performs for applause; a society tuned to signal performs for reality. The first optimizes spectacle and scapegoats. The second optimizes measurement and correction. The internet didn’t break truth; it just made status cheap and signal hard to hear. The fix isn’t to go hushed and monastic; it’s to turn status into signal:
- Reward accurate, early warnings, not just flashy “I told you so” essays after the crash.
- Make credibility cumulative and retractable: gains from correct calls, losses from confident whiffs.
- Turn dissent from treason into duty. A culture with no licensed heretics has no licensed future.
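"Cumulative and retractable" credibility can be sketched with a logarithmic scoring rule. This is one hypothetical scoring scheme, not a prescription from the essay: a 50/50 call leaves credibility unchanged, correct calls earn points, and confident whiffs cost more than confident hits earn.

```python
import math

def update_credibility(cred: float, forecast: float, outcome: bool) -> float:
    """Log-score update (hypothetical rule). The probability assigned to
    what actually happened drives the change: certainty is high-stakes."""
    p = forecast if outcome else 1.0 - forecast
    return cred + math.log2(p) + 1.0  # zero change at p = 0.5

cred = 0.0
cred = update_credibility(cred, 0.9, True)   # confident and right: modest gain
cred = update_credibility(cred, 0.9, False)  # confident and wrong: much larger loss
```

The asymmetry is the point: credibility gained slowly from correct calls can be wiped out by one confident whiff, so certainty becomes something you spend carefully.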
“Won’t this just be gamed?”
Of course. Everything with incentives gets gamed. That’s why you layer defenses. Open data is gamed? Replication catches it. Prediction markets prone to herding? Weight them with historical accuracy. Adversarial collabs too cozy? Rotate teams and publish the disagreements. The goal isn’t a system that can’t be gamed; it’s a system that makes gaming expensive and self-corrects faster than it drifts.
Meaning with guardrails
We need meaning. It’s the story that tells us what counts as a win. But meaning without guardrails becomes purity theater and apocalypse cosplay. Guardrails without meaning become bureaucracy with a pulse. Healthy cultures bind purpose to process: a clear “why” tethered to a ruthless “how do we know?”
When a society does this well, you feel it: debates are sharp but not cruel; leaders change their minds in daylight; failures become architecture, not shame. When a society refuses, you feel that too: nostalgia as policy, vibes as evidence, and reality collecting its bill in famines, panics, and wars.
A simple civic rule you can teach a teenager
- Measure something that matters.
- Make a prediction with a date on it.
- Stake a point of reputation.
- Invite someone who disagrees to co-audit.
- Publish what you learned, including the part that stung.
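The five steps above are concrete enough to encode as a record. A toy sketch, with field names and example values of my own invention:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CivicExperiment:
    metric: str            # 1. measure something that matters
    prediction: str        # 2. a prediction...
    due: date              #    ...with a date on it
    stake_points: int      # 3. reputation staked
    co_auditor: str        # 4. someone who disagrees, invited to co-audit
    lessons: list = field(default_factory=list)  # 5. including the part that stung

    def publish(self) -> str:
        """Render the whole loop, stings included, as one public line."""
        return (f"{self.metric}: predicted '{self.prediction}' by {self.due}, "
                f"staked {self.stake_points} pts, co-audited by {self.co_auditor}; "
                f"learned: {'; '.join(self.lessons) or 'TBD'}")
```

Usage might look like: create the record before you have results, then append lessons and publish after the due date, wins and stings alike.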
Do that in schools, labs, newsrooms, and city councils and you’ll manufacture an unfair advantage: faster contact with the way things are. That’s all truth is in practice: contact. Cultures that maintain that contact get to keep the future. Cultures that don’t are left with stories, until the stories run out.
We don’t need a thousand new opinions. We need instruments. Build the stack, protect the heretics, and keep the pipes clear. Everything you love about civilization—from hot showers to honest courts—rides on whether we treat truth like a public utility or an optional accessory.
Add-on: The Four Predators of the Truth Stack (and quick antidotes)
Scientism (science as totalizing creed)
- Failure mode: Treats models as metaphysics; smuggles “ought” from “is.”
- Antidote: Boundary humility. Every high-stakes brief ships with a Model Risk box: domain, assumptions, failure cases, what the model cannot answer (values, purposes). “Science = method” is our creed; everything else is philosophy or theology—label it.
Dogmatism (ideology with earplugs)
- Failure mode: Purity tests, frozen priors, dissent as treason.
- Antidote: Licensed heresy. Rotate a devil’s-advocate seat; require adversarial collaborations for contested claims; mandate postmortems with named errors. Reward mind-changes in public.
Noise saturation (attention economy eating cognition)
- Failure mode: Outrage throughput > signal; we mistake volume for truth.
- Antidote: Noise budgets. Institutions set a maximum “comm cycles per decision”; individuals keep one slow lane (weekly quiet hour; read before react). Promote a norm of “measure first, opine second.”
Synthetic reality (deepfakes, AI remixes, AR warps)
- Failure mode: Evidence becomes deniable; lies become photoreal.
- Antidote: Provenance by default. Adopt C2PA-style cryptographic signing for images/audio/video; require chain-of-custody logs for journalistic and legal media; force tools to emit detectable watermarks; teach “trust the source trail, not the pixels.”
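The shape of a chain-of-custody log can be sketched in a few lines. Real C2PA uses certificate-based digital signatures embedded in the media; this illustration substitutes an HMAC with a shared demo key purely to show the structure, where each link hashes the media, names the actor and action, and signs over the previous link so edits can’t be silently reordered:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"newsroom-demo-key"  # stand-in for a real private signing key

def provenance_record(media_bytes: bytes, actor: str, action: str,
                      prev_sig: str = "") -> dict:
    """One link in a chain-of-custody log: media hash, who did what,
    and a signature that also covers the previous link's signature."""
    entry = {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "actor": actor,
        "action": action,
        "prev": prev_sig,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["sig"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return entry

def verify(entry: dict) -> bool:
    """Recompute the signature over everything except 'sig' and compare."""
    unsigned = {k: v for k, v in entry.items() if k != "sig"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(entry["sig"], expected)
```

“Trust the source trail, not the pixels” then means: a photo without a verifiable chain of records like these is treated as unattributed, however real it looks.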
Drop-in box: Science vs. Scientism (clarifier)
- Science: a disciplined way of testing claims against the world.
- Scientism: the unfalsifiable belief that only science yields real knowledge.
Policy pledge: “We use science to inform values; we don’t pretend it replaces them.”
Drop-in: Status → Signal Converters (practical)
- Prediction scorecards for pundits/officials (wins raise voice; whiffs lower it).
- Replication bounties for high-impact claims.
- Open data + code for any number used to justify policy.
- Pre-mortems before big bets: “Headline if this fails in 6 months?” Fix it now.
A final thought
Don’t let algorithms set your beliefs: measure something, make a dated prediction, stake a point of reputation, invite a critic to co-audit, publish what stung.