Survivor Bias
Remember the Losers
We are biased thinkers. Not irrational, but efficient. We rely on rules of thumb (heuristics) that work most of the time. But when our biases fail, they fail in predictable ways.
I once tried to catalog common biases and fell into one myself. I assumed my list was complete. It wasn’t. I stopped looking too soon—a very common bias.
One bias we often overlook is survivorship bias, which is perhaps best explained in Jordan Ellenberg's How Not to Be Wrong: The Power of Mathematical Thinking.
During World War II, analysts studied returning bombers to determine where to add armor. They mapped bullet holes and found the heaviest damage on the fuselage. The obvious solution: reinforce the fuselage.
But a group of mathematicians saw the flaw. These planes were the survivors. The planes that didn’t return—those hit in more vulnerable spots—weren’t in the data.
Their recommendation? Armor the engines. Why? Because planes hit in the engine didn’t make it back. They didn’t survive. The Army followed this advice and likely saved thousands of lives.
This mistake shows up everywhere.
Imagine evaluating mutual funds from 2015 to 2025. You track the ones that exist at both points and conclude the category performs well. But what about the funds that disappeared along the way? Include them, and the results may look far worse.
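You can see the effect in a toy simulation. The numbers below are hypothetical, not real fund data: we invent 1,000 funds, give each a noisy annual return, and close any fund whose value falls below half its starting point. Averaging only the survivors paints a rosier picture than averaging everyone.

```python
import random

random.seed(42)

# Hypothetical setup: 1,000 funds, 10 years, noisy ~5% average annual return.
# A fund that drops below half its starting value is closed ("disappears").
funds = []
for _ in range(1000):
    value = 1.0
    alive = True
    for _ in range(10):
        value *= 1 + random.gauss(0.05, 0.20)
        if value < 0.5:
            alive = False
            break  # the fund is liquidated and drops out of the data set
    funds.append((value, alive))

survivors = [v for v, alive in funds if alive]
everyone = [v for v, _ in funds]

# The survivors-only average is always higher, because the cutoff
# removed exactly the funds that did worst.
print(f"Avg final value, survivors only: {sum(survivors) / len(survivors):.2f}")
print(f"Avg final value, all funds:      {sum(everyone) / len(everyone):.2f}")
```

The gap between the two averages is the survivorship bias: the first number is what you compute if you only look at funds that still exist today, the second is what an investor actually faced in advance.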
The pattern is simple: we study winners and ignore losers.
We read about successful entrepreneurs, not failed ones. We admire long-lasting companies, not the many that quietly vanished.
The lesson: if your data set excludes the failures, it is lying to you.
If you want better ideas—about business, investing, or life—remember to include the non-survivors.

