Daniel Kahneman is the world’s leading authority on human error. He won the Nobel Prize in Economics in 2002 for his pioneering work – with his late partner, Amos Tversky – applying psychological insights to economic theory, particularly in the areas of judgment and decision-making under uncertainty. Kahneman and Tversky confirmed and examined the mental foibles and biases that beset each of us all day, every day, and how they impact our decisions.
It is one thing to recognize these cognitive difficulties, of course, and quite another actually to do something about them, as Kahneman readily concedes. Research even suggests that being smarter, more aware, and more educated doesn’t help us deal with these cognitive difficulties more effectively. If anything, those traits may make things worse.
For example, this study suggests that, in many instances, smarter people are more vulnerable to thinking errors, even basic ones. Moreover, “people who were aware of their own biases were not better able to overcome them.” That’s generally because smart people are clever enough to concoct plausible justifications for their preconceived notions despite powerful disconfirming evidence.
When we see evidence that tends to confirm what we already believe, we ask if it can be true – a lenient standard. When faced with disconfirming evidence, we ask if it must be true, an entirely different and far more demanding standard, one that makes the evidence much easier to reject.
I once asked Kahneman what we might do to mitigate our inherent weaknesses in this area. He chuckled and replied, “Not much.” He then paused, tilted his head, and added, “Ask the smartest and least empathetic people you know to tear your ideas apart.”
This edition of TBL will focus on where Kahneman’s advice comes from, what it means, and how we might put it into practice.
If you like The Better Letter, please subscribe, share it, and forward it widely. It’s free, there are no ads, and I never sell or give away email addresses.
Thank you for reading.
Addition by Subtraction
Not long after midnight one Friday morning in 2019, two men in a Chevy Cruze were stopped on the Black Bayou Swing Bridge in southwestern Louisiana for a passing boat. The bridge swings open on pontoons to allow vessels to get through on the Intracoastal Waterway. One of the men got out and raised the gate arm to allow the vehicle through. He hopped back in the car and, together, the two Texans sped up the ramp and tried to jump the 165-foot chasm of water.
It was supposed to turn out like this.
It did not. Not even close. The car plunged into the water and both occupants drowned. As they say, at birth, we look like our parents; at death, we look like our decisions.
Some might say it was addition by subtraction.
We humans are shockingly prone to bad ideas, ideas that grow into terrible decisions, and then metastasize into actions that undermine, damage, or even end our lives. We’d all like to think that we’re a lot smarter than the two Texans described above, and most of us surely are, but vanishingly few of us have a consistently good track record of decision-making and none of us is as good as we think we are. None of us is unhurt, unscathed, or unbroken.
As Koen Smets so pithily put it, “We are bamboozled by biases, fooled by fallacies, entrapped by errors, hoodwinked by heuristics, deluded by illusions.” Worst of all, these weaknesses are largely opaque to us. They leave no cognitive trace.
When Thomas Jefferson claimed that “we are not afraid to follow truth wherever it may lead, nor to tolerate any error so long as reason is left free to combat it,” he was speaking aspirationally more than descriptively. As Tolstoy wrote, “everyone thinks of changing the world, but no one thinks of changing himself.” More to the point, we feel what we feel, and feelings can’t be fact-checked, even if we wanted to.
We humans can’t figure out how to get along with our neighbors, and we drown trying to jump a 165-foot gap of water in a car.
There is much more to our story, however. We humans also cure disease, split atoms, and send rockets to Mars.
We are villains and heroes, dullards and geniuses, sinners and saints, common and extraordinary, wrong and right, sometimes all at once, sometimes selectively, and sometimes unknowingly. These often difficult human realities are central to who, what, why, and how we are. There is no avoiding them and no consensus about them.
Some see red flags everywhere. Others see a parade. We are both wrong and right.
The root of this problem should be obvious: We’re human. On most days, all too much of the time, we’re delusional, lazy, partisan, arrogant confabulators. On our best days, when wearing the right sort of spectacles, squinting and tilting our heads just so, we can be observant, efficient, loyal, assertive truth-tellers.
The questions are obvious even if the answers are not. How can we be truth-tellers more often and confabulators less often? How can we go about being less stupid and less wrong? How can we be more like Norman Borlaug (the agronomist whose discoveries sparked the “Green Revolution” and saved perhaps a billion lives) and less like the idiots who drove their Chevy Cruze off the Black Bayou Swing Bridge?
Kahneman offered the beginning of the answer with his pithy and witty response to me: “Ask the smartest and least empathetic people you know to tear your ideas apart.”
As the Nobel laureate Joseph Stiglitz has argued, global standards of living began to improve in the late 18th century and have continued to improve ever since for two primary reasons: the development of science and developments in social organization, most prominently the growth of free-market economies and the rule of law. Significantly, science and the rule of law both include carefully derived and articulated systems and methods of assessing and verifying reality and what it means, while markets provide their own weighing mechanisms.
In his famous 1974 Caltech commencement address, the great physicist Richard Feynman talked about the scientific method — a careful and consistent process designed to root out error — as the best means to achieve progress. Even so, notice what he emphasizes: “The first principle is that you must not fool yourself – and you are the easiest person to fool.”
Accordingly, the scientific method, despite its limitations and shortcomings, provides a solid demonstration that we may do irrational things sometimes, even oft times, but we aren’t irrational through and through. We may do crazy things, but we aren’t crazy. Moreover, it demonstrates that our biases and cognitive impairments can be mitigated, if not overcome.
Unfortunately, science works in a way that is difficult and counterintuitive for us. It offers “proof negative” – addition by subtraction.*
As humans, we want deductive (definitive) proof, but it is rarely available (except in closed systems, like math), so we must settle for inductive (tentative) theories. Induction is the way science works and advances. That’s because science can never fully prove anything. It analyzes the available evidence and, when the force of the evidence is strong enough, makes tentative conclusions, in an effort to ascertain the best available approximation of the truth.
These conclusions are always subject to modification or even outright rejection based upon further evidence gathering. The great value of evidence is not so much that it points toward the correct conclusion (even though it often does) but that it allows us to show that some things are conclusively wrong. Never seeing a black swan among a million swans seen does not prove that all swans are white, while seeing a single black swan (as in Australia) conclusively demonstrates that not all swans are white.
As much as we seek it, confirming evidence may add to the inductive case but doesn’t prove anything definitively. On the other hand, disconfirming evidence dispositively demonstrates what is false. Correlation is not causation and all that. Science progresses not via verification (which can only be inferred) but by falsification (which, if established and itself verified, provides relative certainty only as to what is not true).
Accordingly, we should think scientifically as fully as we can. We should carefully evaluate the available evidence. We should develop tentative conclusions. And we should keep seeking disconfirming evidence. “In all affairs, it's a healthy thing now and then to hang a question mark on the things you have long taken for granted,” as Bertrand Russell explained.
In related news, my reading of the Bible suggests that God doesn’t so much answer our questions as question our answers.
That said, we tend to neglect the limits of induction and jump to overstated conclusions, especially when they are consistent with what we already think. Few papers get published establishing that something doesn’t work. Instead, we tend to spend the bulk of our time looking (and data-mining) for an approach that seems to work or even for evidence we might use to support our preconceived notions. We should be spending much more of our time focused upon a search for disconfirming evidence for what we think.
As the great Charlie Munger famously said, “If you can get good at destroying your own wrong ideas, that is a great gift.” But we don’t often do that. Because of our general loss aversion, losses loom larger than gains – and giving up an idea we hold feels like a loss.
Important new research from a University of Virginia team found that we are systematically biased toward adding rather than subtracting: “people systematically default to searching for additive transformations, and consequently overlook subtractive transformations.”
For example, an incoming leader of a major organization did a listening tour to ask employees and other stakeholders for their ideas to improve and re-shape the company. For every subtractive suggestion, there were eight additive suggestions. “Additive ideas come to mind quickly and easily, but subtractive ideas require more cognitive effort,” explained study co-author Benjamin Converse.
Similarly, over 85,000 self-help books were published in 2019. Nearly all of them told us things to do, buy, or add. On the other hand, it took more than four decades to remove a single insane EPA classification of milk as an “oil,” which required every dairy to spend thousands of dollars on spill-prevention devices, costing consumers billions of dollars over that time (only about 180,000 pages of federal regulation to go).
A related concept is behavioral finance’s lead actor, confirmation bias, whereby we see what we want to see, accept those desires as truth, and act accordingly. Confirmation bias is common, of course, and exists throughout the world at large, not just within behavioral finance. As Annie Duke says, we’re built for false positives.
We quite naturally try to jam facts into our preconceived notions and commitments or simply miscomprehend reality such that we accept a view, no matter how implausible, that sees a different set of alleged facts, “facts” that are used to support what we already believe. So when we grab a glass of what we think is apple juice, take a sip, and find out it’s really ginger ale, we react with disgust, even when we love ginger ale.
On our better days, we might grudgingly concede that we hold views that are wrong. The problem is with providing current examples. We’re lousy at seeking and finding disconfirming information and, thus, at eliminating bad ideas.
Consider this variation of the Wason selection task. Note that the test subjects were told that each of the cards described below has a letter on one side and a number on the other.
Which two cards do you need to turn over to test the following statement? “A card that has a vowel on one side has an even number on the other side.” The four cards are E, K, 4, and 7.
A. E, 4 B. E, 7 C. K, 4 D. K, 7
Most people answer “A” — E and 4 — but that’s wrong. For the posited statement to be true, the E-card must have an even number on the other side of it and the 7-card must have a consonant on the other side. It doesn’t matter what’s on the other side of the 4-card. Yet we turn the 4-card over because we intuitively want confirming (additive) evidence. And we don’t think to turn over the 7-card because we tend not to look for disconfirming (subtractive) evidence, even when it would be “proof negative,” establishing that a given hypothesis is incorrect. In a variety of test environments, fewer than 10 percent of people get the right answer to this type of question.
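The falsification logic can be made concrete with a few lines of code. This is an illustrative sketch (the `could_falsify` helper is my own naming, not part of the original task): a card is worth turning over only if what is hidden on its back could violate the rule.

```python
def could_falsify(visible: str) -> bool:
    """A card can falsify "vowel implies even" only if flipping it
    might reveal a violating pair (a vowel paired with an odd number)."""
    if visible.isalpha():
        # A letter card can only violate the rule if it shows a vowel
        # and hides an odd number.
        return visible in "AEIOU"
    # A number card can only violate the rule if it shows an odd number
    # and hides a vowel.
    return int(visible) % 2 == 1

cards = ["E", "K", "4", "7"]
print([c for c in cards if could_falsify(c)])  # → ['E', '7']
```

Note that the 4-card never appears: whatever is on its back, the rule survives, which is exactly why turning it over is wasted (confirming) effort.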
Addition by subtraction seems to be contrary to our natural defaults.** It explains why many people find it hard to manage hectic workloads and why we continue to damage our planet. We are mostly blind to opportunity costs. As with the old infomercials (“But wait, there’s more!”), when considered more broadly, the addition by subtraction principle works in a surprising array of additional contexts and situations.
The scientific findings are new, but the concept is not. The ancient Chinese philosopher Lao Tzu noted that to obtain knowledge, we add things every day; to attain wisdom, we subtract things every day. The Ten Commandments – from centuries earlier still – proscribe rather than prescribe behavior.
The scientific method, which advances via falsification, dates to the 17th Century. Occam’s Razor (“entities should not be multiplied unnecessarily”) is named for the 14th Century English philosopher and theologian, William of Occam, who embraced it. Occam’s rule of parsimony inspired 19th Century Scottish philosopher, Sir William Hamilton, to link it with the idea of cutting away extraneous material, giving us the modern name for the principle.
In Antifragile, Nassim Taleb made a persuasive case for the “Via Negativa”:
“The principle that we know what is wrong with more clarity than what is right, and that knowledge grows by subtraction.”
This subtractive epistemology is drawn from Proclus and from a similar method used in apophatic theology, which draws on Gregory of Nyssa and Aquinas and focuses on what cannot be said about God. Because understanding God is deemed beyond human capability, the apophatic approach tries to reach God by negation – by focusing on what He is not rather than what He is. Focusing more on what goes wrong and why than on what works – what Harvard’s Atul Gawande calls “the power of negative thinking” – is a modern iteration.
Most people try to improve by addition. I say do the opposite. Remove things. “Filter” is the right word: distill, distill, and keep distilling. We all know when we are eating poorly, reading garbage, or wasting time. Remove what we know is wrong. The more noise we eliminate, the easier it is to find the signal.
Here is a brief top ten list for applying the addition by subtraction principle.
1. Focus on Eliminating Mistakes. This is the most obvious conclusion. That making fewer errors matters more than equivalent advancement is mathematically demonstrable. For example, NFL teams that turn the ball over less win nearly 80 percent of the time. Incentives to eat healthily aren’t as effective as removing bad eating choices from your living space. We all recognize that eliminating mistakes is a good thing. However, as the psychologist Daniel Gilbert has observed, disbelieving is very hard work.
2. Make Fewer Decisions. Since our decision-making is innately flawed and prone to error (on account of both bias and noise), one additional conclusion is that we should make fewer decisions. Kahneman made the following helpful suggestion.
“Algorithms beat individuals about half the time. And they match individuals about half [the] time. There are very few examples of people outperforming algorithms in making predictive judgments. So when there’s the possibility of using an algorithm, people should use it. We have the idea that it is very complicated to design an algorithm. An algorithm is a rule. You can just construct rules.”
We should create automated decision engines and implement them whenever and wherever possible. And if we cannot create an automated, algorithmic response, we should try to simulate one.
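To show how little machinery “an algorithm is a rule” requires, here is a minimal sketch of an equal-weight checklist rule in the spirit of Kahneman’s suggestion. Every criterion, name, and threshold below is a hypothetical illustration, not anything Kahneman specified.

```python
# A fixed rule replaces a judgment call: score an application on
# simple yes/no checks and apply a predetermined cutoff.
# All criteria and thresholds are hypothetical.

def loan_decision(applicant: dict) -> str:
    """Apply an equal-weight checklist and a fixed cutoff."""
    checks = [
        applicant["income"] >= 3 * applicant["monthly_payment"] * 12,
        applicant["years_employed"] >= 2,
        applicant["missed_payments"] == 0,
    ]
    score = sum(checks)  # each criterion counts equally
    return "approve" if score >= 2 else "refer to human review"

print(loan_decision({"income": 90_000, "monthly_payment": 1_000,
                     "years_employed": 5, "missed_payments": 0}))  # → approve
```

The point of the design is that the rule is decided once, in advance, so the decision itself is no longer subject to mood, fatigue, or bias in the moment.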
3. Build “Slack” into Your Life. Slack – the absence of binding constraints on behavior – is a very good thing. When you have sufficient slack, it’s easier to remove ideas, barriers, and stuff that isn’t helpful. And when we have less, we are more productive with what remains. The Biblical concept of Sabbath rest applies here, too, as do secular equivalents.
4. Keep Your Options Open. We should avoid what Amazon calls “walking through a one-way door.” Amazon tries to avoid making “choices that are hard to reverse or extend” – an idea similar to the medical principle, “First, do no harm.”
5. Focus on Simplicity and Elegance. Mathematicians and physicists seek simple elegance. Musicians and playwrights use silence to increase dramatic tension. Steve Jobs and Apple revolutionized the cellphone by removing the physical keyboard. Instagram succeeded only when it was stripped of features and reworked to focus on just one thing: photographs. Marie Kondo implores: “Keep only those things that speak to your heart. Then take the plunge and discard all the rest.”
6. Less is Often More. What isn’t there can be more important than what is. Think “margin for error” or Benjamin Graham’s famous “margin of safety.” It allows for relaxation, rest, play, contemplation, and patience.
The management educator Jim Collins recommends a “stop doing” list.
“A great piece of art is composed not just of what is in the final piece, but equally important, what is not. It is the discipline to discard what does not fit — to cut out what might have already cost days or even years of effort — that distinguishes the truly exceptional artist and marks the ideal piece of work, be it a symphony, a novel, a painting, a company or, most important of all, a life.”
The best book on writing I know (the ubiquitous Strunk & White) made the same point long before Collins and much more economically: “Omit needless words.” Or, as Blaise Pascal famously wrote in a letter three centuries earlier: “Je n’ai fait celle-ci plus longue que parce que je n’ai pas eu le loisir de la faire plus courte” (“I have made this longer than usual because I have not had time to make it shorter”).***
Too many fund choices inhibit 401(k) participation. Costco’s value proposition involves far fewer choices. Less spending = more money. If invested, thanks to the power of compounding, the amount saved grows exponentially over time. And, since debt at interest involves reverse compounding, reducing debt is generally beneficial, too.
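The compounding arithmetic is easy to see with a quick sketch (the monthly savings amount and the return rate are hypothetical):

```python
# Hypothetical illustration: $200 saved each month at a 7% annual
# return, compounded monthly, over 30 years.

def future_value(monthly: float, annual_rate: float, years: int) -> float:
    """Future value of a fixed monthly contribution with monthly compounding."""
    r = annual_rate / 12
    balance = 0.0
    for _ in range(years * 12):
        balance = balance * (1 + r) + monthly
    return balance

contributed = 200 * 12 * 30                # $72,000 put in
grown = future_value(200, 0.07, 30)        # roughly $244,000 out
print(f"contributed ${contributed:,.0f}, grown to ${grown:,.0f}")
```

Run the same arithmetic in reverse on a debt balance and the case for reducing debt makes itself.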
Less is often more.
7. Eliminate Noise. Distinguishing signal from noise can be agonizingly difficult. Given the sheer amount of stuff competing for our attention, eliminating distractions unlikely to provide substantive benefit will improve the likelihood of our success. CNBC is fun and all, but how often does it make us smarter or better? “Noise” is a much broader concept, of course, and an enormous problem. It’s far too big a subject to flesh out here but, since bias and noise are roughly equivalent contributors to error, eliminating noise would have a huge impact.
8. Be Less Sure of Yourself. As my friend Brian Portnoy says (citing Voltaire), “Doubt is not a pleasant condition, but certainty is an absurd one.”
9. Be Less Full of Yourself. Like certainty, ego often gets in the way of our doing and being better. As Brent Beshore advises, to get humility, do service. “The great paradox of life is self-sacrificial service. More I give, with no expectation of reciprocity, the better life goes for others and me.” That idea, consistent with addition by subtraction, is counterintuitive and countercultural. Serve (especially the suffering). Suffering makes us face our fears and shows us how dependent we all are. Humility, including intellectual humility, makes us better. As my friend Mark Bassett says, donating money to help and serve others is great, but if you want to be changed, put your feet where your money is.
10. Beware Over-Reliance on Self-Reliance. To improve our chances of success, we need outside input – Kahneman’s smart, unempathetic critics tearing our ideas apart – and we need what Michael Mauboussin calls the “outside view.” The outside view requires that we expand the reference class beyond our comfort zone and our personal experience; we need a much bigger sample size from which to acquire data. In essence, we need an empowered devil’s advocate or a team committed to careful examination and criticism. We should collaborate – especially with people who have very different ideas (what Kahneman calls “adversarial collaboration”). And we should build in robust accountability mechanisms for ourselves and our overall process.
When we seek to change objects, ideas, and situations, we routinely add incentives and fail to consider what barriers we might remove. Addition by subtraction is perhaps the greatest possible inversion.
When seeking improvement, our natural inclination is to add stuff (more tasks, more content, more objects). We routinely overlook solutions that involve subtraction – doing or having less.
Anna Keichline, Pennsylvania’s first female architect, invented the cement block. She realized the solid blocks that builders had used until her time were wasteful because their strength is in their walls, not their center. So, she made blocks that eliminated the cement center. They were cheaper, lighter, easier to handle and transport, and easier to assemble.
May we do the same kind of thing. Relentlessly.
* The rule of law works that way, too, via an adversarial system that relies upon cross-examination.
** Significantly, there is no reason to think that subtraction is inherently better than addition, even though it is generally cheaper and more efficient. When considering a problem, both subtractive and additive responses and solutions should be considered.
*** However, longer posts get linked to and shared more often.
Totally Worth It
After 71 consecutive issues of TBL without a break, I didn’t get last week’s issue done. Many of you expressed kindness about my failure, and others expressed concern about my absence. All of you who reached out have my thanks.
Welcome, also, to the several hundred new subscribers who signed up at the suggestion of Jonathan V. Last and his Newsletter of Newsletters. I hope to live up to your expectations.
This is the best thing I saw or read since the last TBL; this was also great. The most helpful. The most terrifying. The most impressive. The saddest. The most predictable. The most disappointing. The scariest. The most discouraging. The coolest. The weirdest. The most fun. The most bizarre. The most insane. The most fascinating. Yikes. Karma.
Please contact me via rpseawright [at] gmail [dot] com or on Twitter (@rpseawright) and let me know what you like, what you don’t like, what you’d like to see changed, and what you’d add. Don’t forget to subscribe and share.
Of course, the easiest way to share TBL is simply to forward it to a few dozen of your closest friends.
The last issue of TBL explored the power of randomness in our lives and in the world. Because randomness is so influential, we are better served to focus on our processes more than our outcomes because we have so much greater control of our processes.
Christianity is well-versed in a variation of that idea because it is all about process. Our lives are not our own and we are not promised good outcomes in this life. As Shadrach, Meshach, and Abednego proclaimed when threatened with execution by being burned alive: “King Nebuchadnezzar, we do not need to defend ourselves before you in this matter. If we are thrown into the blazing furnace, the God we serve is able to deliver us from it, and he will deliver us from Your Majesty’s hand. But even if he does not, we want you to know, Your Majesty, that we will not serve your gods or worship the image of gold you have set up.”
This week’s benediction sings of the One who offers hope for those without it.
Thanks for reading.
Issue 72 (July 30, 2021)