5 Heuristics and biases
Humans have two separate processes for understanding information, which Kahneman (2013) labels system one and system two. If we are to find common ground, and move our audience to a new understanding for decision-making, we must understand how they think. Intuitive (system one) thinking — impressions, associations, feelings, intentions, and preparations for action — flows effortlessly. This system guides most of our thinking, as illustrated next. Most of us immediately sense emotion from the face below (system one processing), but would need to work hard to mentally calculate 17 × 24 (system two processing).
System one relies on heuristics and is prone to biases. Reflective (system two) thinking, in contrast, is slow, effortful, and deliberate. Both systems run continuously, but system two typically monitors things and steps in only when stakes are high, when we detect an obvious error, or when rule-based reasoning is required. For other examples of this difference, consider Figure 4.2 (system one) and @fig-drownplasticsankey (processing may depend on familiarity with the graphic — an alluvial diagram — and which comparisons are the focus within the graphic).
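To sense the deliberate effort system two demands, consider one way to work the multiplication mentally, holding each partial product in memory along the way:

$$
17 \times 24 = (17 \times 20) + (17 \times 4) = 340 + 68 = 408
$$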
On how humans process information, we have decades of empirical and theoretical research available (Gilovich, Griffin, and Kahneman 2009), and theoretical foundations have long been in place (Miller and Gelman 2020).
Kahneman, Lovallo, and Sibony (2011) give executives ways to guard against some of these biases, pairing a question to ask with a recommended action for each:
| Bias | Question | Action |
|------|----------|--------|
| self-interested biases | Is there any reason to suspect the team making the recommendation of errors motivated by self-interest? | Review the proposal with extra care, especially for overoptimism. |
| the affect heuristic | Has the team fallen in love with its proposal? | Rigorously apply all the quality controls on the checklist. |
| groupthink | Were there dissenting opinions within the team? Were they explored adequately? | Solicit dissenting views, discreetly if necessary. |
| saliency bias | Could the diagnosis be overly influenced by an analogy to a memorable success? | Ask for more analogies, and rigorously analyze their similarity to the current situation. |
| confirmation bias | Are credible alternatives included along with the recommendation? | Request additional options. |
| availability bias | If you had to make this decision in a year's time, what information would you want, and can you get more of it now? | Use checklists of the data needed for each kind of decision. |
| anchoring bias | Where are the numbers from? Can there be … unsubstantiated numbers? … extrapolation from history? … a motivation to use a certain anchor? | Re-anchor with data generated by other models or benchmarks, and request a new analysis. |
| halo effect | Is the team assuming that a person, organization, or approach that is successful in one area will be just as successful in another? | Eliminate false inferences, and ask the team to seek additional comparable examples. |
| sunk-cost fallacy, endowment effect | Are the recommenders overly attached to past decisions? | Consider the issue as if you were a new executive. |
| overconfidence, optimistic biases, competitor neglect | Is the base case overly optimistic? | Have a team build a case taking an outside view: use war games. |
| disaster neglect | Is the worst case bad enough? | Have the team conduct a premortem: imagine that the worst has happened, and develop a story about the causes. |
| loss aversion | Is the recommending team overly cautious? | Align incentives to share responsibility for the risk or to remove risk. |
We increase persuasion by addressing these issues in anticipation, knowing our audience will want these questions answered. It is very hard to remain aware of our own biases, so we need to develop processes that identify them and, most importantly, to get feedback from others to help protect against them. Enlist colleagues to review our work. Present ideas from a neutral perspective; becoming too emotional suggests bias. Make analogies and examples comparable to the proposal. Genuinely admit uncertainty in the proposal, and recognize multiple options. Identify additional data that may provide new insight. Consider multiple anchors in a proposal.