Published here October 2003.

Reason 1: Errors and Bias in Judgment

Why do we need decision aids? Can't people make good choices on their own? Like many decision analysts, I was first attracted to the science of decision making by reading about the errors and biases that affect people's judgments. If you're not already familiar with the major results from this fascinating area, this introduction will help you appreciate the value of formal decision-aiding tools. Errors and bias in judgment are the first reason that organizations choose the wrong projects.

The fact that people's intuitive decisions are often strongly and systematically biased has been firmly established over the past 30 years by literally hundreds of empirical studies. Psychologist Daniel Kahneman received the 2002 Nobel Prize in Economics for his work in this area. The conclusion reached by Kahneman and his colleagues is that people use unconscious shortcuts, termed heuristics, to cope with complex decisions. "In general, these heuristics are useful, but sometimes they lead to severe and systematic errors."[1]

Understanding heuristics and the errors they cause is important because it helps us find ways to counteract them. For example, when judging distance people use a heuristic that equates clarity with proximity. The clearer an object, the closer we perceive it to be. Although this heuristic is usually correct, it allows smog to trick us into thinking that objects are more distant than they are. This distortion poses dangers for airlines, so pilots are trained to rely more on instruments than on what they think they see out the cockpit window.

Among the dozens of well-documented heuristics and the errors and biases they produce are the following five categories:

Comfort Zone Biases

People tend to do what's comfortable rather than what's important. People:

• Become attached to the status quo.
• Value things more highly if they already own them.
• Ignore information inconsistent with their current beliefs.
• Fail to learn and correct their beliefs despite strong evidence that they should do so.
• Keep doing the same things, even if they no longer work well.
• Distort their views of reality in order to feel more comfortable.
Perception Biases

People's beliefs are distorted by faulty perceptions. People:

• Anchor on information that is readily available, vivid, or recent.
• Make insufficient adjustments from their initial anchors.
• Ascribe more credibility to data than is warranted.
• Overestimate what they know.
• Underestimate the effort required to complete a difficult task.
• Give different answers to the same question posed in different ways.
Motivation Biases

People's motivations and incentives tend to bias their judgments. People:

• Unconsciously distort judgments to "look good" and "get ahead."
• Take actions as if concerned only with short-term consequences.
• Attribute good outcomes to their own skill, and bad outcomes to others' failures or bad luck.
• Escalate commitments to avoid questioning earlier decisions.
• Favor actions that shield them from potentially unfavorable feedback.
Errors in Reasoning

People use flawed reasoning to reach incorrect conclusions. People:

• Simplify inappropriately.
• Are persuaded by circular reasoning, false analogies, and other fallacious arguments.
• Are surprised by statistically likely "coincidences."
• Base the credibility of an argument on its manner of presentation.
• Abhor risk, yet seek bigger risks to avoid a sure loss.
• Cannot solve even simple probability problems in their heads (see the sketch following this list).
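Two of these reasoning errors can be demonstrated with simple arithmetic. The short Python sketch below is an illustration added here, not a tool from the decision-analysis literature, and its dollar figures are hypothetical. It computes the classic "birthday problem" and compares a sure loss with a statistically equivalent gamble: a group of just 23 people is more likely than not to contain a shared birthday, and most people prefer the gamble to the sure loss even though the expected losses are identical.

    # Coincidences are more likely than intuition suggests: the probability
    # that at least two of n people share a birthday, assuming 365 equally
    # likely birthdays.
    def shared_birthday_probability(n: int) -> float:
        p_all_distinct = 1.0
        for i in range(n):
            p_all_distinct *= (365 - i) / 365
        return 1.0 - p_all_distinct

    for n in (10, 23, 50):
        print(f"{n:2d} people: {shared_birthday_probability(n):.1%}")
    # Prints: 10 people: 11.7%, 23 people: 50.7%, 50 people: 97.0%

    # Risk attitudes flip around sure losses: a certain $500 loss has the
    # same expected value as a 50% chance of losing $1,000, yet most people
    # choose the gamble (figures hypothetical).
    sure_loss = -500
    gamble_expected_value = 0.5 * (-1000) + 0.5 * 0
    print(sure_loss, gamble_expected_value)  # -500 -500.0

In each case the arithmetic is trivial on paper but reliably defeats unaided intuition, which is exactly the point of the list above.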
Groupthink

Group dynamics add additional distortions. Groups:

• "Dive in" without having all of the necessary information.
• Are excessively cautious in sharing data.
• Avoid expressing opposing views.
• Jump to conclusions prematurely or get bogged down trying to reach agreement.
• Create illusions of invulnerability and ignore external views of the morality of their actions.

The following is a summary of the most important of these biases, along with some ideas for reducing their impact.


1. A. Tversky and D. Kahneman, Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, 1987.