We have found that people draw on a large set of abilities that are sources of power. The conventional sources of power include deductive logical thinking, analysis of probabilities, and statistical methods. Yet the sources of power that are needed in natural settings are usually not analytical at all--the power of intuition, mental simulation, metaphor, and storytelling.
Features that help define a naturalistic decision-making setting are time pressure, high stakes, experienced decision makers, inadequate information (information that is missing, ambiguous, or erroneous), ill-defined goals, poorly defined procedures, cue learning, context (e.g., higher-level goals, stress), dynamic conditions, and team coordination.
The commanders' secret was that their experience let them see a situation, even a nonroutine one, as an example of a prototype, so they knew the typical course of action right away. Their experience let them identify a reasonable reaction as the first one they considered, so they did not bother thinking of others. They were not being perverse. They were being skillful. We now call this strategy recognition-primed decision making.
The decision makers looked at several options yet never compared any two of them. They thought of the options one at a time, evaluated each in turn, rejected it, and turned to the next most typical rescue technique. We can call this strategy a singular evaluation approach, to distinguish it from comparative evaluation. Singular evaluation means evaluating each option on its own merits, even if we cycle through several possibilities.
Satisficing: selecting the first option that works. Satisficing is different from optimizing, which means trying to come up with the best strategy. Optimizing is hard, and it takes a long time. Satisficing is more efficient. The singular evaluation strategy is based on satisficing.
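The singular evaluation strategy can be pictured as a simple loop. The sketch below is illustrative only, not a formalism from the book; the option list and the `is_workable` test are hypothetical stand-ins for the decision maker's experience-based judgment.

```python
def singular_evaluation(options, is_workable):
    """Satisficing: consider options one at a time, in order of
    typicality, and accept the first one that works."""
    for option in options:       # most typical option first
        if is_workable(option):  # evaluate on its own merits
            return option        # no side-by-side comparison of options
    return None                  # no workable option found

# Hypothetical rescue-planning example: options ordered by typicality.
choice = singular_evaluation(
    ["ladder", "rope", "helicopter"],
    is_workable=lambda opt: opt != "ladder",  # assume the ladder fails
)
print(choice)  # rope: the first workable option, not necessarily the best
```

Note that the loop never holds two options in mind at once, which is exactly what distinguishes singular from comparative evaluation.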
Before we did this study, we believed that novices impulsively jumped at the first option they could think of, whereas experts carefully deliberated about the merits of different courses of action. Now it seemed that it was the experts who could generate a single course of action, while novices needed to compare different approaches. There are times for deliberating about options. Usually these are times when experience is inadequate and logical thinking is a substitute for recognizing a situation as typical.
DEFINING THE RECOGNITION-PRIMED DECISION MODEL
The recognition-primed decision (RPD) model fuses two processes: the way decision makers size up the situation to recognize which course of action makes sense, and the way they evaluate that course of action by imagining it.
One application is to be skeptical of courses in formal methods of decision making. Such courses teach methods that people seldom use.
A second application is to be sensitive to when you need to compare options and when you do not. For many tasks, we are novices, and the rational choice method helps us when we lack the expertise to recognize situations. Sometimes we may need to use formal methods to look at a wide array of alternatives. Other times we may judge that we should rely on our expertise to look in greater depth at a smaller set of alternatives, perhaps only the first one considered.
One final application involves training. The ideas set forth in this chapter imply that we do not make someone an expert through training in formal methods of analysis. Quite the contrary is true, in fact: we run the risk of slowing the development of skills. If the purpose is to train people in time-pressured decision making, we might require that the trainee make rapid responses rather than ponder all the implications. If we can present many situations per hour, several hours a day, for days or weeks, we should be able to improve the trainee's ability to detect familiar patterns. The design of the scenarios is critical, since the goal is to show many common cases to facilitate a recognition of typicality, along with different types of rare cases so trainees will be prepared for these as well.
We can summarize the key features of the RPD model in comparison to the standard advice given to decision makers. The RPD model claims that with experienced decision makers:
The focus is on the way they assess the situation and judge it familiar, not on comparing options.
Courses of action can be quickly evaluated by imagining how they will be carried out, not by formal analysis and comparison.
Decision makers usually look for the first workable option they can find, not the best option.
Since the first option they consider is usually workable, they do not have to generate a large set of options to be sure they get a good one.
They generate and evaluate options one at a time and do not bother comparing the advantages and disadvantages of alternatives.
By imagining the option being carried out, they can spot weaknesses and find ways to avoid these, thereby making the option stronger. Conventional models just select the best, without seeing how it can be improved.
The emphasis is on being poised to act rather than being paralyzed until all the evaluations have been completed.
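The features listed above can be gathered into one illustrative cycle. This is a rough sketch of the RPD idea, not a formal model; `recognize`, `typical_actions`, and `simulate` are hypothetical stand-ins for the decision maker's experience-based judgments.

```python
def rpd_decide(situation, recognize, typical_actions, simulate):
    """Recognize the situation as an instance of a prototype, then
    evaluate candidate actions one at a time by imagining each being
    carried out (mental simulation); act on the first workable one."""
    prototype = recognize(situation)           # judge the situation familiar
    for action in typical_actions(prototype):  # most typical action first
        if simulate(action, situation):        # imagine it carried out
            return action                      # no comparison of alternatives
        # otherwise reject it and consider the next most typical action
    return None                                # recognition did not suffice

# Toy usage with invented judgments: the first typical action that
# survives mental simulation is chosen.
action = rpd_decide(
    "kitchen fire",
    recognize=lambda s: "structure fire",
    typical_actions=lambda p: ["interior attack", "exterior attack"],
    simulate=lambda a, s: a == "exterior attack",
)
print(action)  # exterior attack
```

The loop makes the contrast with the rational-choice model concrete: the work happens in recognition and simulation, not in ranking a slate of options.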
Intuition depends on the use of experience to recognize key patterns that indicate the dynamics of the situation. This is one basis for what we call intuition: recognizing things without knowing how we do the recognizing.
The part of intuition that involves pattern matching and recognition of familiar and typical cases can be trained. If you want people to size up situations quickly and accurately, you need to expand their experience base. One way is to arrange for a person to receive more difficult cases. Another approach is to develop a training program, perhaps with exercises and realistic scenarios, so the person has a chance to size up numerous situations very quickly. A good simulation can sometimes provide more training value than direct experience. A good simulation lets you stop the action, back up to see what went on, and cram many trials together so a person can develop a sense of typicality. Another training strategy is to compile stories of difficult cases and make these the training materials.
This is the premortem exercise: the use of mental simulation to find the flaws in a plan. Our exercise is to ask planners to imagine that it is months into the future and that their plan has been carried out. And it has failed. That is all they know; they have to explain why they think it failed. They have to look for the causes that would make them say, "Of course, it wasn't going to work, because ..." The idea is that they are breaking the emotional attachment to the plan's success by taking on the challenge of showing their creativity and competence by identifying likely sources of breakdown.
Mental simulation lets us explain how events have moved from the past into the present.
Mental simulation lets us project how the present will move into the future.
Constructing a mental simulation involves forming an action sequence in which one state of affairs is transformed into another.
Because of memory limitations, people usually construct mental simulations using around three variables and around six transitions.
It takes a fair amount of experience to construct a useful mental simulation.
Mental simulations can run into trouble when the situation becomes too complicated or when time pressure, noise, or other factors interfere.
Mental simulation can be misleading when a person argues away evidence that challenges the interpretation.
There are methods for improving mental simulations, such as using crystal ball and premortem strategies and decision scenarios.
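The capacity limits noted above (around three variables, around six transitions) can be pictured as a short chain of state transformations. A minimal sketch under assumed names; the limits and the example states are illustrative, not measurements.

```python
MAX_VARIABLES = 3    # rough working-memory limit noted above
MAX_TRANSITIONS = 6

def mental_simulation(initial_state, transitions):
    """Run a short action sequence that transforms one state of
    affairs into another, refusing sequences beyond plausible limits."""
    if len(initial_state) > MAX_VARIABLES:
        raise ValueError("too many variables to track mentally")
    if len(transitions) > MAX_TRANSITIONS:
        raise ValueError("too many transitions to track mentally")
    state = dict(initial_state)
    history = [dict(state)]
    for step in transitions:        # each step transforms the state
        state = step(state)
        history.append(dict(state))
    return history                  # past -> present -> projected future

# Toy example with two variables and two transitions.
states = mental_simulation(
    {"fire": "growing", "crew": "outside"},
    [lambda s: {**s, "crew": "inside"},
     lambda s: {**s, "fire": "contained"}],
)
print(states[-1])  # {'fire': 'contained', 'crew': 'inside'}
```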
Even when decision makers are comparing options and trying to find the best one, they may not be using rational choice strategies such as assessing each option on a common set of criteria. The process may be more like running a mental simulation of each course of action and comparing the emotional reactions--the discomfort, worry, or enthusiasm--that each option produces when it is imagined. De Groot's (1946) study of chess players shows this. The chess grand masters were trying to find the best move in a position, yet they were not comparing the options on a common set of criteria (e.g., center control, defensive security). They were using progressive deepening to imagine how the option would be developed, and forming a judgment and emotional reaction to these potential outcomes.
The standard advice for making better decisions is to identify all the relevant options, define all the important evaluation criteria, weight the importance of each evaluation criterion, evaluate each option on each criterion, tabulate the results, and select the winner. In one form or another, this paradigm finds its way into training programs the world over. Again and again, the message is repeated: careful analysis is good, incomplete analysis is bad. And again and again, the message is ignored; trainees listen dutifully, then leave their classes and act on the first option they think of. The reasons are clear. First, the rigorous, analytical approach cannot be used in most natural settings. Second, the recognitional strategies that take advantage of experience are generally successful, not as a substitute for the analytical methods, but as an improvement on them. The analytical methods are not the ideal; they are the fallback for those without enough experience to know what to do.
Consider which decisions are worth making. When options are very close together in value, we can call this a zone of indifference: the closer together the advantages and disadvantages of competing options, the harder it will be to make a decision but the less it will matter. For these situations, it is probably a waste of time to try to make the best decision. If we can sense that we are within this zone of indifference, we should make the choice any way we can and move on to other matters.
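The standard weighted-criteria procedure, together with a zone-of-indifference check, can be sketched in a few lines. The options, criteria, weights, and the 0.05 threshold below are all invented for illustration.

```python
def rational_choice(options, weights, scores, indifference=0.05):
    """Weight each criterion, score each option on each criterion,
    tabulate weighted totals, and select the winner. If the top two
    totals fall inside the zone of indifference, the extra analysis
    bought little: pick either and move on."""
    totals = {
        opt: sum(weights[c] * scores[opt][c] for c in weights)
        for opt in options
    }
    ranked = sorted(totals, key=totals.get, reverse=True)
    within_zone = (len(ranked) > 1 and
                   totals[ranked[0]] - totals[ranked[1]] <= indifference)
    return ranked[0], totals, within_zone

# Hypothetical example: two apartments scored on two weighted criteria.
best, totals, within_zone = rational_choice(
    options=["apartment A", "apartment B"],
    weights={"price": 0.6, "location": 0.4},
    scores={"apartment A": {"price": 0.9, "location": 0.5},
            "apartment B": {"price": 0.7, "location": 0.8}},
)
```

Here the two totals come out essentially equal, so the zone-of-indifference flag is raised: the tabulation was careful, but the choice hardly matters.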
There is no reason to teach someone to follow the RPD model, since the model is descriptive. It shows what experienced decision makers already do.
Improve decision skills. Because the key to effective decision making is to build up expertise, one temptation is to develop training to teach people to think like experts. But in most settings, this can be too time-consuming and expensive. However, if we cannot teach people to think like experts, perhaps we can teach them to learn like experts. After reviewing the literature, I identified a number of ways that experts in different fields learn (Klein, 1997):
They engage in deliberate practice, so that each opportunity for practice has a goal and evaluation criteria.
They compile an extensive experience bank.
They obtain feedback that is accurate, diagnostic, and reasonably timely.
They enrich their experiences by reviewing prior experiences to derive new insights and lessons from mistakes.
There are many things experts can see that are invisible to everyone else:
Patterns that novices do not notice.
Anomalies--events that did not happen and other violations of expectancies.
The big picture (situation awareness).
The way things work.
Opportunities and improvisations.
Events that either already happened (the past) or are going to happen (the future).
Differences that are too small for novices to detect.
Their own limitations.
Experts perceive a situation as the patterns and relationships that grew out of the past and will grow into the future, not just the cues that exist at the moment. All these are perceived at the same time; all are part of their situation awareness.
Experts can perceive things that are invisible to novices: fine discriminations, patterns, alternate perspectives, missing events, the past and the future, and the process of managing decision-making activities.
Skilled chess players find high-quality moves even under extreme time pressure, and the first moves they consider are typically high quality.
Training to high-skill levels should emphasize perceptual skills, along with mastery of procedures.
The method we have found most powerful for eliciting knowledge is to use stories. If you ask experts what makes them so good, they are likely to give general answers that do not reveal much. But if you can get them to tell you about tough cases, nonroutine events where their skills made the difference, then you have a pathway into their perspective, into the way they are seeing the world. We call this the critical decision method, because it focuses attention on the key judgments and decisions that were made during the incident being described.
WORKING WITH OTHERS
When you communicate intent, you are letting the other team members operate more independently and improvise as necessary. You are giving them a basis for reading your mind more accurately.
There are seven types of information that can help the people receiving a request understand what to do:
The purpose of the task (the higher-level goals).
The objective of the task (an image of the desired outcome).
The sequence of steps in the plan.
The rationale for the plan.
The key decisions that may have to be made.
Antigoals (unwanted outcomes).
Constraints and other considerations.
Not all seven types of information are necessary every time. Instead, the list can be used as a checklist, to determine whether there are any more details to add.
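The seven-part list above can be treated as a literal checklist, for example as a small record whose empty fields show what still needs to be communicated. The field names are my paraphrase of the list, not an established schema.

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class IntentStatement:
    """Checklist for communicating intent; None marks a detail not
    yet provided. Field names paraphrase the seven types above."""
    purpose: Optional[str] = None        # higher-level goals of the task
    objective: Optional[str] = None      # image of the desired outcome
    plan_steps: Optional[str] = None     # sequence of steps in the plan
    rationale: Optional[str] = None      # why this plan
    key_decisions: Optional[str] = None  # decisions that may have to be made
    antigoals: Optional[str] = None      # unwanted outcomes
    constraints: Optional[str] = None    # constraints, other considerations

    def missing(self):
        """Details not yet filled in; not all are always needed."""
        return [f.name for f in fields(self) if getattr(self, f.name) is None]

intent = IntentStatement(purpose="clear the building",
                         objective="everyone out in 10 minutes")
print(intent.missing())  # remaining checklist items to consider
```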
If individuals can form a delusion of rational control over their actions, we should not be surprised to find teams doing the same thing.
The next time you are in a group or take part in a team discussion, consider the possibility that you may be seeing the way your own mind works from the inside. The chaos, the accidents, the inhibited thoughts, the chance connections, the serendipity: that is what is going on inside your own head. We do not realize it because we cannot bring to consciousness all the fluctuations in our brains. We cannot become aware of the thoughts we suppress. Our thinking usually looks so orderly, so purposive, so clean. Watching a team think is perhaps the closest we will get to being inside a mind.
Kenneth Tynan, the British essayist, producer, and playwright, described some advice he had been given: "Never take the anti-intellectual side in an argument. You'll find that most of the people who applaud you will be people you hate."
RATIONAL THINKING AND ANALYSIS
Rational comes from the Latin root ratio, which means "to reckon." To think by reckoning, or calculating, we need to do the following things:
Decompose. We have to analyze a task--break the task, idea, or argument into small units, basic elements--so we can perform different calculations on them. Seeing how to break something into its components is a source of power in its own right.
Decontextualize. Since context adds ambiguity, we must try to find units that are independent of context. We want to represent the important parts of context as additional facts and rules and elements. To accomplish this, we try to find a formal way to represent the world, to treat it as a representation, a picture, a model. We try to build theories and maps to substitute for having a sense of the task or the equipment.
Calculate. We apply a range of formal procedures on the elements, such as deductive rules of logic and statistical analyses.
Describe. All the analyses and representations should be open to public scrutiny.
Rational analysis is a source of power with strengths and weaknesses.
Hyperrationality is the attempt to apply deductive and statistical reasoning and analyses to situations where they do not apply.
Hyperrationality runs into difficulty for a number of reasons:
There are no basic elements.
Rules are ambiguous.
Setting up the calculations requires subjective judgments.
Formal analyses can degenerate into combinatorial explosions.
Trying to conduct a formal analysis can interfere with nonrational forms of thinking.
The features of natural settings usually prevent formal analysis.
Consistency is rarely ensured in natural settings.
DEALING WITH UNCERTAINTY
Highly successful military commanders seem to appreciate the vagaries of chance and do not waste time worrying about details that will not matter. The inference we draw is that although uncertainty is and will be inevitable, it is possible to maintain effective decision making in the face of it.
Our lives are just as governed by superstitions as those of less advanced cultures. The content of the superstitions has changed but not the degree to which they control us. The reason is that for many important aspects of our lives, we cannot pin down the causal relationships. We must act on faith, rumor, and precedent.
Jim Shanteau (1992) has suggested that we will not build up real expertise when:
The domain is dynamic.
We have to predict human behavior.
We have less chance for feedback.
The task does not have enough repetition to build a sense of typicality.
We have fewer trials.
Under these conditions, we should be cautious about assuming that experience translates into expertise. In these sorts of domains, experience would give us smooth routines, showing that we had been doing the job for a while. Yet our expertise might not go much beyond these surface routines; we would not have a chance to develop reliable expertise.
One way to improve performance is to be more careful in considering alternate explanations and diagnoses for a situation. The de minimus error may arise from using mental simulation to explain away cues that are early warnings of a problem. One exercise to correct this tendency is to use the crystal ball technique discussed earlier. The idea is that you can look at the situation, pretend that a crystal ball has shown that your explanation is wrong, and try to come up with a different explanation. Each time you stretch for a new explanation, you are likely to consider more factors, more nuances. This should reduce fixation on a single explanation. The crystal ball method is not well suited for time-pressured conditions. By practicing with it when we have the time, we may learn what it feels like to fixate on a hypothesis. This judgment may help us in situations of time pressure.

The de minimus explanation is not the same as confirmation bias. With the de minimus explanation, the person is aware of disconfirming evidence and may even seek out such evidence but then explains it away. With the confirmation bias, the person chooses to seek confirming evidence that has little diagnostic value (it does not help distinguish between hypotheses) and does not try to obtain diagnostic evidence that might disconfirm the favored hypothesis.
A second application is to accept that some errors are inevitable. In complex situations, no amount of effort will prevent every error.
Decision biases do not seem to explain poor decisions.
Stress does not result in faulty decision-making strategies but may limit the information we can consider in making the decisions.
Most poor decisions may result from having inadequate knowledge and expertise.
Experience does not translate directly into expertise if the domain is dynamic, feedback is inadequate, and the number and variety of experiences are too small.
Expertise depends on perceptual skills. You rarely get someone to jump a skill level by teaching more facts and rules. We can make training more efficient but cannot radically replace the accumulation of experiences.
The computer metaphor of thinking is incomplete. Mechanistic descriptions of skilled problem solving and decision making emphasize the storage, retrieval, and manipulation of data elements. This is one aspect of expertise, and certainly it is relevant to some tasks. But there are other aspects that are important.
Skilled problem solvers and decision makers are themselves scientists and experimenters. They are actively searching for and using stories and analogues, personal as well as borrowed from others, to learn about the important causal factors in their lives.
Skilled problem solvers and decision makers are chameleons. They can simulate all types of events and processes in their heads. They simulate the thinking of other people with whom they come in contact.
The sources of power described in this book operate in ways that are not analytical.
They are generative, channeling the decision making from opportunity to opportunity rather than exhaustively filtering through all the permutations.
They enable the decision maker to redefine goals and also to search for ways to achieve existing goals.
They trade accuracy for speed and therefore allow errors.
They are ways of building a person's experience base. Experience can be codified as stories and analogues.
They can be used in context, with interactive causes.
The sources of power described in this book have limitations as well as strengths. There are additional sources of power, such as analysis and calculation, that break tasks down into abstract elements and perform operations on these elements. In many difficult tasks, we blend the different sources of power and integrate them to fit the needs of the situation. I hope that the crude distinction between analytical and nonanalytical will give way so that we can learn to make more interesting comparisons and connections between the different sources of power.