Thinking in Systems: A Primer - by Donella H. Meadows

A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time. The system may be buffeted, constricted, triggered, or driven by outside forces. But the system’s response to these forces is characteristic of itself, and that response is seldom simple in the real world.

PART ONE: System Structure and Behavior

A system isn’t just any old collection of things. A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.

A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.

How to know whether you are looking at a system or just a bunch of stuff: A) Can you identify parts?... and B) Do the parts affect each other?... and C) Do the parts together produce an effect that is different from the effect of each part on its own?... and perhaps D) Does the effect, the behavior over time, persist in a variety of circumstances?

Many of the interconnections in systems operate through the flow of information. Information holds systems together and plays a great role in determining how they operate.

If information-based relationships are hard to see, functions or purposes are even harder. A system’s function or purpose is not necessarily spoken, written, or expressed explicitly, except through the operation of the system. The best way to deduce the system’s purpose is to watch for a while to see how the system behaves. If a frog turns right and catches a fly, and then turns left and catches a fly, and then turns around backward and catches a fly, the purpose of the frog has to do not with turning left or right or backward but with catching flies. If a government proclaims its interest in protecting the environment but allocates little money or effort toward that goal, environmental protection is not, in fact, the government’s purpose. Purposes are deduced from behavior, not from rhetoric or stated goals.

You can understand the relative importance of a system’s elements, interconnections, and purposes by imagining them changed one by one. Changing elements usually has the least effect on the system.

A system generally goes on being itself, changing only slowly if at all, even with complete substitutions of its elements—as long as its interconnections and purposes remain intact.

To ask whether elements, interconnections, or purposes are most important in a system is to ask an unsystemic question. All are essential. All interact. All have their roles. But the least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior. Interconnections are also critically important. Changing relationships usually changes system behavior. The elements, the parts of systems we are most likely to notice, are often (not always) least important in defining the unique characteristics of the system—unless changing an element also results in changing relationships or purpose.

A stock is the foundation of any system. Stocks are the elements of the system that you can see, feel, count, or measure at any given time. A system stock is just what it sounds like: a store, a quantity, an accumulation of material or information that has built up over time. It may be the water in a bathtub, a population, the books in a bookstore, the wood in a tree, the money in a bank, your own self-confidence. A stock does not have to be physical. Your reserve of good will toward others or your supply of hope that the world can be better are both stocks.

Stocks change over time through the actions of a flow. Flows are filling and draining, births and deaths, purchases and sales, growth and decay, deposits and withdrawals, successes and failures. A stock, then, is the present memory of the history of changing flows within the system.

If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems. And if you have had much experience with a bathtub, you understand the dynamics of stocks and flows.

A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate. There’s more than one way to fill a bathtub!
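
The bathtub arithmetic can be sketched in a few lines of Python (an illustrative toy, not a model from the book; the function name and all numbers are my own assumptions):

```python
# Minimal stock-and-flow sketch: a stock updated each step by a
# constant inflow and outflow, floored at zero (an empty tub stays empty).
def simulate_stock(initial, inflow, outflow, steps):
    """Integrate stock(t+1) = stock(t) + inflow - outflow."""
    stock = float(initial)
    history = [stock]
    for _ in range(steps):
        stock = max(0.0, stock + inflow - outflow)
        history.append(stock)
    return history

# Two ways to raise the stock: increase the inflow, or decrease the outflow.
fill_faster = simulate_stock(initial=50, inflow=10, outflow=5, steps=10)
drain_slower = simulate_stock(initial=50, inflow=5, outflow=0, steps=10)
print(fill_faster[-1], drain_slower[-1])  # both end at 100.0
```

Both runs add the same net 5 units per step, so both tubs reach the same level: the stock only "sees" the difference between its flows.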

Stocks generally change slowly, even when the flows into or out of them change suddenly. Therefore, stocks act as delays or buffers or shock absorbers in systems.

Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.

A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.

Balancing feedback loops are goal-seeking or stability-seeking. Each tries to keep a stock at a given value or within a range of values. A balancing feedback loop opposes whatever direction of change is imposed on the system. If you push a stock too far up, a balancing loop will try to pull it back down. If you shove it too far down, a balancing loop will try to bring it back up.

Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.

Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself.
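
A reinforcing loop is easy to see numerically. In this illustrative sketch (the function name and the 7% rate are assumptions, not from the book), the inflow is proportional to the stock itself, which produces exponential growth:

```python
# Reinforcing loop: the stock reproduces itself at a fixed fraction
# per period, so each period's gain is larger than the last.
def reinforcing_loop(stock, growth_rate, steps):
    history = [stock]
    for _ in range(steps):
        stock += growth_rate * stock  # inflow proportional to the stock itself
        history.append(stock)
    return history

pop = reinforcing_loop(100.0, 0.07, 10)  # 7% per period
print(pop[-1])  # roughly doubled after ten periods (the "rule of 70")
```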

One-Stock Systems

A Stock with Two Competing Balancing Loops—a Thermostat

The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system.

A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
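
The thermostat structure, including the need to set the goal above the desired temperature to compensate for the heat leak, can be sketched as follows (all names and parameters here are illustrative assumptions, not the book's):

```python
# Two competing balancing loops: a furnace warms the room toward the
# thermostat setting, while heat leaks out toward the outdoor temperature.
def thermostat(room, setting, outside, furnace_gain, leak_rate, steps):
    history = [room]
    for _ in range(steps):
        heating = furnace_gain * max(0.0, setting - room)  # balancing loop 1
        cooling = leak_rate * (room - outside)             # balancing loop 2
        room += heating - cooling
        history.append(room)
    return history

# With the goal set at the desired 20 degrees, the leak makes the room
# settle below 20; setting the goal a bit higher compensates for the drain.
naive = thermostat(room=10, setting=20, outside=0,
                   furnace_gain=0.5, leak_rate=0.1, steps=200)
compensated = thermostat(room=10, setting=24, outside=0,
                         furnace_gain=0.5, leak_rate=0.1, steps=200)
print(round(naive[-1], 1), round(compensated[-1], 1))  # 16.7 20.0
```

At equilibrium heating equals cooling, so the naive run settles at 20/1.2 ≈ 16.7 degrees: the feedback process falls short of its target unless the goal is adjusted for the drain.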

A Stock with One Reinforcing Loop and One Balancing Loop—Population and Industrial Economy

Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
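
Shifting dominance shows up in even the simplest population model. In this illustrative logistic sketch (parameters are my own, not the book's), the reinforcing birth loop dominates while the stock is small, and the balancing resource-constraint loop takes over as the stock nears capacity:

```python
# Logistic growth: the (1 - stock/capacity) factor is near 1 when the stock
# is small (reinforcing loop dominates) and near 0 as the stock approaches
# capacity (balancing loop dominates).
def logistic(stock, rate, capacity, steps):
    history = [stock]
    for _ in range(steps):
        stock += rate * stock * (1 - stock / capacity)
        history.append(stock)
    return history

pop = logistic(stock=10.0, rate=0.3, capacity=1000.0, steps=60)
# Early steps grow near-exponentially; later, growth slows toward zero
# as the stock levels off just below capacity.
```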

System dynamics models explore possible futures and ask “what if” questions.

A model’s utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.

Systems with similar feedback structures produce similar dynamic behaviors.

A System with Delays—Business Inventory

A delay in a balancing feedback loop makes a system likely to oscillate.

Delays are pervasive in systems, and they are strong determinants of behavior. Changing the length of a delay may (or may not, depending on the type of delay and the relative lengths of other delays) make a large change in the behavior of a system.
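
A small simulation makes the point concrete (an illustrative sketch with assumed names and parameters, not the book's model): the same ordering rule that is smooth with instant delivery oscillates when deliveries lag by three steps:

```python
from collections import deque

# A balancing loop that reorders to keep inventory at a target,
# but whose orders take `delay` steps to be delivered.
def inventory(target, delay, correction, steps, sales=10.0):
    stock = float(target)
    pipeline = deque([sales] * delay)  # orders already in transit
    history = [stock]
    for t in range(steps):
        demand = sales if t < 5 else sales * 1.2  # sales step up at t = 5
        order = demand + correction * (target - stock)  # react to the gap
        pipeline.append(order)
        stock += pipeline.popleft() - demand  # deliveries in, sales out
        history.append(stock)
    return history

smooth = inventory(target=100, delay=0, correction=0.25, steps=40)
wobbly = inventory(target=100, delay=3, correction=0.25, steps=40)
# With no delivery delay the stock tracks the target exactly; with a
# three-step delay the same rule undershoots, then overshoots the target.
```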

Two-Stock Systems

A Renewable Stock Constrained by a Nonrenewable Stock—an Oil Economy

In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.

A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.
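
The arithmetic behind this surprise is the classic lily-pond riddle: if a doubling lily patch fills the pond on day 30, it was only half covered on day 29. A quick illustrative check (function and numbers are my own, not the book's):

```python
# Count doubling periods until a quantity reaches its limit.
def periods_to_fill(limit, start, growth_factor=2.0):
    quantity, periods = start, 0
    while quantity < limit:
        quantity *= growth_factor
        periods += 1
    return periods

full = periods_to_fill(limit=2**30, start=1)      # from one lily pad
half = periods_to_fill(limit=2**30, start=2**29)  # from a half-covered pond
print(full, half)  # prints 30 1
```

Half of the pond's entire capacity is consumed in the final doubling period, which is why the limit arrives in a surprisingly short time.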

Renewable Stock Constrained by a Renewable Stock—a Fishing Economy

Nonrenewable resources are stock-limited. The entire stock is available at once, and can be extracted at any rate (limited mainly by extraction capital). But since the stock is not renewed, the faster the extraction rate, the shorter the lifetime of the resource. Renewable resources are flow-limited. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.

PART TWO: Systems and Us

Why Systems Work So Well

Resilience: Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity. Resilience arises from a rich structure of many feedback loops that can work in different ways to restore a system even after a large perturbation. A single balancing loop brings a system stock back to its desired state. Resilience is provided by several such loops, operating through different mechanisms, at different time scales, and with redundancy—one kicking in if another one fails. There are always limits to resilience. Systems need to be managed not only for productivity or stability, they also need to be managed for resilience—the ability to recover from perturbation, the ability to restore or repair themselves.

Self-Organization: Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify. Even complex forms of self-organization may arise from relatively simple organizing rules—or may not.

Hierarchy: In the process of creating new structures and increasing complexity, one thing that a self-organizing system often generates is hierarchy. Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers. When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.

Why Systems Surprise Us

Everything we think we know about the world is a model. Our models do have a strong congruence with the world. Our models fall far short of representing the real world fully.

System structure is the source of system behavior. System behavior reveals itself as a series of events over time. That’s one reason why systems of all kinds surprise us. We are too fascinated by the events they generate. We pay too little attention to their history. And we are insufficiently skilled at seeing in their history clues to the structures from which behavior and events flow.

Many relationships in systems are nonlinear. Their relative strengths shift in disproportionate amounts as the stocks in the system shift. Nonlinearities in feedback systems produce shifting dominance of loops and many complexities in system behavior.

There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask.

At any given time, the input that is most important to a system is the one that is most limiting.

Any physical entity with multiple inputs and outputs is surrounded by layers of limits.

There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.

When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.

The bounded rationality of each actor in a system—determined by the information, incentives, disincentives, goals, stresses, and constraints impinging on that actor—may or may not lead to decisions that further the welfare of the system as a whole. If they do not, putting new actors into the same system will not improve the system’s performance. What makes a difference is redesigning the system to improve the information, incentives, disincentives, goals, stresses, and constraints that have an effect on specific actors.

System Traps... and Opportunities

Some systems are more than surprising. They are perverse. These are the systems that are structured in ways that produce truly problematic behavior; they cause us great trouble. There are many forms of systems trouble, some of them unique, but many strikingly common. We call the system structures that produce such common patterns of problematic behavior archetypes.

THE TRAP: POLICY RESISTANCE
When various actors try to pull a system stock toward various goals, the result can be policy resistance. Any new policy, especially if it’s effective, just pulls the stock farther from the goals of other actors and produces additional resistance, with a result that no one likes, but that everyone expends considerable effort in maintaining.
THE WAY OUT
Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized—or redefinitions of larger and more important goals that everyone can pull toward together.

THE TRAP: TRAGEDY OF THE COMMONS
When there is a commonly shared resource, every user benefits directly from its use, but shares the costs of its abuse with everyone else. Therefore, there is very weak feedback from the condition of the resource to the decisions of the resource users. The consequence is overuse of the resource, eroding it until it becomes unavailable to anyone.
THE WAY OUT
Educate and exhort the users, so they understand the consequences of abusing the resource. And also restore or strengthen the missing feedback link, either by privatizing the resource so each user feels the direct consequences of its abuse or (since many resources cannot be privatized) by regulating the access of all users to the resource.

THE TRAP: DRIFT TO LOW PERFORMANCE
Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.
THE WAY OUT
Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift toward high performance!
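
The eroding-goals loop can be sketched numerically (an illustrative toy with assumed names and parameters): let the standard drift toward a pessimistically perceived past performance while performance, in turn, chases the falling standard:

```python
# Drift to low performance: a reinforcing loop in which a negatively biased
# perception of performance erodes the standard, and effort follows the goal.
def drift(standard, performance, bias, adjust, steps):
    for _ in range(steps):
        perceived = bias * performance                    # perception bias
        standard += adjust * (perceived - standard)       # goal erodes
        performance += adjust * (standard - performance)  # effort follows goal
    return standard, performance

eroded = drift(standard=100.0, performance=100.0, bias=0.9, adjust=0.2, steps=100)
absolute = drift(standard=100.0, performance=100.0, bias=1.0, adjust=0.2, steps=100)
# With even a 10% negative bias, standard and performance ratchet downward
# together; with an unbiased (absolute) standard, both hold at 100.
```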

THE TRAP: ESCALATION
When the state of one stock is determined by trying to surpass the state of another stock—and vice versa—then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse—because exponential growth cannot go on forever.
THE WAY OUT
The best way out of this trap is to avoid getting in it. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing loop. Or one can negotiate a new system with balancing loops to control the escalation.

THE TRAP: SUCCESS TO THE SUCCESSFUL
If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.
THE WAY OUT
Diversification, which allows those who are losing the competition to get out of that game and start another one; strict limitation on the fraction of the pie any one winner may win (antitrust laws); policies that level the playing field, removing some of the advantage of the strongest players or increasing the advantage of the weakest; policies that devise rewards for success that do not bias the next round of competition.

THE TRAP: SHIFTING THE BURDEN TO THE INTERVENOR
Shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces (or disguises) the symptoms, but does nothing to solve the underlying problem. Whether it is a substance that dulls one’s perception or a policy that hides the underlying trouble, the drug of choice interferes with the actions that could solve the real problem. If the intervention designed to correct the problem causes the self-maintaining capacity of the original system to atrophy or erode, then a destructive reinforcing feedback loop is set in motion. The system deteriorates; more and more of the solution is then required. The system will become more and more dependent on the intervention and less and less able to maintain its own desired state.
THE WAY OUT
Again, the best way out of this trap is to avoid getting in. Beware of symptom-relieving or signal-denying policies or practices that don’t really address the problem. Take the focus off short-term relief and put it on long-term restructuring. If you are the intervenor, work in such a way as to restore or enhance the system’s own ability to solve its problems, then remove yourself. If you are the one with an unsupportable dependency, build your system’s own capabilities back up before removing the intervention. Do it right away. The longer you wait, the harder the withdrawal process will be.

THE TRAP: RULE BEATING
Rules to govern a system can lead to rule beating—perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.
THE WAY OUT
Design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.

THE TRAP: SEEKING THE WRONG GOAL
System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.
THE WAY OUT
Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result.

PART THREE: Creating Change—in Systems and in Our Philosophy

Places to Intervene in a System (in increasing order of effectiveness)

  • Numbers: Constants and parameters such as subsidies, taxes, and standards

  • Buffers: The sizes of stabilizing stocks relative to their flows

  • Stock-and-Flow Structures: Physical systems and their nodes of intersection

  • Delays: The lengths of time relative to the rates of system changes

  • Balancing Feedback Loops: The strength of the feedbacks relative to the impacts they are trying to correct. Examples of strengthening balancing feedback controls to improve a system’s self-correcting abilities include: preventive medicine, exercise, and good nutrition to bolster the body’s ability to fight disease; integrated pest management to encourage natural predators of crop pests; the Freedom of Information Act to reduce government secrecy; monitoring systems to report on environmental damage; protection for whistleblowers.

  • Reinforcing Feedback Loops: The strength of the gain of driving loops. Look for leverage points around birth rates, interest rates, erosion rates, “success to the successful” loops, any place where the more you have of something, the more you have the possibility of having more.

  • Information Flows: The structure of who does and does not have access to information

  • Rules: Incentives, punishments, constraints

  • Self-Organization: The power to add, change, or evolve system structure. The intervention point here is obvious, but unpopular. Encouraging variability and experimentation and diversity means “losing control.” Let a thousand flowers bloom and anything could happen! Who wants that? Let’s play it safe and push this lever in the wrong direction by wiping out biological, cultural, social, and market diversity!

  • Goals: The purpose of the system

  • Paradigms: The mind-set out of which the system—its goals, structure, rules, delays, parameters—arises. How do you change paradigms? You keep pointing at the anomalies and failures in the old paradigm. You keep speaking and acting, loudly and with assurance, from the new one. You insert people with the new paradigm in places of public visibility and power. You don’t waste time with reactionaries; rather, you work with active change agents and with the vast middle ground of people who are open-minded. Systems modelers say that we change paradigms by building a model of the system, which takes us outside the system and forces us to see it whole. I say that because my own paradigms have been changed that way.

  • Transcending Paradigms: People who cling to paradigms (which means just about all of us) take one look at the spacious possibility that everything they think is guaranteed to be nonsense and pedal rapidly in the opposite direction. Surely there is no power, no control, no understanding, not even a reason for being, much less acting, embodied in the notion that there is no certainty in any worldview. But, in fact, everyone who has managed to entertain that idea, for a moment or for a lifetime, has found it to be the basis for radical empowerment. If no paradigm is right, you can choose whatever one will help to achieve your purpose. If you have no idea where to get a purpose, you can listen to the universe.

Living in a World of Systems

We can’t control systems or figure them out. But we can dance with them!

Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity—our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality. Here, as a start-off dancing lesson, are the practices I see my colleagues adopting, consciously or unconsciously, as they encounter new systems.

Get the Beat of the System: Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history. Ask people who’ve been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system—people’s memories are not always reliable when it comes to timing.

Expose Your Mental Models to the Light of Day: When we draw structural diagrams and then write equations, we are forced to make our assumptions visible and to express them with rigor. We have to put every one of our assumptions about the system out where others (and we ourselves) can see them. Our models have to be complete, and they have to add up, and they have to be consistent. Our assumptions can no longer slide around (mental models are very slippery), assuming one thing for purposes of one discussion and something else contradictory for purposes of the next discussion. You don’t have to put forth your mental model with diagrams and equations, although doing so is a good practice. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be. Mental flexibility—the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure—is a necessity when you live in a world of flexible systems.

Honor, Respect, and Distribute Information: I would guess that most of what goes wrong in systems goes wrong because of biased, late, or missing information. If I could, I would add an eleventh commandment to the first ten: Thou shalt not distort, delay, or withhold information. You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.

Use Language with Care and Enrich It with Systems Concepts: The first step in respecting language is keeping it as concrete, meaningful, and truthful as possible—part of the job of keeping information streams clear. The second step is to enlarge language to make it consistent with our enlarged understanding of systems. If the Eskimos have so many words for snow, it’s because they have studied and learned how to use snow. They have turned snow into a resource, a system with which they can dance. The industrial society is just beginning to have and use words for systems, because it is only beginning to pay attention to and use complexity. Carrying capacity, structure, diversity, and even system are old words that are coming to have richer and more precise meanings. New words are having to be invented.

Pay Attention to What Is Important, Not Just What Is Quantifiable: Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models. You’ve already seen the system trap that comes from setting goals around what is easily measured, rather than around what is important. So don’t fall into that trap. Human beings have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector. Be a walking, noisy Geiger counter that registers the presence or absence of quality. If something is ugly, say so. If it is tacky, inappropriate, out of proportion, unsustainable, morally degrading, ecologically impoverishing, or humanly demeaning, don’t let it pass. Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy.

Make Feedback Policies for Feedback Systems: You can imagine why a dynamic, self-adjusting feedback system cannot be governed by a static, unbending policy. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops—loops that alter, correct, and expand loops. These are policies that design learning into the management process.

Go for the Good of the Whole: Remember that hierarchies exist to serve the bottom layers, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole. Don’t, as Kenneth Boulding once said, go to great trouble to optimize something that never should be done at all. Aim to enhance total systems properties, such as growth, stability, diversity, resilience, and sustainability—whether they are easily measured or not.

Listen to the Wisdom of the System: Aid and encourage the forces and structures that help the system run itself. Notice how many of those forces and structures are at the bottom of the hierarchy. Don’t be an unthinking intervenor and destroy the system’s own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what’s already there.

Locate Responsibility in the System: That’s a guideline both for analysis and design. In analysis, it means looking for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside events can be controlled (as in reducing the pathogens in drinking water to keep down incidences of infectious disease). But sometimes they can’t. And sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system. “Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers.

Stay Humble— Stay a Learner: Working with systems, on the computer, in nature, among people, in organizations, constantly reminds me of how incomplete my mental models are, how complex the world is, and how much I don’t know. The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.

Celebrate Complexity: Let’s face it, the universe is messy. It is nonlinear, turbulent, and dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity and uniformity. That’s what makes the world interesting, that’s what makes it beautiful, and that’s what makes it work. We can, and some of us do, celebrate and encourage self-organization, disorder, variety, and diversity.

Expand Time Horizons: When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool just to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term—the whole system.

Defy the Disciplines: Seeing systems whole requires more than being “interdisciplinary,” if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other. Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct. They will have to go into learning mode. They will have to admit ignorance and be willing to be taught, by each other and by the system. It can be done. It’s very exciting when it happens.

Expand the Boundary of Caring: Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all, it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, then systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem.

Don’t Erode the Goal of Goodness: The most damaging example of the systems archetype called “drift to low performance” is the process by which modern industrial culture has eroded the goal of morality. The workings of the trap have been classic, and awful to behold. Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. This is just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are “not news.” They are exceptions. Must have been a saint. Can’t expect everyone to behave like that. And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love. We know what to do about drift to low performance. Don’t weigh the bad news more heavily than the good. And keep standards absolute.