Book cover.

Title of book: Thinking in Systems: A Primer

Author: Donella H. Meadows

Amazon link, Goodreads link

Summary Author: Massimo Curatella

Summary of the book in a tweet

A pillar in the field. Why systems thinking is a more efficient and effective way to think. What defines a system: components, characteristic behaviors, archetypes, mindsets, and models. How to face the most common system traps and, with great elegance and power, how to intervene in a system.

Top 3 takeaways

  1. A system is more than the sum of its parts, and its structure is the source of system behavior, which is often determined by its function or purpose. There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion.
  2. Observe stocks and flows to reveal the behavior of a system over time. System behavior arises from a set of feedback loops. The close interaction between components through feedback loops defines the system's behavior and reveals its purpose or function.
  3. Learn to recognize patterns in order to leverage the change you can promote or facilitate in a system. Forget about control: dance with systems.

Who is the intended audience?

Anybody who is part of a system, that is: everybody.

Top 3 positive points

  • Well-written, logically organized, rich in examples.
  • Inspiring, eye-opening, mind-shifting.
  • The topics treated are widely applicable to any field of knowledge.

Top 3 negative points

  • Stock-and-flow diagrams are not intuitive to understand as static snapshots; the limitation lies in the static medium of the book.
  • The dynamics of a system are best understood through animation or, better, through interactive visual representations.
  • Although of great utility and richness, the “Systems Zoo” covers only a few species.

Misc. comments

Thanks to this book, I now want to read everything else by the same author. It is a great loss for humanity that Donella Meadows is no longer with us.


SUMMARY

Systems

  • A system is more than the sum of its parts.
  • Many of the interconnections in systems operate through the flow of information.
  • The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.
  • System structure is the source of system behavior. System behavior reveals itself as a series of events over time.

Stocks, Flows, and Dynamic Equilibrium

  • A stock is the memory of the history of changing flows within the system.
  • If the sum of inflows exceeds the sum of outflows, the stock level will rise.
  • If the sum of outflows exceeds the sum of inflows, the stock level will fall.
  • If the sum of outflows equals the sum of inflows, the stock level will not change — it will be held in dynamic equilibrium.
  • A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate.
  • Stocks act as delays or buffers or shock absorbers in systems.
  • Stocks allow inflows and outflows to be de-coupled and independent.
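The stock-and-flow rules above can be sketched in a few lines of code. This is a minimal, illustrative simulation (the function name and parameters are mine, not the book's) of a single stock driven by constant inflow and outflow:

```python
def simulate_stock(initial, inflow, outflow, steps):
    """Step a single stock forward: stock changes by (inflow - outflow) each period."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow
        history.append(stock)
    return history

# Inflow exceeds outflow: the stock rises.
rising = simulate_stock(50, inflow=5, outflow=3, steps=10)
# Outflow exceeds inflow: the stock falls.
falling = simulate_stock(50, inflow=3, outflow=5, steps=10)
# Equal flows: dynamic equilibrium -- the level holds wherever it happened to be.
steady = simulate_stock(50, inflow=4, outflow=4, steps=10)
```

Note that the equilibrium case is dynamic: material keeps flowing in and out; only the level is unchanged.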

Feedback Loops

  • A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.
  • Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
  • Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time.
  • The information delivered by a feedback loop—even nonphysical feedback—can affect only future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback.
  • A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
  • Systems with similar feedback structures produce similar dynamic behaviors.

Shifting Dominance, Delays, and Oscillations

  • Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
  • A delay in a balancing feedback loop makes a system likely to oscillate.
  • Changing the length of a delay may make a large change in the behavior of a system.

Scenarios and Testing Models

  • System dynamics models explore possible futures and ask “what if” questions.
  • Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.

Constraints on Systems

  • In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no system can grow forever in a finite environment.
  • Nonrenewable resources are stock-limited.
  • Renewable resources are flow-limited.

Resilience, Self-Organization, and Hierarchy

  • There are always limits to resilience.
  • Systems need to be managed not only for productivity or stability, they also need to be managed for resilience.
  • Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify.
  • Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.

Source of System Surprises

  • Many relationships in systems are nonlinear.
  • There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion.
  • At any given time, the input that is most important to a system is the one that is most limiting.
  • Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
  • There always will be limits to growth.
  • A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time.
  • When there are long delays in feedback loops, some sort of foresight is essential.
  • The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.

Mindsets and Models

  • Everything we think we know about the world is a model.
  • Our models do have a strong congruence with the world.
  • Our models fall far short of representing the real world fully.

Springing the System Traps

Policy Resistance

Trap: When various actors try to pull a system state toward various goals, the result can be policy resistance. Any new policy, especially if it’s effective, just pulls the system state farther from the goals of other actors and produces additional resistance, with a result that no one likes, but that everyone expends considerable effort in maintaining.

The Way Out: Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized—or redefinitions of larger and more important goals that everyone can pull toward together.

The Tragedy of the Commons

Trap: When there is a commonly shared resource, every user benefits directly from its use, but shares the costs of its abuse with everyone else. Therefore, there is very weak feedback from the condition of the resource to the decisions of the resource users. The consequence is overuse of the resource, eroding it until it becomes unavailable to anyone.

The Way Out: Educate and exhort the users, so they understand the consequences of abusing the resource. And also restore or strengthen the missing feedback link, either by privatizing the resource so each user feels the direct consequences of its abuse or (since many resources cannot be privatized) by regulating the access of all users to the resource.

Drift to Low Performance

Trap: Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.

The Way Out: Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Set up a drift toward high performance!

Escalation

Trap: When the state of one stock is determined by trying to surpass the state of another stock—and vice versa—then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse—because exponential growth cannot go on forever.

The Way Out: The best way out of this trap is to avoid getting in it. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing loop. Or one can negotiate a new system with balancing loops to control the escalation.

Success to the Successful

Trap: If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.

The Way Out: Diversification, which allows those who are losing the competition to get out of that game and start another one; strict limitation on the fraction of the pie any one winner may win (antitrust laws); policies that level the playing field, removing some of the advantage of the strongest players or increasing the advantage of the weakest; policies that devise rewards for success that do not bias the next round of competition.

Shifting the Burden to the Intervenor

Trap: Shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces (or disguises) the symptoms, but does nothing to solve the underlying problem. Whether it is a substance that dulls one’s perception or a policy that hides the underlying trouble, the drug of choice interferes with the actions that could solve the real problem. If the intervention designed to correct the problem causes the self-maintaining capacity of the original system to atrophy or erode, then a destructive reinforcing feedback loop is set in motion. The system deteriorates; more and more of the solution is then required. The system will become more and more dependent on the intervention and less and less able to maintain its own desired state.

The Way Out: Again, the best way out of this trap is to avoid getting in. Beware of symptom-relieving or signal-denying policies or practices that don’t really address the problem. Take the focus off short-term relief and put it on long-term restructuring. If you are the intervenor, work in such a way as to restore or enhance the system’s own ability to solve its problems, then remove yourself. If you are the one with an unsupportable dependency, build your system’s own capabilities back up before removing the intervention. Do it right away. The longer you wait, the harder the withdrawal process will be.

Rule Beating

Trap: Rules to govern a system can lead to rule-beating—perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.

The Way Out: Design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.

Seeking the Wrong Goal

Trap: System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.

The Way Out: Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result.

Places to Intervene in a System (in increasing order of effectiveness)

  1. Numbers: Constants and parameters such as subsidies, taxes, and standards
  2. Buffers: The sizes of stabilizing stocks relative to their flows
  3. Stock-and-Flow Structures: Physical systems and their nodes of intersection
  4. Delays: The lengths of time relative to the rates of system changes
  5. Balancing Feedback Loops: The strength of the feedback relative to the impacts they are trying to correct
  6. Reinforcing Feedback Loops: The strength of the gain of driving loops
  7. Information Flows: The structure of who does and does not have access to information
  8. Rules: Incentives, punishments, constraints
  9. Self-Organization: The power to add, change, or evolve system structure
  10. Goals: The purpose of the system
  11. Paradigms: The mindset out of which the system—its goals, structure, rules, delays, parameters—arises
  12. Transcending Paradigms

Guidelines for Living in a World of Systems

  1. Get the beat of the system.
  2. Expose your mental models to the light of day.
  3. Honor, respect, and distribute information.
  4. Use language with care and enrich it with systems concepts.
  5. Pay attention to what is important, not just what is quantifiable.
  6. Make feedback policies for feedback systems.
  7. Go for the good of the whole.
  8. Listen to the wisdom of the system.
  9. Locate responsibility within the system.
  10. Stay humble—stay a learner.
  11. Celebrate complexity.
  12. Expand time horizons.
  13. Defy the disciplines.
  14. Expand the boundary of caring.
  15. Don’t erode the goal of goodness.

NOTES

A note from the editor

Systems thinking is a critical tool in addressing the many environmental, political, social, and economic challenges we face around the world. Systems, big or small, can behave in similar ways, and understanding those ways is perhaps our best hope for making a lasting change on many levels. Once you start to see the events of the day as parts of trends, and those trends as symptoms of underlying system structure, you will be able to consider new ways to manage and new ways to live in a world of complex systems. It is a book for those who want to shape a better future.

Introduction: The Systems Lens

A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time.

On the one hand, we have been taught to analyze, to use our rational ability, to trace direct paths from cause to effect. On the other hand, long before we were educated in rational analysis, we all dealt with complex systems.

A diverse system with multiple pathways and redundancies is more stable and less vulnerable to external shock than a uniform system with little diversity. Don’t put all your eggs in one basket.

Ever since the Industrial Revolution, Western society has benefited from science, logic, and reductionism over intuition and holism.

Because they are embedded in larger systems, however, some of our “solutions” have created further problems. And some problems, those most rooted in the internal structure of complex systems, the real messes, have refused to go away.

That is because they are intrinsically systems problems—undesirable behaviors characteristic of the system structures that produce them.

Words and sentences must, by necessity, come only one at a time in a linear, logical order. Systems happen all at once.

Systems thinkers call these common structures that produce characteristic behaviors “archetypes.” Archetypes are responsible for some of the most intransigent and potentially dangerous problems, but with a little systems understanding they can also be transformed to produce much more desirable behaviors.

At a time when the world is messier, more crowded, more interconnected, more interdependent, and more rapidly changing than ever before, the more ways of seeing, the better.

The systems-thinking lens allows us to reclaim our intuition about whole systems and

  • hone our abilities to understand parts,
  • see interconnections,
  • ask “what-if ” questions about possible future behaviors, and
  • be creative and courageous about system redesign. Then we can use our insights to make a difference in ourselves and our world.

The behavior of a system cannot be known just by knowing the elements of which the system is made.

Part One: System Structure and Behavior

ONE — The Basics

More Than the Sum of Its Parts

A system is an interconnected set of elements that is coherently organized in a way that achieves something.

A system must consist of three kinds of things:

  • Elements,
  • Interconnections,
  • and a function or purpose.

Is there anything that is not a system? Yes—a conglomeration without any particular interconnections or function.

A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.

Look Beyond the Players to the Rules of the Game

Once you start listing the elements of a system, there is almost no end to the process.

THINK ABOUT THIS:

  1. Can you identify parts? . . . and
  2. Do the parts affect each other? . . . and
  3. Do the parts together produce an effect that is different from the effect of each part on its own? . . . and perhaps
  4. Does the effect, the behavior over time, persist in a variety of circumstances?

Information holds systems together and plays a great role in determining how they operate.

A system’s function or purpose is not necessarily spoken, written, or expressed explicitly, except through the operation of the system. Purposes are deduced from behavior, not from rhetoric or stated goals.

A NOTE ON LANGUAGE

The word function is generally used for a nonhuman system, the word purpose for a human one, but the distinction is not absolute since so many systems have both human and nonhuman elements.

System purposes need not be human purposes and are not necessarily those intended by any single actor within the system. Systems can be nested within systems; therefore, there can be purposes within purposes.

Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.

You can understand the relative importance of a system’s elements, interconnections, and purposes by imagining them changed one by one.
A system generally goes on being itself, changing only slowly if at all, even with complete substitutions of its elements—as long as its interconnections and purposes remain intact.

The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.

Changing interconnections in a system can change it dramatically.

Changes in function or purpose also can be drastic.

A change in purpose changes a system profoundly, even if every element and interconnection remains the same.
To ask whether elements, interconnections, or purposes are most important in a system is to ask an unsystemic question. All are essential. All interact. All have their roles.
But the least obvious part of the system, its function or purpose, is often the most crucial determinant of defining the unique characteristics of the system— unless changing an element also results in changing relationships or purpose.

Bathtubs 101—Understanding System Behavior over Time

Storing information means increasing the complexity of the mechanism.

A system stock is just what it sounds like: a store, a quantity, an accumulation of material or information that has built up over time.

A stock is the memory of the history of changing flows within the system.
All system diagrams and descriptions are simplified versions of the real world.

If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems.

A NOTE ON READING GRAPHS OF BEHAVIOR OVER TIME

Systems thinkers use graphs of system behavior to understand trends over time, rather than focusing attention on individual events. The pattern—the shape of the variable line—is important, as are the points at which that line changes shape or direction.

The horizontal axis of time allows you to ask questions about what came before, and what might happen next.

All models, whether mental models or mathematical models, are simplifications of the real world.

  • As long as the sum of all inflows exceeds the sum of all outflows, the level of the stock will rise.
  • As long as the sum of all outflows exceeds the sum of all inflows, the level of the stock will fall.
  • If the sum of all outflows equals the sum of all inflows, the stock level will not change; it will be held in dynamic equilibrium at whatever level it happened to be when the two sets of flows became equal.

The human mind seems to focus more easily on stocks than on flows. On top of that, when we do focus on flows, we tend to focus on inflows more easily than on outflows.

A breakthrough in energy efficiency is equivalent, in its effect on the stock of available oil, to the discovery of a new oil field—although different people profit from it.

A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate. There’s more than one way to fill a bathtub!

A stock takes time to change, because flows take time to flow. Stocks generally change slowly, even when the flows into or out of them change suddenly. Therefore, stocks act as delays or buffers or shock absorbers in systems.

It has taken decades to accumulate the stratospheric pollutants that destroy the earth’s ozone layer; it will take decades for those pollutants to be removed.

Industrialization cannot proceed faster than the rate at which factories and machines can be constructed and the rate at which human beings can be educated to run and maintain them.

Forests can’t grow overnight.

The time lags imposed by stocks allow room to maneuver, to experiment, and to revise policies that aren’t working.

If you have a sense of the rates of change of stocks, you don’t expect things to happen faster than they can happen.

You don’t give up too soon. You can use the opportunities presented by a system’s momentum to guide it toward a good outcome—much as a judo expert uses the momentum of an opponent to achieve his or her own goals.

Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.

Most individual and institutional decisions are designed to regulate the levels in stocks.
Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows.

That means systems thinkers see the world as a collection of “feedback processes.”

How the System Runs Itself— Feedback

It is the consistent behavior pattern over a long period of time that is the first hint of the existence of a feedback loop.

Not all systems have feedback loops.

A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.

Stabilizing Loops— Balancing Feedback

Remember—all system diagrams are simplifications of the real world.

We each choose how much complexity to look at.

This kind of stabilizing, goal-seeking, regulating loop is called a balancing feedback loop.

Balancing feedback loops are goal-seeking or stability-seeking: whenever the stock diverges from the goal, the feedback loop pulls it back toward the goal.

Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.

The presence of a feedback mechanism doesn’t necessarily mean that the mechanism works well.

Runaway Loops— Reinforcing Feedback

This is not simple linear growth; it is not constant over time. The growth of the bank account at low interest rates may look linear in the first few years, but in fact growth goes faster and faster: the more there is, the more is added.

This kind of growth is called “exponential.”

Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself.

This reinforcing feedback loop is the central engine of growth in an economy.
Sometimes I challenge my students to try to think of any human decision that occurs without a feedback loop—that is, a decision that is made without regard to any information about the level of the stock it influences.

The most common “non-feedback” decisions students suggest are falling in love and committing suicide.

HINT ON REINFORCING LOOPS AND DOUBLING TIME

The time it takes for an exponentially growing stock to double is approximately 70 divided by its growth rate (expressed as a percentage). At 7 percent interest per year, money doubles in about 10 years.

When someone tells you that population growth causes poverty, you’ll ask yourself how poverty may cause population growth.
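The doubling time of an exponentially growing stock is commonly approximated as 70 divided by the growth rate in percent (the “rule of 70”). A small sketch (the function name is mine) compares that shortcut with the exact value:

```python
import math

def doubling_time(growth_rate_pct):
    """Compare the rule-of-70 shortcut with the exact doubling time."""
    approx = 70 / growth_rate_pct
    exact = math.log(2) / math.log(1 + growth_rate_pct / 100)
    return approx, exact

approx, exact = doubling_time(7)  # at 7% growth: roughly 10 periods to double
```

For modest growth rates the shortcut tracks the exact value closely, which is why it is such a useful mental tool.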

THINK ABOUT THIS

You’ll be thinking not in terms of a static world, but a dynamic one. You’ll stop looking for who’s to blame; instead, you’ll start asking, “What’s the system?” The concept of feedback opens up the idea that a system can cause its own behavior.

TWO — A Brief Visit to the Systems Zoo

One-Stock Systems

A Stock with Two Competing Balancing Loops—a Thermostat

It’s like trying to keep a bucket full when there’s a hole in the bottom. To make things worse, water leaking out of the hole is governed by a feedback loop; the more water in the bucket, the more the water pressure at the hole increases, so the flow out increases!
With home heating systems, people have learned to set the thermostat slightly higher than the actual temperature they are aiming at.

The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct the behavior that drove the current feedback.

It means there will always be delays in responding.

Many economic models make a mistake in this matter by assuming that consumption or production can respond immediately, say, to a change in price. That’s one of the reasons why real economies tend not to behave exactly like many economic models.

Your mental model of the system needs to include all the important flows, or you will be surprised by the system’s behavior.

A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
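A minimal sketch of that point (parameter names and values are mine, not the book's): a furnace loop pulls the room toward the thermostat setting while a leak pulls it toward the outside temperature. Set exactly at the target, the loop falls short; set a little higher, it compensates for the drain:

```python
def room_temperature(setpoint, outside, steps=50, start=10.0,
                     furnace_gain=0.5, leak_rate=0.1):
    """Two competing balancing loops: furnace (toward setpoint) vs. leak (toward outside)."""
    temp = start
    for _ in range(steps):
        heating = furnace_gain * max(setpoint - temp, 0.0)  # furnace only heats
        leak = leak_rate * (temp - outside)                 # loss through the walls
        temp += heating - leak
    return temp

naive = room_temperature(setpoint=20, outside=10)     # settles below 20
adjusted = room_temperature(setpoint=22, outside=10)  # settles on 20
```

This mirrors the advice about setting the thermostat slightly higher than the temperature you actually want.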

Every balancing feedback loop has its breakdown point, where other loops pull the stock away from its goal more strongly than it can pull back.

Stock with One Reinforcing Loop and One Balancing Loop—Population and Industrial Economy

What happens when a reinforcing and a balancing loop are both pulling on the same stock? This is one of the most common and important system structures. Among other things, it describes every living population and every economy.

Dominance is an important concept in systems thinking.

When one loop dominates another, it has a stronger impact on behavior. Because systems often have several competing feedback loops operating simultaneously, those loops that dominate the system will determine the behavior.

Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
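Shifting dominance can be seen in a toy population model (my own construction, not the book's): births form a reinforcing loop, while deaths strengthen with crowding and form a balancing loop. The reinforcing loop dominates at first, then the balancing loop takes over:

```python
def population(initial, birth_rate, crowding, steps):
    """One stock pulled by a reinforcing loop (births) and a
    balancing loop (deaths) whose strength grows with the stock."""
    pop = initial
    history = [pop]
    for _ in range(steps):
        births = birth_rate * pop      # reinforcing: proportional to the stock
        deaths = crowding * pop * pop  # balancing: per-capita losses rise with pop
        pop += births - deaths
        history.append(pop)
    return history

traj = population(10, birth_rate=0.1, crowding=0.0001, steps=200)
```

Early on, growth is near-exponential; as the balancing loop gains strength, growth slows and the stock levels off near birth_rate / crowding, tracing the classic S-shaped curve.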

Whenever you are confronted with a scenario (and you are, every time you hear about an economic prediction, a corporate budget, a weather forecast, or a future climate projection), you have to decide how good a representation of reality the underlying model is.

  • Are the driving factors likely to unfold this way? (What are birth rate and death rate likely to do?)
  • If they did, would the system react this way? (Do birth and death rates really cause the population stock to behave as we think it will?)
  • What is driving the driving factors? (What affects birth rate? What affects the death rate?)

The first question can’t be answered factually. It’s a guess about the future, and the future is inherently uncertain. Although you may have a strong opinion about it, there’s no way to prove you’re right until the future actually happens.

A systems analysis can test a number of scenarios to see what happens if the driving factors do different things.

That’s usually one purpose of a systems analysis. But you have to be the judge of which scenario, if any, should be taken seriously as a future that might really be possible.

Dynamic systems studies usually are not designed to predict what will happen. Rather, they’re designed to explore what would happen, if a number of driving factors unfold in a range of different ways.

The second question—whether the system really will react this way—is more scientific. It’s a question about how good the model is. Does it capture the inherent dynamics of the system?

Regardless of whether you think the driving factors will do that, would the system behave like that if they did?

System dynamics models explore possible futures and ask “what if” questions.

QUESTIONS FOR TESTING THE VALUE OF A MODEL

  1. Are the driving factors likely to unfold this way?
  2. If they did, would the system react this way?
  3. What is driving the driving factors?

Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.

What is adjusting the inflows and outflows? This is a question about system boundaries. It requires a hard look at those driving factors to see if they are actually independent, or if they are also embedded in the system.

Any long-term model of a real economy should link together the two structures of population and capital to show how they affect each other. The central question of economic development is how to keep the reinforcing loop of capital accumulation from growing more slowly than the reinforcing loop of population growth—so that people are getting richer instead of poorer.

Systems with similar feedback structures produce similar dynamic behaviors.

One of the central insights of systems theory, as central as the observation that systems largely cause their own behavior, is that systems with similar feedback structures produce similar dynamic behaviors, even if the outward appearance of these systems is completely dissimilar.

A System with Delays—Business Inventory

First, there is a perception delay.
Second, there is a response delay.
Third, there is a delivery delay.

Figure 31. Inventory at a car dealership with three common delays now included in the picture—a perception delay, a response delay, and a delivery delay.

There are ways to damp these oscillations in inventory.

A delay in a balancing feedback loop makes a system likely to oscillate.
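A sketch of that claim (the model and names are mine, much simpler than the book's inventory example): a balancing loop that corrects toward a goal using a stale reading of the stock oscillates, while the same loop with fresh information settles immediately:

```python
def delayed_balancing(initial, goal, delay, steps):
    """Balancing loop acting on out-of-date information: each step adds
    the full gap between the goal and the stock as it was `delay` steps ago."""
    history = [float(initial)] * (delay + 1)
    for _ in range(steps):
        perceived = history[-1 - delay]  # stale reading of the stock
        history.append(history[-1] + (goal - perceived))
    return history

fresh = delayed_balancing(50, goal=100, delay=0, steps=10)   # settles at the goal
stale = delayed_balancing(50, goal=100, delay=1, steps=12)   # overshoots, oscillates
```

The delayed loop keeps correcting for a gap that has already closed, so it repeatedly overshoots the goal in both directions.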

Not much happens when the car dealer shortens her perception delay. If anything, the oscillations in the inventory of cars on the lot are a bit worse.

And if, instead of shortening her perception time, the car dealer tries shortening her reaction time—making up perceived shortfalls in two days instead of three—things get very much worse.

This perverse kind of result can be seen all the time: someone trying to fix a system is attracted intuitively to a policy lever that in fact does have a strong effect on the system, and then pulls the lever in the wrong direction.

Delays are pervasive in systems, and they are strong determinants of behavior. Changing the length of a delay may (or may not, depending on the type of delay and the relative lengths of other delays) make a large change in the behavior of a system.

We can’t begin to understand the dynamic behavior of systems unless we know where and how long the delays are.

That very large system, with interconnected industries responding to each other through delays, entraining each other in their oscillations, and being amplified by multipliers and speculators, is the primary cause of business cycles.

Economies are extremely complex systems; they are full of balancing feedback loops with delays, and they are inherently oscillatory. —Jay W. Forrester, 1989

Two-Stock Systems

Therefore, any physical, growing system is going to run into some kind of constraint, sooner or later.

In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.

A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.

In the face of exponential growth of extraction or use, a doubling or quadrupling of the nonrenewable resource gives little added time to develop alternatives.
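This claim can be checked with the standard exhaustion-time formula for extraction that grows exponentially (a generic derivation, not taken from the book; the rates below are illustrative):

```python
import math

def years_to_exhaust(stock, initial_rate, growth=0.07):
    """Time t at which extraction growing as r0 * e^(g*t) uses up `stock`:
    (r0/g) * (e^(g*t) - 1) = stock  =>  t = ln(1 + g*stock/r0) / g"""
    return math.log(1 + growth * stock / initial_rate) / growth
```

With extraction starting at 10 units per year and growing 7% per year, a 1,000-unit stock lasts about 30 years; quadrupling the stock to 4,000 units adds only about 18 more years, not 90.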

The real choice in the management of a nonrenewable resource is whether to get rich very fast or to get less rich but stay that way longer.

Unless, perhaps, the economy can learn to operate entirely from renewable resources.

Renewable Stock Constrained by a Renewable Stock—a Fishing Economy

Nonrenewable resources are stock-limited. Renewable resources are flow-limited.

This renewable resource system can show three sets of possible behaviors:

  • overshoot and adjustment to a sustainable equilibrium,
  • overshoot beyond that equilibrium followed by oscillation around it, and
  • overshoot followed by a collapse of the resource and the industry dependent on the resource.

Neither renewable nor nonrenewable limits to growth allow a physical stock to grow forever, but the constraints they impose are dynamically quite different. The difference comes because of the difference between stocks and flows.
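A minimal logistic-harvest sketch (my own illustrative numbers, not the book’s fishing-economy model) shows how the same renewable-resource structure yields either a sustainable equilibrium or a collapse, depending only on harvest pressure:

```python
def simulate_fishery(harvest_fraction, years=100):
    """Logistic fish regeneration (a flow-limited resource) with a
    harvest proportional to the remaining fish stock."""
    fish = 800.0
    K, r = 1000.0, 0.3        # carrying capacity, max regeneration rate
    for _ in range(years):
        regrowth = r * fish * (1 - fish / K)
        harvest = harvest_fraction * fish
        fish = max(0.0, fish + regrowth - harvest)
    return fish
```

A harvest fraction below the regeneration rate settles at the equilibrium K·(1 − h/r), about 667 for h = 0.1; a fraction above r drives the stock toward zero, taking the industry with it.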

The trick, as with all the behavioral possibilities of complex systems, is to recognize what structures contain which latent behaviors, and what conditions release those behaviors—and, where possible, to arrange the structures and conditions to reduce the probability of destructive behaviors and to encourage the possibility of beneficial ones.

Part Two: Systems and Us

THREE — Why Systems Work So Well

To keep every cog and wheel is the first precaution of intelligent tinkering.

—Aldo Leopold, forester

Placing a system in a straitjacket of constancy can cause fragility to evolve.

—C. S. Holling, ecologist

Resilience: the ability to bounce or spring back into shape, position, etc., after being pressed or stretched. Elasticity. The ability to recover strength, spirits, good humor, or any other aspect quickly. Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.

There can also be a set of feedback loops that can restore or rebuild feedback loops—resilience at a still higher level: meta-resilience.

Even higher meta-meta-resilience comes from feedback loops that can learn, create, design, and evolve ever more complex restorative structures.

The human body is an astonishing example of a resilient system.

There are always limits to resilience.

Ecosystems are also remarkably resilient.

Resilience is not the same thing as being static or constant over time. Resilient systems can be very dynamic and, conversely, systems that are constant over time can be unresilient.

Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.

Many chronic diseases and ecological disasters in many places come from the loss of resilience.

Systems need to be managed not only for productivity or stability, they also need to be managed for resilience—the ability to recover from perturbation, the ability to restore or repair themselves.

Self-Organization

Evolution is governed by definite laws… The discovery of these laws constitutes one of the most important tasks of the future.

This capacity of a system to make its own structure more complex is called self-organization.

Productivity and stability are the usual excuses for turning creative human beings into mechanical adjuncts to production processes.

Self-organization produces heterogeneity and unpredictability.

Koch snowflake. Fractal geometry—a realm of mathematics and art populated by elaborate shapes formed by relatively simple rules.

Similarly, the delicate, beautiful, intricate structure of a stylized fern can be generated by a computer with just a few simple fractal rules. The differentiation of a single cell into a human being probably proceeds by some similar set of geometric rules, basically simple, but generating utter complexity. (It is because of fractal geometry that the average human lung has enough surface area to cover a tennis court.)

Examples of simple organizing rules that have led to self-organizing systems of great complexity.

Systems often have the property of self-organization—the ability to structure themselves, to create new structures, to learn, diversify, and complexify. Even complex forms of self-organization may arise from relatively simple organizing rules—or so it appears.

Hierarchy

This arrangement of systems and subsystems is called a hierarchy.

Complex systems can evolve from simple systems only if there are stable intermediate forms. Among all possible complex forms, hierarchies are the only ones that have had the time to evolve.

People whose thinking has not evolved as fast as the energy economy has may be shocked to discover how dependent they have become on resources and decisions halfway around the world.

Life started with single-cell bacteria, not with elephants.

If a team member is more interested in personal glory than in the team winning, he or she can cause the team to lose.

When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.

Economic overcontrol from the top, in companies and in nations, has caused some of the great catastrophes of history, and those catastrophes are by no means behind us.

To be a highly functional system, hierarchy must balance subsystems and total system—there must be enough central control to achieve coordination toward the large system goal, and enough autonomy to keep all subsystems flourishing, functioning, and self-organizing.

Resilience, self-organization, and hierarchy are three of the reasons dynamic systems can work so well. Promoting or managing for these properties of a system can improve its ability to function well over the long term—to be sustainable.

Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.

FOUR — Why Systems Surprise Us

  1. Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head—my mental models. None of these is or ever will be the real world.
  2. Our models do have a strong congruence with the world. That is why we are such a successful species in the biosphere. Especially complex and sophisticated are the mental models we develop from direct, intimate experience of nature, people, and organizations immediately around us.
  3. However, and conversely, our models fall far short of representing the real world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions or logical conclusions from inaccurate assumptions. Most of us, for instance, are surprised by the amount of growth an exponential process can generate. Few of us can intuit how to damp oscillations in a complex system.

Our knowledge is amazing; our ignorance even more so.

You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities, and delays.

Beguiling Events

Like the tip of an iceberg rising above the water, events are the most visible aspect of a larger complex—but not always the most important.

The behavior of a system is its performance over time.

If the news did a better job of putting events into historical context, we would have better behavior-level understanding, which is deeper than event-level understanding.

When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system.

That’s because long term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.

System structure is the source of system behavior. System behavior reveals itself as a series of events over time.

Systems thinking goes back and forth constantly between structure (diagrams of stocks, flows, and feedback) and behavior (time graphs).

These explanations give you no ability to predict what will happen tomorrow. They give you no ability to change the behavior of the system.

Without seeing how stocks affect their related flows through feedback processes, one cannot understand the dynamics of economic systems or the reasons for their behavior.

Flows go up and down, on and off, in all sorts of combinations, in response to stocks, not to other flows. Behavior-level analysis wouldn’t help you here; you would have to dig into the system’s structure.

That’s why behavior-based econometric models are pretty good at predicting the near-term performance of the economy, quite bad at predicting the longer-term performance, and terrible at telling one how to improve the performance of the economy.

Linear Minds in a Nonlinear World

Nonlinearities are important not only because they confound our expectations about the relationship between action and response. They are even more important because they change the relative strengths of feedback loops. They can flip a system from one mode of behavior to another.

Many relationships in systems are nonlinear. Their relative strengths shift in disproportionate amounts as the stocks in the system shift. Nonlinearities in feedback systems produce shifting dominance of loops and many complexities in system behavior.

Nonexistent Boundaries

Side-effects no more deserve the adjective “side” than does the “principal” effect.

Systems rarely have real boundaries.

There are only boundaries of word, thought, perception, and social agreement—artificial, mental-model boundaries. The greatest complexities arise exactly at boundaries.

Everything physical comes from somewhere, everything goes somewhere, everything keeps moving.

Which is not to say that every model, mental or computer, has to follow each connection until it includes the whole planet.

Clouds are a necessary part of models that describe metaphysical flows.

If we’re to understand anything, we have to simplify, which means we have to make boundaries.

There is no single, legitimate boundary to draw around a system.

There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask.

Systems analysts often fall into the opposite trap: making boundaries too large.

This “my model is bigger than your model” game results in enormously complicated analyses, which produce piles of information that may only serve to obscure the answers to the questions at hand.

The right boundary for thinking about a problem rarely coincides with the boundary of an academic discipline, or with a political boundary.

Boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose. It’s a challenge to stay creative enough to drop the boundaries that worked for the last problem and to find the most appropriate set of boundaries for the next question. It’s also a necessity, if problems are to be solved well.

Layers of Limits

At any given time, the input that is most important to a system is the one that is most limiting.

Growth itself depletes or enhances limits and therefore changes what is limiting.

To shift attention from the abundant factors to the next potential limiting factor is to gain real understanding of, and control over, the growth process.

The growing entity and its limited environment together form a coevolving dynamic system.

For any physical entity in a finite environment, perpetual growth is impossible.

Any physical entity with multiple inputs and outputs is surrounded by layers of limits. There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.

Ubiquitous Delays

We must learn to wait as we learn to create.

A rule of thumb for estimating a delay: ask everyone in the system how long they think the delay is, make your best guess, and then multiply by three.

Delays are ubiquitous in systems. Every stock is a delay.

Most flows have delays—shipping delays, perception delays, processing delays, maturation.

What is a significant delay depends—usually—on which set of frequencies you’re trying to understand.

Delays determine how fast systems can react, how accurately they hit their targets, and how timely is the information passed around a system.

Overshoots, oscillations, and collapses are always caused by delays.

Understanding delays helps one understand why Mikhail Gorbachev could transform the information system of the Soviet Union virtually overnight, but not the physical economy. (That takes decades.) It helps one see why the absorption of East Germany by West Germany produced more hardship over a longer time than the politicians foresaw.

Human fossil-fuel emissions have already induced changes in climate that will not be fully revealed for a generation or two.

When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.

Bounded Rationality

Invisible hand.

By pursuing his own interest he frequently promotes that of society more effectually than when he really intends to promote it.

—Adam Smith, 18th-century political economist.

It would be so nice if the “invisible hand” of the market really did lead individuals to make decisions that add up to the good of the whole.

Unfortunately, the world presents us with multiple examples of people acting rationally in their short-term best interests and producing aggregate results that no one likes.

This is because of what World Bank economist Herman Daly calls the “invisible foot,” or what Nobel Prize–winning economist Herbert Simon calls bounded rationality.

We are not omniscient, rational optimizers, says Simon. Rather, we are blundering “satisficers,” attempting to meet (satisfy) our needs well enough (sufficiently) before moving on to the next decision.

We don’t even make decisions that optimize our own individual good, much less the good of the system as a whole.

Neither of these assumptions stands up long against the evidence.

  • As simulated fishermen, they overfish.
  • As ministers of simulated developing nations, they favor the needs of their industries over the needs of their people.
  • As the upper class, they feather their own nests; as the lower class, they become apathetic or rebellious.
  • So would you.

Seeing how individual decisions are rational within the bounds of the information available does not provide an excuse for narrow-minded behavior.

Taking out one individual from a position of bounded rationality and putting in another person is not likely to make much difference. Blaming the individual rarely helps create a more desirable outcome.

Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires.

INTERLUDE • Electric Meters in Dutch Houses

The difference, it turned out, was in the position of the electric meter.

The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.

FIVE — System Traps . . . and Opportunities

System structures that produce such common patterns of problematic behavior are called archetypes.

The destruction they cause is often blamed on particular actors or events, although it is actually a consequence of system structure.

But system traps can be escaped—by recognizing them in advance and not getting caught in them, or by altering the structure—by reformulating goals, by weakening, strengthening, or altering feedback loops, by adding new feedback loops.

Policy Resistance—Fixes that Fail

Intensification of anyone’s effort leads to the intensification of everyone else’s. It’s hard to reduce intensification. It takes a lot of mutual trust to say, OK, why don’t we all just back off for a while?

The alternative to overpowering policy resistance is so counterintuitive that it’s usually unthinkable. Let go. Give up ineffective policies. Let the resources and energy spent on both enforcing and resisting be used for more constructive purposes.

If you calm down, those who are pulling against you will calm down too.

Harmonization of goals in a system is not always possible, but it’s worth looking for. It can be found only by letting go of narrower goals and considering the long-term welfare of the entire system.

THE TRAP: POLICY RESISTANCE

When various actors try to pull a system stock toward various goals, the result can be policy resistance. Any new policy, especially if it’s effective, just pulls the stock farther from the goals of other actors and produces additional resistance, with a result that no one likes, but that everyone expends considerable effort in maintaining.

THE WAY OUT

Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized—or redefinitions of larger and more important goals that everyone can pull toward together.

The Tragedy of the Commons

The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.
If you think that the reasoning of an exploiter of the commons is hard to understand, ask yourself how willing you are to carpool in order to reduce air pollution or to clean up after yourself whenever you make a mess. The structure of a commons system makes selfish behavior much more convenient and profitable than behavior that is responsible to the whole community and to the future.

There are three ways to avoid the tragedy of the commons.

  • Educate and exhort. Help people to see the consequences of unrestrained use of the commons. Appeal to their morality. Persuade them to be temperate. Threaten transgressors with social disapproval or eternal hellfire.
  • Privatize the commons. Divide it up, so that each person reaps the consequences of his or her own actions. If some people lack the self-control to stay below the carrying capacity of their own private resource, those people will harm only themselves and not others.
  • Regulate the commons. Garrett Hardin calls this option, bluntly, “mutual coercion, mutually agreed upon.” Regulation can take many forms, from outright bans on certain behaviors to quotas, permits, taxes, incentives. To be effective, regulation must be enforced by policing and penalties.

THE TRAP: TRAGEDY OF THE COMMONS

When there is a commonly shared resource, every user benefits directly from its use but shares the costs of its abuse with everyone else. Therefore, there is very weak feedback from the condition of the resource to the decisions of the resource users. The consequence is the overuse of the resource, eroding it until it becomes unavailable to anyone.

THE WAY OUT

Educate and exhort the users, so they understand the consequences of abusing the resource. Also, restore or strengthen the missing feedback link, either by privatizing the resource so each user feels the direct consequences of its abuse or (since many resources cannot be privatized) by regulating the access of all users to the resource.
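The weak feedback at the heart of this trap can be sketched in a toy pasture model (my own invented numbers: each herder captures the full benefit of another cow while the damage is shared, so herds keep growing past the regeneration capacity):

```python
def graze_commons(n_herders=10, rounds=30):
    """Shared pasture: every herder keeps adding cows because each gains
    a whole cow's worth of milk but bears only 1/n of the grass damage."""
    grass, cows = 100.0, 10.0
    history = []
    for _ in range(rounds):
        cows += n_herders * 0.5          # each herder adds half a cow per round
        regrowth = 0.2 * grass * (1 - grass / 100.0)   # logistic regeneration
        grass = max(0.0, grass + regrowth - 0.3 * cows)
        history.append(grass)
    return history
```

Grazing pressure grows without any feedback from the pasture’s condition, so consumption eventually outruns the maximum regeneration rate and the commons collapses to zero—the erosion described in the trap above.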

Drift to Low Performance

The actor tends to believe bad news more than good news. The desired state of the system is influenced by the perceived state.

This trap is known as “eroding goals.” It is also called the “boiled frog syndrome.”

THE TRAP: DRIFT TO LOW PERFORMANCE

Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.

THE WAY OUT

Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift toward high performance!
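The eroding-goals loop and its escape are easy to demonstrate in a toy model (my own parameters: a pessimistic perception bias and a gradual adjustment of performance toward the goal):

```python
def final_performance(absolute_goal=None, bias=0.98, adjust=0.25, steps=100):
    """If the goal tracks (pessimistically) perceived past performance,
    performance ratchets downward; an absolute goal holds it steady."""
    performance = 100.0
    for _ in range(steps):
        perceived = bias * performance     # bad news believed more than good
        goal = perceived if absolute_goal is None else absolute_goal
        performance += adjust * (goal - performance)
    return performance
```

With the goal chasing biased perception, performance drifts down to roughly 60% of its starting level; pinning the standard at an absolute 100 removes the reinforcing loop entirely.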

Escalation

The only other graceful way out of the escalation system is to negotiate disarmament. That’s a structural change, an exercise in system design.

THE TRAP: ESCALATION

When the state of one stock is determined by trying to surpass the state of another stock—and vice versa—then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse—because exponential growth cannot go on forever.

THE WAY OUT

The best way out of this trap is to avoid getting in it. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing loop. Or one can negotiate a new system with balancing loops to control the escalation.
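The exponential character of escalation is visible in a two-line toy model (the 10% “edge” is an arbitrary illustrative figure):

```python
def escalate(rounds=20, edge=0.1):
    """Each side tries to top the other by a fixed fraction; the coupled
    stocks grow exponentially, by (1 + edge)^2 per round."""
    a = b = 1.0
    history = []
    for _ in range(rounds):
        a = (1 + edge) * b     # A surpasses B
        b = (1 + edge) * a     # B responds in kind
        history.append(b)
    return history
```

After only 20 rounds both sides stand at more than 40 times their starting level. Setting edge to 0 on either side (unilaterally refusing to compete) breaks the reinforcing loop, which is exactly the way out described above.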

Success to the Successful—Competitive Exclusion

The competitive exclusion principle. This principle says that two different species cannot live in exactly the same ecological niche, competing for exactly the same resources.

That will happen not by direct confrontation usually, but by appropriating all the resources, leaving none for the weaker competitor.

If there is a finite market and no antitrust law to stop it, one firm will take over everything as long as it chooses to reinvest in and expand its production facilities.

  • In most societies, the poorest children receive the worst educations in the worst schools, if they are able to go to school at all. With few marketable skills, they qualify only for low paying jobs, perpetuating their poverty.
  • People with low income and few assets are not able to borrow from most banks. Therefore, either they can’t invest in capital improvements, or they must go to local moneylenders who charge exorbitant interest rates. Even when interest rates are reasonable, the poor pay them, the rich collect them.
  • Land is held so unevenly in many parts of the world that most farmers are tenants on someone else’s land. They must pay part of their crops to the landowner for the privilege of working the land, and so never are able to buy land of their own. The landowner uses income from tenants to buy more land.

Diversification doesn’t work as a strategy for the poor, because they are often unorganized and inarticulate.

THE TRAP: SUCCESS TO THE SUCCESSFUL

If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.

THE WAY OUT

Diversification, which allows those who are losing the competition to get out of that game and start another one; strict limitation on the fraction of the pie any one winner may win (antitrust laws); policies that level the playing field, removing some of the advantages of the strongest players or increasing the advantage of the weakest; policies that devise rewards for success that do not bias the next round of competition.
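The winner-take-all dynamic of this trap can be sketched with a toy market-share model (my own rule: each round, customers gravitate toward the larger firm in proportion to its size, so advantage compounds):

```python
def market_shares(s=0.51, rounds=12):
    """Success to the successful: a firm's chance of winning the next
    round grows with its current share, so a 51/49 split runs away."""
    history = [s]
    for _ in range(rounds):
        s = s**2 / (s**2 + (1 - s)**2)   # size advantage, compounded
        history.append(s)
    return history
```

Starting from a near-even 51/49 split, the leading firm’s share climbs monotonically and passes 99% within a dozen rounds—competitive exclusion from an almost imperceptible initial advantage.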

Shifting the Burden to the Intervenor—Addiction

THE TRAP: SHIFTING THE BURDEN TO THE INTERVENOR

Shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces (or disguises) the symptoms, but does nothing to solve the underlying problem. Whether it is a substance that dulls one’s perception or a policy that hides the underlying trouble, the drug of choice interferes with the actions that could solve the real problem. If the intervention designed to correct the problem causes the self-maintaining capacity of the original system to atrophy or erode, then a destructive reinforcing feedback loop is set in motion. The system deteriorates; more and more of the solution is then required.

The system will become more and more dependent on the intervention and less and less able to maintain its own desired state.

THE WAY OUT

Again, the best way out of this trap is to avoid getting in.

Beware of symptom-relieving or signal-denying policies or practices that don’t really address the problem. Take the focus off short-term relief and put it on long-term restructuring. If you are the intervenor, work in such a way as to restore or enhance the system’s own ability to solve its problems, then remove yourself. If you are the one with an unsupportable dependency, build your system’s own capabilities back up before removing the intervention. Do it right away. The longer you wait, the harder the withdrawal process will be.
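The atrophy mechanism can be sketched in a toy model of an intervened system (all dynamics and rates here are my own invented illustration, not the book’s equations):

```python
def intervene(strength, rounds=40):
    """An outside fix closes the gap to the desired state, but reliance
    on it erodes the system's own corrective capability."""
    state, capability = 80.0, 1.0            # desired state is 100
    for _ in range(rounds):
        gap = 100.0 - state
        outside_fix = strength * gap          # the intervenor's contribution
        own_fix = 0.3 * capability * gap      # the system's own correction
        state += outside_fix + own_fix - 5.0  # constant problem pressure
        capability *= 1 - 0.1 * strength      # dependence atrophies capability
    return state, capability
```

The symptom looks better while the intervention lasts, but the system’s own capability shrinks to a small fraction of its original level—so withdrawing the intervenor later is far more painful than never having depended on it.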

Rule Beating

THE TRAP: RULE BEATING

Perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.

THE WAY OUT

Design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.

Seeking the Wrong Goal

If the desired system state is national security, and that is defined as the amount of money spent on the military, the system will produce military spending.

These examples confuse effort with result, one of the most common mistakes in designing systems around the wrong goal.

Maybe the worst mistake of this kind has been the adoption of the GNP as the measure of national economic success.

The GNP lumps together goods and bads.

It measures effort rather than achievement, gross production and consumption rather than efficiency.

New light bulbs that give the same light with one-eighth the electricity and that last ten times as long make the GNP go down.

It could be argued that the best society would be one in which capital stocks can be maintained and used with the lowest possible throughput, rather than the highest.

THE TRAP: SEEKING THE WRONG GOAL

System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.

THE WAY OUT

Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result.

INTERLUDE • The Goal of Sailboat Design

No one would think of using an America’s Cup yacht for any purpose other than racing within the rules. The boats are so optimized around the present rules that they have lost all resilience. Any change in the rules would render them useless.

PART THREE Creating Change—in Systems and in Our Philosophy

SIX — Leverage Points—Places to Intervene in a System

So, how do we change the structure of systems to produce more of what we want and less of that which is undesirable?

Leverage points are points of power.

What is needed is much slower growth, very different kinds of growth, and in some cases no growth or negative growth.

Counterintuitive—that’s Forrester’s word to describe complex systems. Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.

But complex systems are, well, complex. It’s dangerous to generalize about them.

12. Numbers—Constants and parameters such as subsidies, taxes, standards

Putting different hands on the faucets may change the rate at which the faucets turn, but if they’re the same old faucets, plumbed into the same old system, turned according to the same old information and goals and rules, the system behavior isn’t going to change much.

But changing these variables rarely changes the behavior of the national economy system.

Parameters become leverage points when they go into ranges that kick off one of the items higher on this list. Interest rates, for example, or birth rates, control the gains around reinforcing feedback loops. System goals are parameters that can make big differences.

11. Buffers—The sizes of stabilizing stocks relative to their flows

You hear about catastrophic river floods much more often than catastrophic lake floods, because stocks that are big relative to their flows are more stable than small ones. A big, stabilizing stock is known as a buffer.

You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible. It reacts too slowly. And big buffers of some sorts, such as water reservoirs or inventories, cost a lot to build or maintain.
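The stabilizing effect of a buffer shows up clearly when the same noisy net flow is run into a large and a small stock (a toy random-walk sketch with arbitrary numbers):

```python
import random

def relative_swing(capacity, steps=1000, seed=42):
    """Drive a stock with the same noisy net flow and measure how far
    its level swings, relative to the stock's size."""
    random.seed(seed)                      # same flow sequence for any capacity
    stock = capacity / 2
    lo = hi = stock
    for _ in range(steps):
        stock = min(max(stock + random.uniform(-10, 10), 0.0), capacity)
        lo, hi = min(lo, stock), max(hi, stock)
    return (hi - lo) / capacity
```

Fed identical noise, the big stock barely moves relative to its size while the small one swings across much of its range—the river-versus-lake contrast above.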

10. Stock-and-Flow Structures—Physical systems and their nodes of intersection

The only way to fix a system that is laid out poorly is to rebuild it, if you can.

Physical structure is crucial in a system but is rarely a leverage point because changing it is rarely quick or simple.

The leverage point is in proper design in the first place.

9. Delays—The lengths of time relative to the rates of system changes

A system just can’t respond to short-term changes when it has long-term delays. That’s why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly.

A delay in a feedback process is critical relative to rates of change in the stocks that the feedback loop is trying to control.

It’s usually easier to slow down the change rate, so that inevitable feedback delays won’t cause so much trouble.

And that’s why slowing economic growth is a greater leverage point in Forrester’s World model than faster technological development or freer market prices.

8. Balancing Feedback Loops—The strength of the feedbacks relative to the impacts they are trying to correct

A thermostat loop is the classic example.

The real leverage here is to keep them from doing it.

Billions of dollars are spent to limit and bias and dominate that flow of clear information.

The strength of a balancing feedback loop is important relative to the impact it is designed to correct. If the impact increases in strength, the feedbacks have to be strengthened too. A thermostat system may work fine on a cold winter day—but open the windows and its corrective power is no match for the drop in temperature. Democracy works better without the brainwashing power of centralized mass communications.
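The match between loop strength and impact can be sketched with a toy thermostat model (gains, leak rate, and temperatures are illustrative assumptions of mine):

```python
def room_temperature(loop_gain, leak=0.2, hours=48, outside=-10.0, goal=20.0):
    """Balancing loop (heater driven by the gap to the goal) working
    against an impact (heat leaking to the cold outside)."""
    temp = goal
    for _ in range(hours):
        heating = loop_gain * (goal - temp) if temp < goal else 0.0
        leakage = leak * (temp - outside)   # the impact the loop must correct
        temp += heating - leakage
    return temp
```

With the stronger gain the room settles close to its goal; with the weak gain the same leak drags it far below. A bigger impact (a larger leak) would demand a correspondingly stronger loop.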

A global economy makes global regulations necessary.

  • preventive medicine,
  • integrated pest management,
  • the Freedom of Information Act,
  • monitoring systems,
  • protection for whistleblowers,
  • impact fees, pollution taxes, and performance bonds.

7. Reinforcing Feedback Loops—The strength of the gain of driving loops

Reinforcing feedback loops are sources of growth, explosion, erosion, and collapse in systems.

Reducing the gain around a reinforcing loop—slowing the growth—is usually a more powerful leverage point in systems than strengthening balancing loops, and far preferable to letting the reinforcing loop run.

Look for leverage points around birth rates, interest rates, erosion rates, “success to the successful” loops, any place where the more you have of something, the more you have the possibility of having more.
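The leverage of reducing gain is easy to see in a compounding sketch (hypothetical rates, not from the book): the inflow is proportional to the stock itself, so a small change in the growth rate changes the doubling time dramatically.

```python
# Reinforcing loop: the inflow is proportional to the stock itself.
def steps_to_double(gain):
    stock, steps = 1.0, 0
    while stock < 2.0:
        stock *= 1.0 + gain   # compounding growth
        steps += 1
    return steps

fast = steps_to_double(0.07)  # e.g. 7% growth per period
slow = steps_to_double(0.02)  # the same loop with the gain turned down
```

This matches the familiar rule of thumb that doubling time is roughly 70 divided by the percentage growth rate: about 10 periods at 7%, about 35 at 2%.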

6. Information Flows—The structure of who does and does not have access to information

A fishery needs feedback about the state of the fish population. Contrary to economic opinion, the price of fish doesn’t provide that feedback: as the fish get more scarce, they become more expensive, and it becomes all the more profitable to go out and catch the last few.

It is not price information but population information that is needed.
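The difference between the two information flows can be sketched with a toy fishery (every number here is hypothetical): when harvest is driven only by price, scarcity raises the price and keeps fishing profitable all the way to collapse, while a quota tied to population information settles at a sustainable stock.

```python
# Toy fishery; every number is hypothetical.
def run(use_population_info, steps=50):
    stock = 1000.0  # fish population (the stock)
    for _ in range(steps):
        stock += 0.05 * stock * (1.0 - stock / 1000.0)  # logistic regrowth
        if use_population_info:
            harvest = 0.02 * stock              # quota tied to the population
        else:
            price = 100000.0 / max(stock, 1.0)  # scarcity raises the price
            cost = 20.0                         # cost to land one fish
            harvest = min(stock, 60.0) if price > cost else 0.0
        stock -= harvest
    return stock

price_only  = run(use_population_info=False)
quota_based = run(use_population_info=True)
```

In the price-only run, the scarcer the fish, the higher the price, so harvesting never stops being profitable and the stock is wiped out; the population-fed quota shrinks the catch as the stock shrinks and the fishery persists.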

There is a systematic tendency on the part of human beings to avoid accountability for their own decisions.

5. Rules—Incentives, punishments, constraints

Suppose the students graded the teachers, or each other. Suppose there were no degrees: you come to college when you want to learn something, and you leave when you’ve learned it. Suppose the aim were to solve real-world problems, rather than to publish academic papers. Suppose a class got graded as a group, instead of as individuals.

If you want to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them.

4. Self-Organization—The power to add, change, or evolve system structure

The most stunning thing living systems and some social systems can do is to change themselves utterly by creating whole new structures and behaviors. In biological systems that power is called evolution. In human economies it’s called technical advance or social revolution. In systems lingo it’s called self-organization.

Self-organization means changing any aspect of a system lower on this list.

The ability to self-organize is the strongest form of system resilience.

A system that can evolve can survive almost any change, by changing itself.

When you understand the power of self-organization, you begin to understand why biologists worship biodiversity even more than economists worship technology.

One aspect of almost every culture is the belief in the utter superiority of that culture.

Insistence on a single culture shuts down learning and cuts back resilience.

Any system, biological, economic, or social, that gets so encrusted that it cannot self-evolve, a system that systematically scorns experimentation and wipes out the raw material of innovation, is doomed over the long term on this highly variable planet.

The intervention point here is obvious but unpopular. Encouraging variability and experimentation and diversity means “losing control.” Let a thousand flowers bloom and anything could happen! Who wants that? Let’s play it safe and push this lever in the wrong direction by wiping out biological, cultural, social, and market diversity!

3. Goals—The purpose or function of the system

That’s why I can’t get into arguments about whether genetic engineering is a “good” or a “bad” thing.

Like all technologies, it depends on who is wielding it, with what goal.

John Kenneth Galbraith recognized that corporate goal—to engulf everything—long ago. It’s the goal of a cancer too.

2. Paradigms—The mind-set out of which the system—its goals, structure, rules, delays, parameters—arises

Paradigms are the sources of systems.

From them, from shared social agreements about the nature of reality, come system goals and information flows, feedbacks, stocks, flows, and everything else about systems.

So how do you change paradigms? Thomas Kuhn, who wrote the seminal book on scientific paradigm shifts, says you keep pointing at the anomalies and failures in the old paradigm, and you keep speaking and acting, loudly and with assurance, from the new one.

Systems modelers say that we change paradigms by building a model of the system, which takes us outside the system and forces us to see it whole. I say that because my own paradigms have been changed that way.

1. Transcending Paradigms

Surely there is no power, no control, no understanding, not even a reason for being, much less acting, embodied in the notion that there is no certainty in any worldview.

If no paradigm is right, you can choose whatever one will help to achieve your purpose.

If you have no idea where to get a purpose, you can listen to the universe.

The higher the leverage point, the more the system will resist changing it—that’s why societies often rub out truly enlightened beings.

There are no cheap tickets to mastery. You have to work hard at it, whether that means rigorously analyzing a system or rigorously casting off your own paradigms and throwing yourself into the humility of not-knowing.

In the end, it seems that mastery has less to do with pushing leverage points than it does with strategically, profoundly, madly, letting go and dancing with the system.

SEVEN — Living in a World of Systems

People raised in the industrial world who get enthused about systems thinking are likely to assume that it offers the key to prediction and control. This mistake is likely because the mindset of the industrial world assumes that there is a key to prediction and control.

We learned that it’s one thing to understand how to fix a system and quite another to wade in and fix it.

Social systems are the external manifestations of cultural thinking patterns and of profound human needs, emotions, strengths, and weaknesses.

A systems insight . . . can raise more questions!

What was unique about our search was not our answers or even our questions, but the fact that the tool of systems thinking, born out of engineering and mathematics, implemented in computers, drawn from a mechanistic mindset and a quest for prediction and control, leads its practitioners, inexorably I believe, to confront the most deeply human mysteries.

Systems thinking makes clear even to the most committed technocrat that getting along in this world of complex systems requires more than technocracy.

Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable.

The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionist science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize. We can’t keep track of everything. We can’t find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.

For those who stake their identity on the role of omniscient conqueror, the uncertainty exposed by systems thinking is hard to take. If you can’t understand, predict, and control, what is there to do?

The future can’t be predicted, but it can be envisioned and brought lovingly into being.

Systems can’t be controlled, but they can be designed and redesigned.

We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them.

We can’t impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.

We can’t control systems or figure them out. But we can dance with them!

Stay wide awake, pay close attention, participate flat out, and respond to feedback.

These “systems wisdoms” rest on critical thinking: our ability to sort out truth from falsehood.

Get the Beat of the System

Before you disturb the system in any way, watch how it behaves.

Watching what really happens, instead of listening to peoples’ theories of what happens, can explode many careless causal hypotheses.

Expose Your Mental Models to the Light of Day

Remember, always, that everything you know, and everything everyone knows, is only a model.

Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible.

Honor, Respect, and Distribute Information

Most of what goes wrong in systems goes wrong because of biased, late, or missing information.

Thou shalt not distort, delay, or withhold information.

The Freedom of Information Act is, from a systems point of view, one of the most important laws in the nation.

Information is power. Anyone interested in power grasps that idea very quickly.

Use Language with Care and Enrich It with Systems Concepts

Honoring information means avoiding language pollution—making the cleanest possible use we can of language—and expanding our language so we can talk about complexity.

We don’t talk about what we see; we see only what we can talk about.

To reshape the measurement and communication systems of a [society] is to reshape all potential interactions at the most fundamental level.

Language . . . as articulation of reality is more primordial than strategy, structure, or . . . culture.

The first step in respecting language is keeping it as concrete, meaningful, and truthful as possible.

The second step is to enlarge language to make it consistent with our enlarged understanding of systems.

If the Eskimos have so many words for snow, it’s because they have studied and learned how to use snow.

They have turned snow into a resource, a system with which they can dance.

My word processor has spell-check capability, which lets me add words that didn’t originally come in its comprehensive dictionary.

Pay Attention to What Is Important, Not Just What Is Quantifiable

Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models.

If something is ugly, say so.

Make Feedback Policies for Feedback Systems

President Carter, for example, was trying to deal with a flood of illegal immigrants from Mexico. He suggested that nothing could be done about that immigration as long as there was a great gap in opportunity and living standards between the United States and Mexico. Rather than spending money on border guards and barriers, he said, we should spend money helping to build the Mexican economy, and we should continue to do so until the immigration stopped.

You can imagine why a dynamic, self-adjusting feedback system cannot be governed by a static, unbending policy. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system.

Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops—loops that alter, correct, and expand loops. These are policies that design learning into the management process.
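The advantage of a policy that responds to the state of the system can be sketched minimally (all numbers hypothetical): a reservoir managed by a fixed release rate spills under variable inflow, while a release proportional to the current level adjusts itself and does not.

```python
# Reservoir under variable inflow; the release policy is the lever.
def total_overflow(proportional, inflows, capacity=100.0):
    level, overflow = 50.0, 0.0
    for inflow in inflows:
        release = 0.5 * level if proportional else 25.0  # policy choice
        level += inflow - min(release, level)
        if level > capacity:                 # spill anything over capacity
            overflow += level - capacity
            level = capacity
    return overflow

surges = [10, 10, 60, 60, 10, 10, 60, 60]
static_rule = total_overflow(proportional=False, inflows=surges)
feedback_rule = total_overflow(proportional=True, inflows=surges)
```

The static rule was tuned for average conditions and fails when conditions change; the feedback rule releases more when the level is high and less when it is low, which is exactly the kind of self-adjustment the passage recommends.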

Go for the Good of the Whole

Remember that hierarchies exist to serve the bottom layers, not the top.

Don’t maximize parts of systems or subsystems while ignoring the whole.

Don’t, as Kenneth Boulding once said, go to great trouble to optimize something that never should be done at all.

Aim to enhance total systems properties, such as growth, stability, diversity, resilience, and sustainability—whether they are easily measured or not.

Listen to the Wisdom of the System

Locate Responsibility in the System

One design for intrinsic responsibility: privatizing a commons, so that the consequences of decisions fall on those who make them.

A great deal of responsibility was lost when rulers who declared war were no longer expected to lead the troops into battle.

Notice how little our current culture has come to look for responsibility within the system that generates an action, and how poorly we design systems to experience the consequences of their actions.

Stay Humble— Stay a Learner

Systems thinking has taught me to trust my intuition more and my figuring-out rationality less, to lean on both as much as I can, but still to be prepared for surprises.

The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn.

The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error.

“Stay the course” is only a good idea if you’re sure you’re on course.

What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading. That’s hard. It means making mistakes and, worse, admitting them. It means what psychologist Don Michael calls “error-embracing.” It takes a lot of courage to embrace your errors.

The very act of acknowledging uncertainty could help greatly to reverse this worsening trend.

Error-embracing is the condition for learning. It means seeking and using—and sharing—information about what went wrong with what you expected or hoped would go right.

Both error embracing and living with high levels of uncertainty emphasize our personal as well as societal vulnerability.

Typically we hide our vulnerabilities from ourselves as well as from others.

But …to be the kind of person who truly accepts his responsibility …requires knowledge of and access to self far beyond that possessed by most people in this society.

Celebrate Complexity

“A thing is right when it tends to preserve the integrity, stability, and beauty of the biotic community. It is wrong when it tends otherwise.”

Expand Time Horizons

A society which loses its identity with posterity and which loses its positive image of the future loses also its capacity to deal with present problems, and soon falls apart.

Systems are always coupling and uncoupling the large and the small, the fast and the slow.

You need to be watching both the short and the long term—the whole system.

Defy the Disciplines

Follow a system wherever it leads.

Seeing systems whole requires more than being “interdisciplinary,” if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other.

Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than being academically correct.

They will have to go into learning mode.

They will have to admit ignorance and be willing to be taught, by each other and by the system.

Expand the Boundary of Caring

Don’t Erode the Goal of Goodness

The public discourse is full of cynicism.

It is much easier to talk about hate in public than to talk about love.

Don’t weigh the bad news more heavily than the good.

And keep standards absolute.

Systems thinking can only tell us to do that. It can’t do it.

We’re back to the gap between understanding and implementation. Systems thinking by itself cannot bridge that gap, but it can lead us to the edge of what analysis can do and then point beyond—to what can and must be done by the human spirit.