Introduction
Argues these problems are the symptoms of "messes." Messes result from the inherent structure of
complex systems:
"Hunger, poverty, environmental degradation, economic instability, unemployment... no one
deliberately creates those problems, no one wants them to persist, but they persist
nonetheless."
"Words and sentences must, by necessity, come only one at a time in linear, logical order. Systems
happen all at once. They are connected not just in one direction, but in many directions
simultaneously."
So pictures and graphs must be used. (This book has many systems diagrams, which look like UML
diagrams.)
Systems theory is a complementary lens to the observant human eye, and the detail-revealing
microscope. It's not a superior lens. "The more ways of seeing, the better."
The basics (chap 1)
System: elements interconnected to achieve a function. E.g. a digestive system.
It's easy to see the elements of a system. It's harder to see and understand the interconnections
— the physical flows, or flows of information, between elements.
Idiom: "function" is used for nonhuman systems, "purpose" for human systems.
"Systems can be nested within systems. Therefore, there can be purposes within purposes."
E.g. "the university", which is composed of each population / stakeholder group within it.
Changing elements — e.g. the players on the team — usually has the least effect on the system's
behavior. Changing interconnections, and function, usually have much larger effects.
Stocks: foundation of a system. These are elements that are a store, a measurable quantity. The
quantities need not be physical; "accumulated goodwill" is a stock.
Flows: "Stocks change over time through the action of a flow. Flows are filling and draining,
births and deaths, purchases and sales, growth and decay, deposits and withdrawals."
If the sum of inflows is equal to the sum of outflows of a stock, the stock's level remains
constant.
For a labor force, you can increase its rate of growth by increasing hiring (inflow), or by
reducing churn (outflow).
"A stock takes time to change, because flows take time to flow. That's a vital point, a key to
understanding why systems behave as they do."
"Stocks generally change slowly, even when the flows into or out of them change suddenly.
Therefore, stocks act as delays or buffers or shock absorbers in systems."
The delay in changing stocks is why it takes so long for a business to gain traction. The stocks
of employees, customers, and word of mouth accumulate slowly.
Using a stock as a buffer decouples its inflows from its outflows, which adds stability and
predictability to the system. Oil field -> oil reserves -> gas at pump.
"Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating
the levels in the stocks by manipulating flows."
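The bathtub arithmetic above can be sketched in a few lines of Python. This is a toy model with made-up numbers (a labor force of 100, flows of a few people per step), not anything from the book; it just shows that a stock changes only by the difference between inflow and outflow, so raising hiring and cutting churn are equivalent levers.

```python
# Minimal stock-and-flow sketch (hypothetical numbers): a labor-force
# stock changed only by a hiring inflow and a churn outflow.
def simulate_stock(initial, inflow, outflow, steps):
    """Advance a stock by (inflow - outflow) each time step."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow
        history.append(stock)
    return history

# Equal inflow and outflow -> the stock holds constant.
assert simulate_stock(100, inflow=5, outflow=5, steps=12) == [100] * 13

# Raising hiring OR cutting churn grows the stock at the same rate.
assert simulate_stock(100, 6, 5, 12) == simulate_stock(100, 5, 4, 12)
```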
Feedback loops
Stabilizing loops
A loop which tends to keep a stock's value within a range.
E.g. a coffee drinker's stock of energy will be maintained at a desired level by drinking or
avoiding caffeine.
Runaway loops — reinforcing feedback
Also called "snowballing" feedback loops.
"It generates more input to a stock the more that is already there (and less input the less
that is already there). A reinforcing feedback loop enhances whatever direction of change is
imposed on it."
Self-reinforcing: stocks with exponential growth properties. They increase by a constant
percentage of their own value, so the absolute change in value keeps growing with time.
Population growth, for example.
Doubling time: the time it takes for an exponentially growing stock to double in size is
roughly 70 divided by the growth rate (in percent per period).
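The rule of 70 is an approximation of the exact doubling time, ln(2) / ln(1 + r). A quick check (my own sketch, not from the book):

```python
import math

def doubling_time(growth_rate_percent):
    """Exact doubling time for exponential growth at the given %/period."""
    return math.log(2) / math.log(1 + growth_rate_percent / 100)

def rule_of_70(growth_rate_percent):
    """The rule-of-70 approximation quoted in the text."""
    return 70 / growth_rate_percent

# At 7% growth per period, both give roughly 10 periods to double.
assert abs(doubling_time(7) - rule_of_70(7)) < 0.3
```

The approximation works because ln(2) ≈ 0.693 and ln(1 + r) ≈ r for small r, so the doubling time is about 69.3 / (rate in percent); 70 is just the rounder number.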
"You'll stop looking for who's to blame; instead you'll start asking, 'What's the system?' The
concept of feedback opens up the idea that a system can cause its own behavior."
A brief visit to the systems zoo (chap 2)
In thermostat-like systems, one must take into account whatever draining or filling processes are
affecting the stock. In a furnace for instance, the room's temperature is a result of furnace heat
and heat loss to the outside, and as the furnace heats the room, it increases the rate of heat
loss to the outside.
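A sketch of that furnace system (the gain and leak coefficients here are hypothetical, chosen only to make the dynamics visible): the furnace is a balancing loop toward the setpoint, but because heat loss grows with room temperature, the room settles below the setpoint rather than at it.

```python
# Thermostat sketch (hypothetical coefficients): furnace heat pushes the
# room toward the setpoint while heat leaks to the outside in proportion
# to the indoor/outdoor temperature gap.
def heat_room(temp, setpoint, outside, steps,
              furnace_gain=0.5, leak_rate=0.1):
    for _ in range(steps):
        furnace = furnace_gain * max(setpoint - temp, 0)  # heating only
        leak = leak_rate * (temp - outside)
        temp += furnace - leak
    return temp

final = heat_room(temp=10, setpoint=20, outside=0, steps=200)
# The warmer the room gets, the faster it leaks heat, so furnace input
# is balanced by the leak before the room ever reaches 20.
assert 15 < final < 20
```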
"A stock with one reinforcing loop and one balancing loop: population and industrial economy"
One loop in the system may be stronger / dominant. Which loop is stronger may change with time
("shifting dominance"), as fertility and mortality rates do in population systems.
In a population system, the population stock stabilizes when fertility equals mortality, and the
system reaches an equilibrium.
The investment -> capital stock -> depreciation system has a similar structure to a population
system.
"A system with delays: business inventory"
E.g. changing the staffing level in a company.
(Diagrams of these systems)
Lengthening or shortening delays via policy levers has powerful effects on the feedback and
oscillations in output.
"Economies are extremely complex systems; they are full of balancing feedback loops with delays,
and they are inherently oscillatory."
This is the cause of business cycles. The economy is a more complex version of the car
dealership trying to maintain its inventory in relation to sales and manufacturing delays.
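The oscillation mechanism can be sketched numerically. This is a toy model with invented parameters (target inventory, delivery delay, adjustment fraction), not the book's dealership case: after a demand jump, the delivery delay makes the inventory undershoot, then overshoot, before damping back toward the target.

```python
# Inventory-with-delay sketch (hypothetical parameters): orders placed
# now arrive several steps later, so corrections overshoot the target.
from collections import deque

def run_inventory(target=100, delay=3, adjust=0.25, steps=60):
    inventory = 100.0
    sales = 10.0
    pipeline = deque([sales] * delay, maxlen=delay)  # orders in transit
    history = []
    for t in range(steps):
        if t == 5:
            sales = 12.0                    # sudden, permanent demand jump
        inventory += pipeline[0] - sales    # the oldest order arrives
        order = sales + adjust * (target - inventory)
        pipeline.append(max(order, 0.0))    # evicts the arrived order
        history.append(inventory)
    return history

h = run_inventory()
assert min(h) < 100 and max(h) > 100   # undershoot, then overshoot
assert abs(h[-1] - 100) < 2            # oscillation damps toward target
```

With a larger `adjust` or a longer `delay`, the same structure produces growing rather than damped oscillations, which is the sense in which delay length is a policy lever.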
"A renewable stock constrained by a nonrenewable stock — an oil economy"
In such systems, there is a reinforcing loop (e.g. exponential product adoption) and a
constraining, balancing loop (e.g. market saturation).
"In physical, exponentially growing systems, there must be at least one reinforcing loop driving
the growth and at least one balancing loop constraining the growth, because no physical system
can grow forever in a finite environment."
In a fish harvesting system, equilibrium is achieved when the harvest rate equals the natural
replenishment rate. If the harvest rate is larger, the fish will be over-harvested, eventually
eradicating the population and turning this renewable resource into a nonrenewable one.
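The threshold behavior can be sketched with a standard logistic regrowth model (my own illustration; the growth rate and capacity numbers are hypothetical): a harvest below the peak replenishment rate settles at a sustainable equilibrium, while one above it drives the stock to zero.

```python
# Fishery sketch (hypothetical logistic regrowth): harvesting below the
# natural replenishment rate sustains the stock; harvesting above it
# collapses the fishery.
def fish_stock(initial, harvest, years, r=0.3, capacity=1000):
    stock = initial
    for _ in range(years):
        regrowth = r * stock * (1 - stock / capacity)  # logistic growth
        stock = max(stock + regrowth - harvest, 0)
    return stock

sustainable = fish_stock(500, harvest=50, years=100)
collapsed = fish_stock(500, harvest=90, years=100)
assert sustainable > 100    # settles at a positive equilibrium
assert collapsed == 0       # overharvest drives the stock to zero
```

Note the asymmetry: once the stock shrinks, regrowth shrinks with it, so the decline accelerates. The renewable/nonrenewable boundary is crossed well before the last fish is caught.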
Why systems work so well (chap 3)
Resilience: "the ability to bounce or spring back into shape, position, etc., after being pressed
or stretched. Elasticity. The ability to recover strength, spirits, good humor, or any other
aspect quickly."
"Resilience is a measure of a system's ability to survive and persist within a variable
environment. The opposite of resilience is brittleness or rigidity."
Feedback loops provide resilience in that they restore a system or set of stocks to their desired
state after a large perturbation. Feedback loops which can create new feedback loops provide
meta-resilience.
"Because resilience may not be obvious without a whole-system view, people often sacrifice
resilience for stability, or for productivity, or for some other more immediately recognizable
system property."
Self-organizing systems
Evolutionary systems, like life forms and societies, are hard to predict and model.
"Self-organization produces heterogeneity and unpredictability. It is likely to come up with
whole new structures, whole new ways of doing things. It requires freedom and experimentation,
and a certain amount of disorder."
Self-organizing systems generate hierarchy. There are subsystems, aggregated into larger
systems.
Balanced hierarchy: "there must be enough central control to achieve coordination toward the
large-system goal, and enough autonomy to keep all subsystems flourishing, functioning, and
self-organizing."
Why systems surprise us (chap 4)
(An inventory of ways our mental models often fail to capture the complexity of reality.)
"The acquisition of knowledge always involves the revelation of ignorance — almost is the
revelation of ignorance. Our knowledge of the world instructs us first of all that the world is
greater than our knowledge of it." - Wendell Berry
"Especially complex and sophisticated are the mental models we develop from direct, intimate
experience of nature, people, and organizations immediately around us." These are superior to
conjectured models.
"When a systems thinker encounters a problem, the first thing he or she does is look for data,
time graphs, the history of the system. That's because long-term behavior provides clues to the
underlying system structure. And structure is the key to understanding not just what is
happening, but why."
"Systems thinking goes back and forth constantly between structure (diagrams of stocks, flows,
and feedback) and behavior (time graphs)."
Interesting case study of budworm outbreaks, which consume fir trees. Outbreaks happen
periodically because the system slowly oscillates between fewer fir trees and more, depending on
how long it has been since the last outbreak. Once fir trees hit a critical mass, there's a
nonlinearity: worm larvae multiply faster than natural predators can grow to consume them, and
their population explodes until they eat and reduce the fir tree population.
This is a "system with unintuitive nonlinearities."
A system may be constrained by any number of input factors.
"Rich countries transfer capital or technology to poor ones and wonder why the economies of the
receiving countries still don't develop, never thinking that capital or technology may not be
the most limiting factors."
Layers of limits (bottlenecks)
"There are layers of limits around every growing plant, child, epidemic, new product,
technological advance, company, city, economy, and population. Insight comes not only from
recognizing which factor is limiting, but from seeing that growth itself depletes or enhances
limits and therefore changes what is limiting."
E.g. hiring more salespeople may shift the bottleneck to engineering.
Bounded rationality: humans in a system have incomplete information and imperfect competence: we
underestimate risk, carry biases, and interpret our information incorrectly. So no actor acts
optimally for their own good, or for the good of the system.
System traps... and opportunities (chap 5)
"The world is nonlinear. Trying to make it linear for our mathematical or administrative
convenience is not usually a good idea even when feasible."
Some systems are perverse: their structure produces truly problematic behavior.
"The destruction they cause is often blamed on particular actors or events, although it is
actually a consequence of system structure."
Policy resistance: the balancing loop in the system maintains its behavior despite changing
outside forces.
"This system structure can operate in a ratchet mode: intensification of anyone's effort leads
to intensification of everyone else's."
Efforts to make shallow changes to fix them feel like swimming upstream.
E.g. drug enforcement -> evasion
Each actor in a system pulls it toward that actor's own goals. If one actor manages to move the
value of a stock toward their goal, the others pull harder until it's back at equilibrium.
"If you calm down, those who are pulling against you will calm down too. This is what happened
in 1933 when Prohibition ended in the United States; the alcohol-driven chaos also largely
ended."
"The most effective way of dealing with policy resistance is to find a way of aligning the
various goals of the subsystems, usually by providing an overarching goal that allows all actors
to break out of their bounded rationality."
Tragedy of the commons
"In any commons system there is, first of all, a resource that is commonly shared (the pasture).
For the system to be subject to tragedy, the resource must not only be limited, but erodable
when overused."
"The structure of a commons system makes selfish behavior much more convenient and profitable
than behavior that is responsible to the whole community and to the future."
"Every user benefits directly from its use, but shares the costs of its abuse with everyone
else. Therefore, there is very weak feedback from the condition of the resource to the decisions
of the resource users. The consequence is overuse of the resource, eroding it until it becomes
unavailable to anyone."
How to avoid the tragedy of the commons
"Educate and exhort. Help people see the consequences of unrestrained use of the commons."
Privatize the commons. "Divide it up, so that each person reaps the consequences of his or her
own actions." Although many resources, like the air and the fish in the sea, cannot be
privatized.
Regulate the commons. "Mutual coercion, mutually agreed upon."
Drift to low performance
Systems that keep getting worse. E.g. a company with falling market share.
"Eroding goals" is a label for such systems.
"Allowing performance standards to be influenced by past performance, especially if there is a
negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding
goals that sets a system drifting towards low performance."
E.g. on a losing sports team, set the standard/goal for the system using the best performances
of the past, not the current (poor) performance; this introduces a positive mindset and a
pull-up effect.
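The two anchoring policies can be compared in a toy simulation (the weights here are hypothetical, chosen only to show the direction of drift): when the goal is allowed to follow perceived performance, the system slides; when the goal is anchored to the best past performance, it holds.

```python
# Eroding-goals sketch (hypothetical weights): performance is pulled up
# toward the goal but also decays; the question is what sets the goal.
def drift(anchor_to_best, steps=50, erosion=0.2, effort=0.5, decay=0.1):
    performance, goal, best = 100.0, 100.0, 100.0
    for _ in range(steps):
        performance += effort * (goal - performance) - decay * performance
        best = max(best, performance)
        if anchor_to_best:
            goal = best                              # absolute standard
        else:
            goal += erosion * (performance - goal)   # goal erodes downward
    return performance

# Letting the goal track performance sets up the reinforcing downward
# spiral; anchoring it to the best past performance does not.
assert drift(anchor_to_best=False) < drift(anchor_to_best=True)
```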
Escalation
When the goal of one part of the system, or one actor, is relative to another part.
"Refuse to compete", interrupting the reinforcing loop that's creating escalation.
"One can negotiate a new system with balancing loops to control the escalation." An unpleasant
process for the parties involved, but if achieved, it creates a better system and better
long-term outcomes.
Success to the successful: competitive exclusion
"This system trap is found whenever the winners of a competition receive, as part of the reward,
the means to compete even more effectively in the future. That's a reinforcing feedback loop,
which rapidly divides a system into winners who go on winning, and losers who go on losing."
Mature markets do not seem to support large numbers of firms. Whichever firm gets an edge
reinvests it to grow that edge.
"In every market economy, we see long-term trends of declining numbers of farms, while the size
of farms increases."
Self-perpetuating wealth gap: children of poor parents get worse education and fewer
opportunities to advance.
"The success-to-the-successful loop can be kept under control by putting into place feedback
loops that keep any competitor from taking over entirely. That's what antitrust laws do in
theory and sometimes in practice."
A way out is "periodically 'leveling the playing field.' Traditional societies and game
designers instinctively design into their systems some way of equalizing advantages, so the game
stays fair and interesting."
E.g. inheritance taxes, so that winners get reset each generation.
Diversification: pick or invent a new market where you're not competing against someone who
already has a growing edge over you. Losers can "get out of the game and start a new one."
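The success-to-the-successful loop and the leveling remedy can be sketched together (a toy model of my own, with made-up payoff rules, not from the book): winnings accrue in proportion to current share, so a small initial edge compounds into near-total dominance unless the field is periodically leveled.

```python
# Success-to-the-successful sketch (hypothetical payoff rule): each
# round, competitors earn in proportion to their current share of the
# total, so an edge compounds.
def compete(rounds, level_every=None):
    wealth = [110.0, 100.0]     # competitor 0 starts slightly ahead
    for r in range(1, rounds + 1):
        total = sum(wealth)
        wealth = [w * (1 + w / total) for w in wealth]
        if level_every and r % level_every == 0:
            mean = sum(wealth) / len(wealth)  # e.g. an inheritance tax
            wealth = [mean, mean]
    return wealth[0] / sum(wealth)            # leader's final share

assert compete(60) > 0.9                   # runaway winner-take-all
assert abs(compete(60, level_every=10) - 0.5) < 0.1   # stays near parity
```

The interesting part is how little the leveling has to do: resetting every ten rounds keeps the system at parity indefinitely, because the reinforcing loop never gets long enough to compound.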
Shifting the burden to the intervenor — addiction
I.e. a system/actor has a "dependence" on some policy or resource.
Burden-shifting systems: "Modern medicine in general has shifted the responsibility for health
away from the practices and lifestyle of each individual and onto intervening doctors and
medicines."
"The trap is formed if the intervention, whether by active destruction or simple neglect,
undermines the original capacity of the system to maintain itself. If that capability atrophies,
then more of the intervention is needed to achieve the desired effect."
Soil fertility and fertilizers. The intervenor unwittingly creates an ever-increasing
dependency.
"The individual or community that is being helped may not think through the long-term loss of
control and the increased vulnerability that go along with the opportunity to shift a burden to
an able and powerful intervenor."
Rather than shoulder the burden, an intervenor can find a way to strengthen the ability of the
system to shoulder its own burdens. Questions to ask:
"Why are the natural correction mechanisms failing?
How can obstacles to their success be removed?
How can these mechanisms be made more effective?"
Rule beating
Distorting behavior so that rules appear to be followed, but not in spirit.
E.g. killing endangered species on your own land, so that the land can be developed.
Laws must be designed with the system's self-organizing evasive behaviors in mind.
"If the desired system state is national security, and that is defined as the amount of money
spent on the military, the system will produce military spending."
"If you define the goal of a society as GNP, that society will do its best to produce GNP. It
will not produce welfare, equity, justice, or efficiency."
"The world would be a different place if instead of competing to have the highest per capita
GNP, nations competed to have the highest per capita stocks of wealth with the lowest
throughput."
(By throughput, the author means the flow of material and energy consumed to maintain those
stocks, which is roughly what GNP measures.)
Leverage points - places to intervene in a system (chap 6)
"Places in a system where a small change could lead to a large shift in behavior."
Argues systems are complex, and even when we can identify the leverage points, we often push
them in the wrong direction.
"Numbers, the sizes of flows, are dead last on my list of powerful interventions."
"Probably 90 — no 95, no 99 percent — of our attention goes to parameters, but there's not a lot
of leverage in them."
"If the system is chronically stagnant, parameter changes rarely kick-start it. If it's wildly
variable, they usually don't stabilize it."
Impactful levers, in ascending order of leverage
Buffers
They are powerful stabilizing forces. E.g. lake vs. river for handling increased water flow
due to rainfall. Increasing the buffer sizes adds stability, at the expense of responsiveness
and efficiency.
E.g. retailers carrying large inventories.
Stock and flow structures: the plumbing between stocks. Obviously impactful, if it can actually
be changed.
"Physical structure is crucial in a system, but is rarely a leverage point, because changing
it is rarely quick or simple. The leverage point is in proper design in the first place. After
the structure is built, the leverage is in understanding its limitations and bottlenecks."
Delays: the lengths of time relative to the rates of system changes
Delays can be in the observable state of a stock, or our response to it. Delays cause
oscillations around the goal.
This is a low-leverage lever because delays are not usually changeable. "Things take as long
as they take."
"It's usually easier to slow down the change rate, so that inevitable feedback delays won't
cause so much trouble. That's why growth rates are higher up on the leverage point list than
delay times."
Balancing feedback loops: the strength of the feedback relative to the impacts they are trying
to correct
E.g. the quickness and power of the response in a thermostat system.
Reinforcing feedback loops: the strength of the gain of driving loops
"A reinforcing feedback loop is self-reinforcing. The more it works, the more it gains power
to work some more."
It must be eventually checked by a balancing feedback loop or the system will destroy itself.
Reducing the gain around a reinforcing loop is usually more effective than strengthening a
balancing loop.
"Look for leverage points around birth rates, interest rates, erosion rates, 'success to the
successful' loops, any place where the more you have of something, the more you have the
possibility of having more."
Information flows: the structure of who does and does not have access to information
"Missing information flows is one of the most common causes of system malfunction. Adding or
restoring information can be a powerful intervention, usually much easier and cheaper than
rebuilding physical infrastructure."
Adding an information flow adds a new loop to the system.
Rules: incentives, punishments, constraints
"If you want to understand the deepest malfunctions of systems, pay attention to the rules and
to who has power over them."
Self-organization: the power to add, change, or evolve system structure
"One aspect of almost every culture is the belief in the utter superiority of that culture.
Insistence on a single culture shuts down learning and cuts back resilience."
Goals: the purpose or function of the system
"Changing the players in the system is a low-level intervention, as long as the players fit
into the same old system. The exception to that rule is at the top, where a single player can
have the power to change the system's goal."
Changing the goal of a system is a powerful lever; all stocks and flows will contort to align
with it.
Paradigms: the mindset out of which the system — its goals, structure, rules, delays,
parameters — arises
"So how do you change paradigms? Thomas Kuhn, who wrote the seminal book about the great
paradigm shifts of science, has a lot to say about that. You keep pointing at the anomalies
and failures in the old paradigm. You keep speaking and acting, loudly and with assurance,
from the new one. You insert people with the new paradigm in places of public visibility and
power. You don't waste time with reactionaries; rather, you work with active change agents and
with the vast middle ground of people who are open-minded."
Transcending paradigms
Meaning "to keep oneself unattached in the area of paradigms, to stay flexible, to realize
that no paradigm is 'true,' that every one, including the one that sweetly shapes your own
worldview, is a tremendously limited understanding of an immense and amazing universe that is
far beyond human comprehension."
"The higher the leverage point, the more the system will resist changing it — that's why
societies often rub out truly enlightened beings."
Living in a world of systems (chap 7)
"It's one thing to understand how to fix a system and quite another to wade in and fix it."
Even once you understand addiction loops, you cannot quit coffee.
"Social systems are the external manifestations of cultural thinking patterns and of profound
human needs, emotions, strengths, and weaknesses. Changing them is not as simple as saying 'now
all change,' or of trusting that he who knows the good shall do the good."
All serious systems thinkers learn that they can't fully understand large nonlinear systems:
variance in behavior is high, there's irreducible complexity. You can only learn from systems, and
try redesigning them.
"Get the beat of the system." Review the time series data it has generated, and watch it live.
Watch how it behaves: "this guideline is deceptively simple. Until you make it a practice, you
won't believe how many wrong turns it helps you avoid. Starting with the behavior of the system
forces you to focus on facts, not theories."
"Starting with history discourages the common and distracting tendency we all have to define a
problem not by the system's actual behavior, but by the lack of our favorite solution."
(Yes; this is an easy error to make when considering how to apply technology to a domain.)
"Expose your mental models to the light of day"
By writing them down and discussing them. This helps improve the model to be more consistent and
rigorous. Mental models alone are slippery.
"You can do it with words or lists or pictures or arrows showing what you think is connected to
what. The more you do that, in any form, the clearer your thinking will become, the faster you
will admit your uncertainties and correct your mistakes, and the more flexible you will learn to
be."
"You can make a system work better with surprising ease if you can give it more timely, more
accurate, more complete information."
"Language pollution" (fun term)
That which doesn't convey clear, precise, forceful meaning.
"Use language with care and enrich it with systems concepts."
Don't over-index on quality metrics when inspecting and evaluating systems.
"Human beings have been endowed not only with the ability to count, but also with the ability to
assess quality. Be a quality detector."
Local responsibility in the system
"'Intrinsic responsibility' means that the system is designed to send feedback about the
consequences of decision making directly and quickly and compellingly to the decision makers.
Because the pilot of a plane rides in the front of the plane, that pilot is intrinsically
responsible. He or she will experience directly the consequences of his or her decisions."
E.g. having tobacco companies pay for smoking-related healthcare costs.
"Nature designs in fractals, with intriguing detail on every scale from the microscopic to the
macroscopic."