The ABC of Systems Thinking by Donella Meadows: Book Summary

I first heard about the book “The ABC of Systems Thinking” in Bayram Annakov’s talk at Product Camp 2013. Then I found it on Ilya Krasinsky’s list of recommended books. After that I had no choice left, and I went to the store for it.

I would have been glad to read this book a few years ago (probably like most of the other books I write about here), when I was just starting to do analytics at Yandex.
In her book, Donella Meadows gives the reader the opportunity to see the world as a set of systems that interact with each other, influence each other, and make up more complex systems, which in turn make up even more complex systems.

The ABC of Systems Thinking by Donella Meadows

"The ABC of Systems Thinking" will teach you to highlight the standard components of any of the systems: stocks, flows, feedback loops (reinforcing, balancing). You will also learn how to identify common archetypes of systems (a system with non-renewable resources, a system with renewable resources, and so on), learn about the main levers of influence on systems.

The book gave me a better understanding of many processes, structured my thoughts, and once again made me realize how unusually complex the world around me is. Its complexity lies, above all, in the interaction of its parts. By the way, the Game of Life demonstrates this complexity very clearly, but more on that another time.

The introduction to "The ABC of Systems Thinking" contains the parable of the blind men and the elephant. This parable vividly reflects our usual perception of reality. The book teaches you to see the whole behind the separate parts.

Thoughts and ideas from the book

You think that if you know what one is, then you know what two is, because one and one will be two. But you forget that you must understand what "and" is.
Sufi parable

The fact that there are feedbacks in systems means that the system can be the cause of its own behavior.

The main task of any theory is to make the basic elements as simple and as few in number as possible without compromising an adequate representation of what we observe in practice.
Albert Einstein

The complex behavior of systems is often associated with the transition of dominance from one feedback loop to another.

Systems with the same feedback structures exhibit similar types of behavior.

The presence of delays in feedback loops causes the system to oscillate. This fact explains the tendency of economic systems to fluctuate.

An interesting example of how a seemingly obvious technological improvement ruined an entire industry. At some point, equipment was invented for fishing boats that let vessels fish successfully even at low fish population density. Previously they could not, and whenever the population density dropped, the industry stagnated for a while. The new equipment improved efficiency and enabled even smaller vessels to keep operating in such conditions.
Unfortunately, the consequences were similar to those of desertification on land. The density of the fish population fell below the critical level at which it could still reproduce on its own. As a result, technological progress, which brought an increase in efficiency, destroyed the fishing industry in a number of regions of the planet.
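Out of curiosity, here is a minimal simulation of that dynamic (not from the book; the parameters are invented): a renewable fish stock that cannot regenerate below a critical density, harvested by gear that either does or does not keep working when the fish become scarce.

```python
# Toy model (my sketch, made-up numbers): a fish stock with a critical density
# below which it can no longer regenerate. "Efficient" gear keeps the harvest
# going even at low density, which pushes the stock past that threshold.

def simulate(years=80, stock=900.0, capacity=1000.0, regen_rate=0.15,
             critical=200.0, workable=350.0, harvest=60.0, efficient_gear=True):
    """Yearly fish stock under a fixed harvest.

    critical - density below which the population can no longer regenerate
    workable - density below which the *old* gear can no longer find fish
    """
    for _ in range(years):
        regen = regen_rate * stock * (1 - stock / capacity) if stock > critical else 0.0
        can_fish = efficient_gear or stock > workable
        catch = min(harvest, stock) if can_fish else 0.0
        stock = stock + regen - catch
    return stock

print(round(simulate(efficient_gear=False)))  # stays near a viable level: old gear stops at low density
print(round(simulate(efficient_gear=True)))   # collapses to zero once density falls below the critical level
```

With the old gear the catch stops as soon as fish become hard to find and the stock recovers; with the efficient gear the stock is pushed below the critical density and collapses.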

Why systems are so effective:
1) they are resistant to external influences;
2) they are capable of self-organization;
3) they have a hierarchical structure.

Loss of stability is usually completely unexpected, since the system itself usually devotes all its attention to its actions, and not to the causes that underlie them. One day, the system will perform the usual actions that it has done many times, but this time they will lead to destruction.

The ability of systems to complicate their own structure is called self-organization. It requires freedom, the opportunity to experiment and some creative mess.

Hierarchy allows you to reduce the amount of information that needs to be stored and tracked by each part of the system.

The original purpose of the upper levels of the hierarchy is to help the lower levels achieve their goals. The purpose of each of the subsystems in the hierarchy structure must correspond to the overall purpose of the system. When the interests of a subsystem are achieved at the expense of the interests of the system as a whole, this leads to the destruction of the entire system (an example is a cancerous tumor).

Existing models correlate well with reality, but they are far from representing the world in its entirety. The fact that we live in a world of models that simplify the perception of a complex world creates the basis for unexpected and unpredictable behavior of systems that existing models cannot explain, and therefore cannot predict.

The structure of the system determines its behavior. The behavior of the system manifests itself in the form of events occurring in a certain sequence.
Thus, when studying a system, it is necessary to look for the structure of the system behind a series of events, and not try to connect the events themselves.

Most processes in the world develop according to non-linear laws. At the same time, most people tend to have linear thinking.

The main development of the economy took place at a time when capital and labor were the main limiting factors. Since then, the world has changed and now the more significant limiting factors are clean water, acceptable energy sources, waste disposal methods. But management is still carried out in terms of only capital and labor (by the way, this is an example of the delay in the management education system).

A real understanding of growth means shifting attention from the factors that are abundant to those that may become scarce, that is, the factors that will become limiting in the future. Only then can growth be truly managed.

The theory of bounded rationality: people make rational decisions based only on the information available to them at the moment, and perfect information does not exist.

Levers of influence on a system (from weakest to most powerful):
12) Numerical indicators: variables, constants (tax level)
11) Stocks (the size of the stock has a stabilizing effect if it is large enough compared to the size of the flows)
10) Structure of stocks and flows (changes in the structure of roads in Moscow may lead to less congested traffic)
9) Lags (changing lags in feedback loops has a strong effect on the system)
8) Balancing feedback loops (depriving the system of balancing feedback loops makes it unstable)
7) Reinforcing feedback loops (slowing growth is often a far more powerful lever than strengthening balancing loops)
6) Information flows (who has access to information and who does not)
5) Rules, incentives, punishments, coercion (imagine what will happen to the university if you cancel the assessment of students' knowledge and introduce an assessment of the quality of teachers' work - the system will change dramatically)
4) Self-organization (adding, changing and transforming system structures; an example of the influence of self-organization on the system is the history of human development)
3) Goals, purpose and function of the system (Gorbachev did not change the country, but changed the global goal of the country and society)

2) The worldview within which the system is built

Here I want to dwell on this in more detail. The main idea of this lever is as follows: any expansion of the system of concepts within which a system "thinks" can lead to strong changes in the external manifestations of the system's behavior.
We build skyscrapers because we believe downtown land is expensive; the Egyptians built the pyramids because they believed in an afterlife.
Einstein with his ideas about the transformation of matter and energy, Adam Smith with the ideas that the selfish motives of an individual lead to the common good, Sergey Brin and Larry Page with the idea to rank search results based on links from other sites - they all caused a revision of the established concepts and attitudes, and used this as a powerful lever to change the respective systems.

1) Expanding the boundaries of the worldview

A person should not be a slave to theories and ideas, but should remain flexible and free. No theory can claim absolute truth.
If all representations are incorrect to one degree or another, then any of them can be chosen that allows you to achieve your goal.

And in conclusion, another interesting idea from the book.
By itself, systems thinking will not take the step from understanding to action, but it will get the most out of analysis, and then point out what a person can and should do.

In conclusion

The ABC of Systems Thinking helped me find a clear and understandable explanation of what analytics is in a company. In fact, this is a feedback loop on the actions taken by the company.

Donella Meadows. The ABC of Systems Thinking

Anyone who reads the ABC of Systems Thinking will notice the appalling ignorance and incompetence of our politicians, economists and managers in dealing with complex dynamic systems. Donella Meadows' book shows how you can achieve sustainable and beneficial results for all. Only by overcoming ignorance can we reach a better future.

It is curious that among her colleagues in the preface to the book, Donella mentions Peter Senge (the author of The Fifth Discipline)...

One of the objectives of the book is to teach readers to understand the basic behavior of complex systems so that they can interact with them successfully.

Introduction: Systemic Perspective

Decision makers are faced not with separate, independent problems, but with an ever-changing situation in which complex combinations of changing problems interact and influence each other. I call this a mess... Decision makers don't solve problems, they manage messes.
Russell Ackoff, management theorist

Why does a Slinky spring (as on the cover) oscillate up and down? The answer lies in the design of the toy. Hands only suppress or release behavior that is intrinsic to the spring.

The essence of the systems approach. Once the relationship between structure and behavior can be established, we begin to understand how systems work, why they produce certain results, and how to change behavior to achieve better results. Systems thinking allows you to identify the real causes of problems and find ways to solve them.

The response of the system to external influences is primarily a property of the system itself. The system itself determines its behavior in the long run. External influences can release and activate the behavior of a system, but the same external influence applied to another system is likely to lead to completely different results. For example, companies rarely lose market share to competitors. Those, of course, will benefit from this, but the reasons for the losses lie (at least in part) in the business policy of the company itself.

We were all taught to analyze, to use rational thinking, to establish direct connections between cause and effect, to learn new things in small portions that are easy to understand. We were also taught that problems can be solved by taking concrete action, that the world around us can be controlled. Such training allows us to gain individual and public power, but it also leads us to blame all our problems on presidents, on competing companies in the market, on organizations like OPEC...


Some system archetypes:

· Since feedback is delayed in complex systems, by the time a problem becomes apparent, it is already much more difficult to resolve. - Problems grow like a snowball.

· According to the principle of competitive exclusion, if a reinforcing feedback loop rewards the winner of a competition with further gains, then sooner or later most competitors will be eliminated from the competition. The rich get richer and the poor get poorer.

· Diverse systems with a large number of connections and backup cycles show greater stability and less exposure to external influences than homogeneous systems with little diversity. - Don't put all your eggs in one basket.

Psychologically and politically, we tend to look for the cause of problems outside rather than inside. It is almost impossible to resist the temptation to blame someone or something external, because that allows us to shift responsibility onto someone else. Then we only need to find the cherished control button, take a magic pill, create the desired type of product - that is, find a technical means of eliminating the problem. We are accustomed to solving serious problems by focusing on external factors… Improvement can only be achieved if people start using their intuition, stop looking for someone to blame, understand that the source of problems is systemic, and dare to change the structure.

Systems thinking and the systems approach allow us to use intuition to:

develop the ability to understand the constituent parts of systems;

capture relationships;

ask "What if...?" questions and analyze the future behavior of systems;

be able, and not be afraid, to change the structure of the system.

And then we will be able to change ourselves and the world around us.

Chapter 1. System Structures and Behavior: The Basics

A system is a set of elements interconnected and acting in concert to achieve a specific goal. Systems are built on three essential components: elements, interconnections, and a purpose (or function).

You think that if you know what "one" is, then you know what "two" is, because one and one will be two. But you forget that you must understand what "and" is.

Sufi parable

Many interconnections in systems are realized through information flows. Information binds the system into a single whole and largely determines its behavior. The best way to establish the purpose of a system is to observe its behavior for a while.

Usually the system remains itself and changes very slowly (if at all), despite the complete renewal of its elements - as long as the goals of the system and the structure of relationships are preserved. But if the relationship changes, then the system can undergo significant changes.

The least obvious part of the system - its purpose or function - has a decisive influence on the behavior of the system.

Changing the goal can completely transform the system, even if all its elements and relationships remain in place. Elements - those parts of the system that are easiest for us to notice - most often (though not always) have the least impact on the distinctive features of the system. But only if changing an element does not change the relationships or the goal.

A stock (or level) is something that is available in a certain quantity, accumulated over a certain period of time, stored in material form or in the form of information. Stocks and levels reflect the history of changes in flows in the system. Levels change over time as a result of the action of flows. Flows can be inflows, which increase the level, or outflows, which decrease it.

How to read the flow diagrams. In this book, stocks or levels are represented by rectangles, and flows by "pipelines" with arrows leading into or out of the rectangle. Each pipeline has a "valve" that can be opened more or less to regulate the flow, or kept fully open or fully closed. The "clouds" at the beginning and end of the diagram symbolize the source and sink of the corresponding flow; their physical nature does not concern us.

If the sum of all outflows is equal to the sum of all inflows, the stock level will remain unchanged; in such cases, a dynamic equilibrium is established at the level that was observed at the moment the flows became equal. People tend to pay attention primarily to stocks, not to flows. And if we do pay attention to flows, it is first of all to the inflows, and only then to the outflows.
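The bathtub logic of stocks and flows is easy to see in a few lines of code. A minimal sketch (mine, not the book's; the numbers are arbitrary):

```python
# A stock changes only through its inflow and outflow. When the two flows are
# equal, the level settles into dynamic equilibrium even though "water" keeps moving.

def run(level, inflow, outflow, steps=10):
    for _ in range(steps):
        level = max(level + inflow - outflow, 0.0)
    return level

print(run(level=50, inflow=5, outflow=5))   # 50.0: flows balance, level unchanged
print(run(level=50, inflow=8, outflow=5))   # 80.0: inflow dominates, the stock rises
print(run(level=50, inflow=5, outflow=8))   # 20.0: outflow dominates, the stock falls
```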

Similarly, companies can maintain a large workforce either by hiring more new employees or by taking steps to ensure that the existing ones don't quit (and that there is no reason to fire them). Moreover, the costs of these two strategies can be very different.

Changing stocks and levels takes time. To be effective, flows must operate for some time. This is the key to understanding the behavior of systems. Stocks usually change slowly, even if the inflows and outflows change very sharply. This creates delays and serves as a kind of buffer in the system, softening external influences.

The lags caused in systems by slow changes in stocks can cause problems, but they can also contribute to system stability. If you have an idea of the rate at which stocks change, you will not expect rapid movement where it cannot be rapid in principle, and you will not give up ahead of time. Stocks perform another very important function in the system, and it leads us straight to the concept of feedback. Having a stock allows inflows and outflows to exist independently of each other. For a while, the system can allow these flows to be out of balance.

The value of stocks is monitored constantly, and based on these data, decisions and measures are taken to increase or decrease stocks or to maintain them within certain limits. Systems dynamics scientists think of the world as a collection of stocks with mechanisms that regulate their levels by controlling flows. Systems thinkers view the world as a collection of feedbacks.

Demonstrating a certain type of behavior for an extended period of time is the first sign that there is feedback in the system. Feedback loops can keep a stock within certain limits, or cause it to rise or fall. In any case, the flows into or out of the stock vary depending on the size of the stock itself.

A feedback loop is a chain of cause-and-effect relationships that originates from the stock and returns to it. Relationships are implemented through a set of decisions, rules, physical laws or actions that depend on the size of the stock itself. A change in stock causes a change in flow, which in turn causes a further change in stock, and so on.

Feedback loops that stabilize the stock at some level, allowing it to be adjusted and reach the desired value, are called balancing feedback loops. Inside such a cycle, the letter “B” is placed on the diagram. Balancing cycles tend to achieve some value, to stabilize.

Balancing feedback loops serve as equalizing structures in the system: they allow it to reach a desired value, and they act both as a source of stability and as a source of resistance to change.

Reinforcing feedback loops feed on themselves, causing the system to grow exponentially or even run out of control. They occur in systems whenever a stock has the ability to reproduce itself or some part of itself.

Reinforcing feedbacks are very common, so it's useful to know one of their characteristics: doubling time - the time it takes exponential growth to double the stock - is approximately 70 divided by the growth rate expressed as a percentage. For example, if you put $100 in the bank at 7% per annum, the amount will double in 10 years (70 / 7 = 10). If the bank rate is only 5%, it will take 14 years to double the amount in the account.
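The rule of 70 is easy to verify against the exact formula for doubling time. A quick check (my sketch, not from the book):

```python
# Doubling time under exponential growth vs the "rule of 70" back-of-the-envelope estimate.
import math

for rate_pct in (5, 7, 10):
    exact = math.log(2) / math.log(1 + rate_pct / 100)   # exact doubling time in years
    rule_of_70 = 70 / rate_pct                           # rough estimate
    print(f"{rate_pct}% per year: exact {exact:.1f} yrs, rule of 70 gives {rule_of_70:.1f} yrs")
```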

If you start noticing the action of feedback everywhere, you are turning into a systems thinker. Instead of noticing only how A causes B, you will start to wonder whether there is also an influence of B on A, and whether A strengthens (or weakens) itself. When you hear on the radio that the Federal Reserve is taking some action to regulate the economy, you immediately conclude that the economy has somehow affected the Federal Reserve. And when someone tells you that population growth causes poverty to spread, you ask yourself, "Can't poverty lead to population growth?"

Instead of looking for someone to blame, you ask yourself, "What is this system?" The concept of feedback leads us to the realization that a system can be the cause of its own behavior.

Chapter 2. A Brief Overview of Different Types of Systems

Systems with one stock. A stock and two balancing feedback loops competing with each other: this is how a heater with a thermostat works.
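A rough sketch of the thermostat example in code (my illustration, with invented parameters): two balancing loops act on the same stock, the heater pulling room temperature up toward the setting and heat leakage pulling it down toward the outside temperature.

```python
# Two competing balancing loops acting on one stock (room temperature).
# All coefficients are made up for illustration.

def room(setting=20.0, outside=-5.0, hours=48,
         heater_gain=0.4, leak_rate=0.02, temp=10.0):
    for _ in range(hours):
        heating = heater_gain * max(setting - temp, 0.0)  # balancing loop 1: heater
        leak = leak_rate * (temp - outside)               # balancing loop 2: heat loss
        temp += heating - leak
    return temp

print(round(room(), 1))  # about 18.8: a bit below the 20-degree setting
```

In this toy model the room settles a little below the setting, at the point where the two competing loops balance each other.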

The information received through feedback can only influence future behavior; within the system, information spreads with a delay, and the impact cannot be so fast as to instantly correct the behavior that caused the current feedback. A decision maker acting on feedback cannot change the current behavior of the system that produced that feedback; any decisions made will only affect its behavior in the future. Dominance is a very important concept in systems thinking: if one loop dominates another, it determines the behavior of the system to a greater extent. System dynamics analysis is not designed to predict what will happen. It allows you to find out what can happen if certain driving forces behave one way or another.

A lag in a balancing feedback loop causes the system to oscillate. Delays and lags can have a very strong effect on systems; in many ways they determine how systems behave.
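To see how a delay alone turns a smooth adjustment into oscillation, here is a toy inventory model (my sketch, invented numbers): orders respond to the gap between a target stock and the current stock, and deliveries arrive after a lag.

```python
# The same balancing loop, with and without a delivery delay.
from collections import deque

def run(delay, steps=30, stock=100.0, target=200.0, adjust=0.5, sales=20.0):
    pipeline = deque([sales] * delay)   # orders in transit; start at the steady-state rate
    trace = []
    for _ in range(steps):
        order = max(adjust * (target - stock) + sales, 0.0)
        pipeline.append(order)
        delivery = pipeline.popleft()   # with delay=0 this is the order just placed
        stock += delivery - sales
        trace.append(round(stock))
    return trace

print(run(delay=0))  # smooth approach to the target of 200
print(run(delay=3))  # overshoot and damped oscillation around 200
```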

Systems with two stocks. A renewable stock constrained by a non-renewable stock: the economics of the oil industry. Any physically growing system will sooner or later encounter one or another type of limitation. Such a constraint acts as a balancing feedback loop that will somehow reverse the dominance of the reinforcing loop responsible for growth, either by increasing output flows or by decreasing input flows to the system. Growth in an environment that imposes its own limitations is very common. So common that systems thinkers call it the "limits to growth" archetype.
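A rough sketch of this two-stock archetype (not from the book; parameters invented): capital grows through reinvestment (a reinforcing loop), but extraction depends on a finite resource, so growth eventually reverses.

```python
# Capital reinvests itself while drawing down a finite resource; extraction gets
# harder as the resource is depleted. Made-up numbers for illustration only.

def oil_economy(years=60, capital=10.0, resource=1000.0,
                reinvest=0.10, depreciation=0.05, yield_per_capital=1.0):
    trace = []
    for year in range(years):
        extraction = yield_per_capital * capital * (resource / 1000.0)
        resource = max(resource - extraction, 0.0)
        capital += reinvest * extraction - depreciation * capital
        trace.append((year, round(capital, 1), round(resource)))
    return trace

trace = oil_economy()
print(max(trace, key=lambda t: t[1]))  # the year capital peaks, before the decline
print(trace[-1])                       # capital and resource at the end of the run
```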

A renewable stock constrained by the limits of the renewable stock itself: the economics of the fishing industry.

Non-renewable resources are limited by their reserves. Renewable resources are limited by their rate of regeneration.

Chapter 3. Why Systems Are So Effective

It is important to be able to distinguish three main qualities inherent in systems: resistance to external influences, the ability to self-organize and a hierarchical structure.

Resistance to external influences. Resilience is the ability to restore one's shape, to return to the original position and state after an external impact. This ability arises from a complex structure of numerous feedback loops.

Populations and ecosystems also have the ability to "learn" and evolve thanks to their incredibly rich genetic diversity. Resilience is not synonymous with immobility or constancy. Unchanging, static systems can, on the contrary, be very fragile. Static stability can be seen; its parameters can be measured at any moment. Resilience, the ability to endure external influences, is extremely difficult to discern. Because resilience is not obvious (unless you use a systems approach), people often neglect it and strive for visible stability, productivity, or other easily recognizable characteristics and qualities of a system.

Large organizations of every kind, from corporations to governments, become unsustainable simply because the feedback mechanisms by which they receive information and respond to environmental conditions must overcome too many successive delays and distortions.

Systems need to be managed with more than just performance or stability in mind. It is necessary to maintain their stability and resilience - the ability to withstand external influences and successfully recover from them.

The ability of systems to complicate their own structure is called self-organization. The ability to self-organize is often traded off in favor of short-term gains in productivity and stability, just as stability is. Productivity and stability are the most frequent arguments for turning people, beings of inherent talent and creativity, into primitive mechanical appendages to production processes. These same motives underlie bureaucratic systems and management theories that operate on people as if they were soulless units.

The ability to self-organize generates heterogeneity and unpredictability. It can grow new structures, create new ways of being and activities. Many governments are very reluctant to see their populations organize themselves. Sometimes they try to ban self-organization, hiding behind the name of law and order, and then long periods of stagnation and grayness set in, ruthless to any creative undertaking. Systems theorists used to believe that self-organization is such a complex property of systems that it is unknowable in principle. But time has passed, and new discoveries have shown that a few simple principles of organization are enough to get the widest variety of self-organizing structures.

The perimeter of the Koch snowflake can be increased to infinity, yet the "snowflake" encloses a finite area. This figure is one of the simplest examples of fractals, self-similar objects.
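This claim is easy to check numerically. A small sketch (mine, not the book's): at each iteration every side sprouts a bump one third its length, so the perimeter keeps multiplying by 4/3 while the added area shrinks geometrically and converges to 8/5 of the original triangle.

```python
# Perimeter and area of the Koch snowflake, iteration by iteration.
import math

side, n_sides = 1.0, 3
area = math.sqrt(3) / 4          # area of the starting equilateral triangle
for i in range(8):
    print(f"iteration {i}: perimeter {n_sides * side:8.3f}, area {area:.5f}")
    area += n_sides * (math.sqrt(3) / 4) * (side / 3) ** 2  # bump added on every side
    side /= 3
    n_sides *= 4
# The perimeter grows without bound; the area converges to 8/5 of the triangle (about 0.6928).
```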

When new structures are created in self-organizing systems and complexity increases, very often there is a hierarchical subordination, a hierarchy. Hierarchy is a remarkable invention in the systems world, not only because it makes systems stable and resilient, but also because it reduces the amount of information that each part of the system must constantly store and keep track of. In hierarchical systems, the relationships within each subsystem are closer and stronger than the relationships between subsystems.

The original purpose of any hierarchy is to help the subsystems that created it perform better. Unfortunately, by the time the hierarchy is well developed, this original goal is quite often completely forgotten by both upper and lower levels. Hierarchies begin to work inappropriately; it is for this reason that many systems never achieve their goals.

If the cells of the body stop performing their functions within the hierarchy and begin to divide uncontrollably, we call it cancer. If students believe that their main task is to get good grades (and not knowledge!), then wholesale cheating and the use of cheat sheets begin, leading to the opposite results. When the interests of a subsystem are pursued at the expense of the interests of the system as a whole, this behavior is called suboptimization. Not only suboptimization but also excessive control, up to complete centralized control, can harm the system. There are many examples in the economy of excessive control from the center, whether it concerns individual enterprises or entire countries.

Chapter 4. Why Systems Behave So Unexpectedly

The behavior of even the simplest systems can puzzle you. The fact that systems behave unexpectedly says something not only about systems but also about ourselves. Comparing what I know about the real world with what I know (or think I know) about dynamical systems always shows that our level of knowledge should not be overestimated. Most people do not expect how fast exponential growth can proceed. And few people can intuitively grasp how to dampen oscillations in a complex system.

Everything we think we know about the world is a model. Our models correlate very well with reality, but at the same time they are far from representing the world in its entirety.

Dynamical systems often behave unexpectedly. This is a consequence of the fact that our mental models fail and cannot accurately describe the real world. You need to know what false boundaries and bounded rationality are, and not forget about limiting factors, non-linear dependencies, and delays. If you do not take into account the key properties of systems - resilience, self-organization, and hierarchical structure - their structure and behavior will be misinterpreted, and it will become impossible to interact with them successfully.

Systems can mislead us by the way they present themselves (or we are "happy to be deceived") - as a sequence of events. Events are the visible part of the iceberg, and not the most important one. We tend to be less surprised if we can catch a certain sequence, a dynamic pattern of behavior, in events. The behavior of a system is its performance over time - growth, stagnation, decline, oscillations, random fluctuations, evolutionary change. When a systems thinker discovers a problem, the first thing they do is collect data about the system's history, including graphs of its behavior over time. Behavior over a long period allows you to get close to the structure of the system underlying that behavior. And structure, in turn, is the key to understanding not only what is happening, but also why. The structure of a system is a combination of stocks, flows, and feedbacks. The structure determines what behavior is inherent in the system. A balancing feedback loop that seeks to achieve a specific goal brings about dynamic equilibrium and then maintains it. A reinforcing loop generates exponential growth. Linked together, these loops can produce growth, decline, and equilibrium. If, in addition, they contain delays, oscillations can also occur.

Most analytical reviews in the world are devoted to events, even though this is a very superficial approach. Such explanations make it impossible to predict what will happen next. Based on them, it is impossible to change the behavior of the system. Economic analysts sometimes go one level deeper, to the behavior of a system over time. Econometric models try to find statistically significant relationships between past trends. Behavior-based models are more useful than event-based models, but they also have fundamental drawbacks. First, they tend to overemphasize flows and underestimate stocks. Secondly (and this is a more serious drawback), in trying to determine the statistical relationships between flows, econometricians are looking for something that does not really exist. There is no reason to believe that one flow has any persistent relationship with any other flow. Flows increase and decrease, arise and dry up, in various combinations, and this happens depending on the values of stocks, not of other flows. Behavior-based econometric models are well suited for short-term forecasting in the economy, but they are completely unsuitable for long-term forecasts. And in questions of how to improve the state of the economy, these models make no sense at all.

This is another reason why the behavior of systems so often surprises us. Current events absorb all our attention. We do not study their history, and we do not have enough experience and knowledge to move from history to the structure of the system. Yet it is the structure that determines the behavior of the system and the sequence of events.

Linear thinking in a non-linear world. There are a lot of non-linear dependencies in the world. Our habitual thinking is linear, which is why we run into so many surprises. Non-linearities are important not only because such connections between action and response do not correspond to our expectations. They are important primarily because they change the relative power of feedback loops. They can force the system to switch from one kind of behavior to another.

Non-existent boundaries. There are no separate, isolated systems. The world is continuous. Where to draw an artificial boundary around a system depends on our purpose - on what questions we need to answer. There is no single boundary around a system, defined once and for all. We have to invent boundaries in order for the model to be intelligible and adequate. If we forget that we erected these boundaries ourselves, big problems can arise. Ideally, for each new problem we need to search anew for suitable boundaries, and this requires a certain flexibility of thinking. Few people have it. We remain bound to the mental boundaries that we once chose and have grown accustomed to.

Layers of limits. The law of the limiting factor: at any given time, the most important input to a system is the one that has the strongest limiting effect. Any growth narrows or expands the limits and thereby changes the nature of the limitation: another factor becomes the limiting one. A real understanding of growth means shifting attention from factors that are abundant to those that may become scarce, that is, the factors that will become limiting in the future. Only then can the growth process be truly managed. Growth will always have limits. They can be chosen deliberately; if they are not, the system will impose them itself.

Omnipresent delays. Delays are present everywhere, in all systems. Any stock is necessarily associated with a delay. Most flows have delays: delivery delays, perception delays, production delays, development delays. If there are long delays in feedback loops, then the ability to anticipate is necessary to control the system. By the time the problem becomes apparent, the main opportunities to solve it will have already been missed.

Bounded rationality. The theory of bounded rationality assumes that people make perfectly rational decisions, but based only on the information available to them at the moment. Perfectly complete information does not exist, especially about remote parts of the system. Fishermen do not know how many fish are left, much less how many fish other fishermen will catch on the same day. Entrepreneurs cannot know what other entrepreneurs plan to invest in, what buyers will want to buy, or how competitive their products will be.

Instead of finding the optimal solution in the long term, we choose from a rather limited list of short-term solutions and stick to that tactic stubbornly. Only a completely deadlocked situation can force us to change behavior. We misjudge risk, believing that something is too dangerous when in fact the danger is exaggerated, while at the same time neglecting real dangers. Our attention is absorbed by the present; too much attention is given to recent events and too little to the past. The theory of bounded rationality challenged the political economy of Adam Smith, which had dominated for two hundred years.

To make a difference, one must first go beyond the information available at a given moment and gain a general understanding of the system as a whole. It is simply amazing how quickly and easily behavior changes if bounded rationality is pushed back even a little by more complete and timely information. The bounded rationality of each participant in a system can lead to decisions that are not at all favorable for the system as a whole.

Chapter 5. System Pitfalls and Opportunities

To make complex systems less of a puzzle, we must learn to understand their behavior and to appreciate and use the complexity of the world. The system structures that generate commonly encountered problematic behaviors are what we call archetypes. Simply understanding the structure of the archetypes that generate problematic behavior is not enough, and it is useless to try to force them into some kind of framework; their structure must be changed. The blame for the destruction they can lead to is often placed on individual participants in the system or on particular events, but in fact all of this is a consequence of the very structure of the system.

Resistance to external influence: unsuccessful attempts to fix everything. Resistance to external influence stems from the bounded rationality of the participants in the system, each of whom pursues their own goals. One way to overcome such resistance is to overcome it by force. Another way is so contrary to intuition that it usually does not even occur to anyone: back off. Stop the outside pressure, since it is not producing results anyway, and redirect the forces and means of all sides from confrontation to something more important and useful. The third and most effective way to overcome resistance is to align the goals of the subsystems in some way: for example, to propose a common goal to all participants that allows them to go beyond their own bounded rationality.

Tragedy of the commons (shared resources). In any such system there is, first of all, a resource in common use. After a certain critical point, a pattern kicks in: the less of the resource remains, the lower its ability to restore itself, and the sooner it will be destroyed completely. The tragedy of the commons occurs where the feedback from the resource either arrives very late or arrives at the wrong place and does not limit the number of resource consumers. The tragedy of the commons can be avoided, and there are three ways:

educate and persuade;

privatize the shared resource, dividing it in such a way that everyone reaps the consequences of their own actions;

regulate the commons; regulation can take many forms, from outright bans on certain activities to the allocation of quotas, the issuance of permits and licenses, and the imposition of taxes and economic incentives.

Striving for the worst. Not only do some systems resist outside influence and stay in a perpetually bad state, their situation keeps getting worse. Examples are the loss of market share in business, the constant deterioration of the quality of service in hospitals, the ever-increasing pollution of rivers and air, gaining weight in spite of all diets. When the current state changes, the best results are questioned and discarded, while the worst ones stick in memory. Reference points are not absolute. When the perceived state creeps down, the goals become more modest. The worse the perceived state of the system, the lower the desired state becomes. The lower the desired state, the smaller the difference between the perceived and the desired, and the weaker the measures we take. The weaker our actions, the worse the state of the system. If left unchecked, this loop leads to a permanent deterioration of the system. Other names for this trap: "eroding expectations", "eroding goals", "boiled frog syndrome". If the state of the system deteriorated all at once, we would immediately react and take action. But the deterioration is so slow that we manage to forget, or simply do not believe, how much better things used to be. Everyone slips into complacency: expectations keep shrinking, we make less and less effort, and the state gets worse and worse. There are only two ways to escape eroding expectations. One is to keep reference points absolute, independent of the perceived state. The other is to make expectations depend on the best state in the past, not the worst.
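The loop is easy to reproduce. A toy model of eroding goals (my sketch, invented numbers): when the desired state is anchored to the perceived state, every dip quietly lowers the goal; an absolute reference point stops the drift.

```python
# Eroding goals: the goal follows performance downward unless it is held absolute.

def drift(years=40, state=100.0, goal=100.0, anchor_goal_to_state=True,
          goal_adjust=0.2, effort_gain=0.5, decay=5.0):
    for _ in range(years):
        if anchor_goal_to_state:
            goal += goal_adjust * (state - goal)      # expectations follow performance
        effort = effort_gain * max(goal - state, 0.0) # a weaker goal means weaker effort
        state += effort - decay                       # constant erosion pulls the state down
    return round(state), round(goal)

print(drift(anchor_goal_to_state=True))   # both the goal and the state drift far below 100
print(drift(anchor_goal_to_state=False))  # the goal holds at 100 and the state settles near 90
```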

Escalation of conflict. "I'll hit back!" is a solution that leads to escalation. A reinforcing loop is at work, in which each side tries to overpower the other. The target for each part of the system is not absolute (it cannot be set precisely, like a desired room temperature of 18°C); it depends on the state of the other part of the system. The most famous and frightening examples are the arms race and those hot spots on the planet where irreconcilable enemies live in close proximity and are constantly on the verge of armed conflict. Another example is price wars and dumping: one competitor reduces prices, which forces the other to lower prices even more, so the first is forced to cut the price again. Escalation is driven by a reinforcing feedback loop, and it is exponential. One way out of the escalation trap is for one of the parties to disarm voluntarily, to take a step back of its own free will; then, after a while, the competitor will also retreat. From the point of view of ordinary logic this solution is paradoxical, but in real life it can work if the retreating side acts with determination and is strong enough to endure a short period during which the competitor has the advantage. There is another, more attractive way to stop escalation: to agree on mutual disarmament. This changes the structure of the system: a new set of balancing control loops is created that does not allow the competition to go beyond certain limits.
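A toy model of the escalation loop (my sketch, invented numbers): each side sets its level a margin above the other side's last move, so both grow exponentially until one of them stops trying to outdo the other.

```python
# Escalation: two sides each aim a fixed margin above the other's last level.

def escalate(rounds=12, a=10.0, b=10.0, margin=1.2, a_backs_off_at=None):
    history = []
    for r in range(rounds):
        if a_backs_off_at is not None and r >= a_backs_off_at:
            a = min(a, b)            # A stops trying to outdo B
        else:
            a = margin * b           # A aims 20% above B's last level
        b = margin * a               # B answers in kind
        history.append((round(a), round(b)))
    return history

print(escalate()[-1])                  # both sides end up dozens of times above where they started
print(escalate(a_backs_off_at=6)[-1])  # escalation stops once one side stops answering in kind
```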

Success to Success: Competitive Exclusion. This systemic trap occurs wherever the winner of a competition receives not only a reward, but also the opportunity to become even more competitive in subsequent competitions. If the competition is played in a limited space, where the reward for the winner is something that is taken from the losers, then sooner or later the losers will go bankrupt, or be forced out, or will live in poverty. "Success to success" is a well-known phenomenon in ecology, only there it is called the principle of competitive exclusion. This principle states that two different species living on the same resources cannot coexist in the same ecological niche. Two firms competing in the same market are like two species in the same ecological niche, their behavior will be the same. Some believe that the collapse of the USSR refutes the theory of Karl Marx, but in fact his claims that competition in the market ultimately leads to the absence of competition are confirmed wherever there is or was such competition. Species in nature and companies in the market sometimes manage to break out of the system of competitive exclusion through more diversified development. Companies may develop a new product or service that is not in direct competition with existing products and services. Markets tend to create monopolies, and ecological niches tend to the survival of a single species, but they also branch off and create diversity, new markets, new species. Over time, of course, they will also encounter competitors, and then the system will again strive for competitive exclusion. The obvious way out of the success-to-success archetype is to periodically bring everyone down to the same level. Tax legislation, for example, may provide for a progressive scale of taxation. Leveling mechanisms may stem from public morality, or they may be the result of a purely practical consideration: if the losers cannot continue the game due to the “success to success” trap, if they have no chance of winning, then in desperation they can destroy the entire playing field ...

Supporting remedies: addictions and manias. (In the Russian-language literature on management and business, the name of this archetype is often translated as "problem substitution." In fact, we are talking about the emergence of a harmful dependence on various kinds of supporting remedies that reduce or mask symptoms but do nothing to really solve the problem.) In some systems, supporting remedies are genuinely needed. But they can turn into a system trap. Suppose the control feedback in the system fails to maintain the desired state, or does so inefficiently. A well-chosen and effective supporting remedy takes on some of the load. It allows the system to reach the desired state quickly. Everything is great, everyone is happy. But then the original problem appears again, because no one has eliminated its cause. So the supporting remedy has to be applied again, and in a larger amount. The real state of the system is masked again, the problem is again not eliminated, and you have to resort to the remedy again and again. The trap arises when the supporting remedy directly or indirectly undermines the system's original ability to maintain its own state. If this ability atrophies, more and more support is needed to achieve the desired effect, which weakens the system's own capability even further, and everything goes around in circles again. Addiction provides a quick response to a symptom of the problem, one fraught with consequences, and prevents effective action from being taken to eliminate the cause itself and thereby truly solve the problem. Are insects threatening the crops? Why reconsider farming practices and abandon monoculture, why study the destruction of the natural mechanisms that used to keep insect numbers in check, when you can simply apply pesticides? The locust invasion will be repulsed, and it will be possible to expand monoculture crops and destroy natural ecosystems even more. True, then there will be more insects, but we will use pesticides again, in even greater quantities. Problems can be avoided only if the remedies used are those that increase the system's own ability to keep its state within the desired limits.

Rule manipulation. Wherever there are rules, there will be attempts to circumvent them. Manipulating the rules means that you are distorting their meaning - following the letter, but violating the spirit, neglecting what these rules were created for. Rules must be created taking into account the entire system, including those of its self-organizing parts that can evade the implementation of the rules. In a system, the rules are usually manipulated by the lower hierarchical levels, and often in response to too rigid, harmful, unsuitable, untenable, ill-defined rules from above. Rules should be formulated in such a way as to direct the possibilities of self-organization in the system in a positive direction.

Striving for the wrong goal. One of the most powerful ways to influence the behavior of a system is to change its purpose or function. The system, like the wish-granting golden fish in the fairy tale, will deliver not what you really wanted, but what you literally asked for. If the quality of education is measured by the results of standardized tests, then the system will strive to produce results on standardized tests. The grossest mistake of this kind is the adoption of GNP as the indicator of a country's economic success. GNP measures everything except what is truly worth living for.

Chapter 6. Leverage Points: Places to Intervene in a System

We have come to the question of how we should change the structure of systems so that they produce more of what we want and less of what we do not want. Any intelligent manager can state the essence of a problem convincingly, identify the system structure that causes it, and indicate fairly precisely where to look for points of leverage - places in the system where a small change can cause a significant change in the behavior of the system as a whole. Although people tend to know intuitively where to look for leverage points, they often push them in the wrong direction. Key points and levers of influence are difficult to grasp intuitively. Let's move on to the list of leverage points, arranged in order of increasing importance.

12. Numerical indicators: variables and constants, as well as subsidies, taxes, and standards. Numerical characteristics and the magnitudes of flows occupy the last, twelfth place on my list, because they lead to the smallest changes in the system or to none at all. This is about the same as rearranging the deck chairs on the Titanic in the hope that it will stop sinking. Perhaps 90%, no, 95%, or rather even 99% of our attention is focused on numerical parameters, yet there are practically no key ones among them that could be used as leverage. The size of taxes and the minimum wage matter a great deal to people; heated debates constantly revolve around these indicators. But changing these parameters almost never changes the behavior of the country's economic system. If the system is in chronic stagnation, changing the parameters will not give it a starting impulse. If the system is tossed from one extreme to another, changing numerical parameters will not help stabilize it. If it grows unchecked, numerical changes will not stop or even slow that growth. Parameters become key only if they can seriously affect one of the higher items on my list.

11. Buffers: the size of stabilizing stocks relative to the size of the flows. Stocks that are large relative to flows are much more stable than small stocks. Often a system can be stabilized simply by increasing the buffer size. But if the buffer is too large, the system loses its flexibility; it reacts too slowly. In addition, large buffers of some types, such as reservoirs or warehouses, require high construction and maintenance costs. Sometimes this leverage point - increasing or decreasing the size of a buffer - changes the behavior of the system as if by the wave of a magic wand. But buffers are usually physically large and not easy to change. That is why buffers are near the bottom of my list of leverage points.
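A quick sketch of the buffer effect (mine, not the book's; numbers invented): the same noisy inflow hits a small stock and a large one, and only the small one swings wildly relative to its size.

```python
# A large buffer absorbs the same fluctuations that whipsaw a small one.
import random

def relative_swing(initial_stock, steps=200, seed=42):
    random.seed(seed)                       # same noise sequence for both runs
    stock, low, high = initial_stock, initial_stock, initial_stock
    for _ in range(steps):
        inflow = random.uniform(5, 15)      # noisy supply
        outflow = 10                        # steady demand
        stock = max(stock + inflow - outflow, 0.0)
        low, high = min(low, stock), max(high, stock)
    return round((high - low) / initial_stock, 2)

print(relative_swing(initial_stock=20))    # small buffer: swings are a large multiple of its size
print(relative_swing(initial_stock=500))   # large buffer: the same noise is a small fraction of the stock
```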

10. The structure of stocks and flows: physical systems and how they intersect. The plumbing of the system - its stocks, flows, and their mutual arrangement - can have a huge impact on how the system behaves. Sometimes the only way to fix the behavior of a poorly designed system is to change its structure. However, it often happens that physical rebuilding is the slowest and most expensive way to make changes to a system.

9. Lags: the amount of delay relative to the rates of change of the system. Delays in feedback loops have a decisive influence on the behavior of the system. If there are long delays in the system, it is basically unable to respond to short-term changes. That is why all central planning systems - both in the Soviet Union and in the factories of General Motors in the USA - always and everywhere work poorly. The lag in the feedback is very important in relation to the rates of change of the stock that is controlled by this feedback. Too little delay leads to an excessive, too sharp reaction - a kind of chasing one's own tail. Because of such a sharp response, the oscillations only increase. If the delay is greater, then the oscillations will be either uniform, or damped, or explosive, depending on how large it is. Too much delay in a system where there is a threshold value (maximum allowable level, point of no return, after which the system can collapse), will lead to the system going beyond the limits and catastrophe. It's usually easier to slow down the rate of change, and then the inevitable feedback lags won't be such a problem. This is why growth rates on our list are higher than lag rates. For the same reason, in Jay Forrester's World model, a slowdown in economic growth has a greater effect than an acceleration in technological development or complete price freedom in the market.

8. Balancing feedback loops: the strength of the loops relative to the impacts they are trying to offset. One of the worst mistakes people sometimes make is to strip a system of its "emergency" response mechanisms on the grounds that they are rarely used and expensive. In the short term this may have no consequences, but in the long term it drastically narrows the range of conditions in which the system can survive. The strength of a balancing feedback loop matters relative to the magnitude of the impact it is meant to correct. If the impact grows stronger, the feedback must grow stronger too. A thermostat system may work well on a cold winter day, but open the windows and the heater will not be powerful enough to compensate.

7. Reinforcing feedback loops: the strength of the loops that drive growth. Reinforcing feedback loops are sources of growth, explosions, erosion, and collapse in systems. A system with an unchecked reinforcing loop will eventually destroy itself; that is why there are so few of them. Usually, sooner or later, a balancing loop takes over. Weakening a reinforcing loop - slowing growth - is often a more powerful lever in a system than trying to strengthen balancing loops. And in any case, such a restraint is far preferable to letting things take their course and allowing the reinforcing loop to keep spinning up.

6. Information flows: the structure that determines who has access to information and who does not. Remember the electricity meters in Holland? I really like that story as an example of strong leverage in the information structure of a system. Here no parameter is adjusted and no existing feedback is strengthened or weakened; instead, a new loop is created, delivering feedback where there was none before. A missing information flow is one of the most common causes of poor system performance. Adding or restoring information can be a powerful way to intervene, and it is often easier and cheaper than rebuilding physical infrastructure. When restoring feedback, it is very important that the information reach the right place and in an explicit form. There is a distinct tendency among a certain portion of humanity to avoid taking responsibility for their own decisions. That is why systems so often lack feedback loops, why these leverage points are often very popular with ordinary people and unpopular with the authorities, and why they are so effective if you can either force the government to use them or achieve the same thing bypassing the authorities.

5. Rules: incentives, punishments, coercion. The rules of a system define its goals, boundaries, and degrees of freedom. When Mikhail Gorbachev came to power in the USSR, he opened up information flows (announcing the policy of glasnost) and changed the economic rules (proclaiming perestroika). As a result, the country changed literally beyond recognition. To demonstrate the power of rules to my students, I usually ask them to come up with alternative rules for schools. When we try to imagine changed rules and what our behavior would be like under them, we begin to understand their meaning. Rules are very strong leverage points, and power over the rules is real power. If you need to get to the bottom of the causes of a system's failures, analyze its rules and find out who has power over them.

4. Self-organization: adding, changing, and evolving system structures. One of the most amazing properties of living systems and some social systems is the ability to change themselves radically by creating completely new structures and types of behavior. In biological systems this is called evolution; in the economy it may be called technological progress or social revolution. Systems specialists have a term for it: self-organization. Self-organization can mean changing any of the leverage points already listed: adding completely new physical structures (from wings and brains to computers), adding new balancing or reinforcing loops, adding new rules... The capacity for self-organization is one of the strongest manifestations of a system's flexibility and resilience. A system capable of evolving can survive almost any change, because it can change itself. Self-organization actually supplies evolution with raw building material - an extraordinarily diverse store of information from which possible options are selected - and at the same time serves as a means of experimentation for selecting and testing those options. If you understand the power of self-organization, you will surely understand why biologists value biodiversity even more than economists admire technology and progress. The extremely diverse stock of DNA, created and accumulated over billions of years, is the basis of evolutionary potential, just as scientific libraries, laboratories, and universities that train scientists serve as sources of technological potential. Unfortunately, people appreciate the evolutionary potential of cultures even less than they understand the value of every genetic variation in the world's gopher population. Perhaps the reason is that almost every culture believes in its own superiority over others. The dominance of a single culture stops learning and drastically reduces resilience. Any system - biological, economic, or social - will sooner or later disappear from the face of our rapidly changing planet if it allows itself to stiffen, stops evolving and experimenting, and neglects opportunities to develop something new. There is only one solution in this situation; it is obvious, but not popular. Encouraging variety, change, and experimentation is usually seen as condoning disorder and losing control. Let a thousand flowers bloom at once - who knows what will come of it! Who would allow that? So we worry more about security instead, push the lever in the wrong direction, and diligently destroy biological, cultural, social, and market diversity.

3. Goals: the purpose and function of systems. The very desire to destroy diversity in the hope of establishing control shows why the purpose of a system is a more significant leverage point than the capacity for self-organization. If the goal is to bring more and more of the world under one central planning system (the empire of Genghis Khan, the Church, the People's Republic of China, Wal-Mart, Disney World), then everything lower on our list - all physical stocks and flows, feedback loops, information flows, even self-organizing behavior - will work toward that goal.

2. The system of views and concepts: the worldview within which the system is built - its goals, structure, rules, delays and other parameters. The ideas and concepts shared by a whole society, the assumptions and norms written down nowhere, form a paradigm - a set of beliefs, characteristic of that society, about how the world works. It is on these commonly accepted ideas about the nature of reality that the goals of systems, their information and physical flows, feedbacks and stocks are based. How can a paradigm, an established system of views, be changed? Keep pointing to the anomalies and failures that the old way of thinking cannot explain. Keep speaking and acting, openly and confidently, from the position of the new paradigm. Put people who hold the new paradigm into positions of visibility and power. Do not waste time on those who put spokes in the wheels; work with those who are capable of change and with the open-minded - and there are many such people.

1. Expand the boundaries of the worldview. There is a lever even more powerful than changing views and beliefs: not to be a slave to any theory or idea at all, but to remain free and flexible. To realize that no theory can claim to be absolute truth, and that everything we know about the world is in fact only a small and extremely limited part of a vast and astonishing Universe that lies far beyond human comprehension. To feel in your gut that every idea has its limits, that this idea itself also has limits, and that grasping this is an incredibly exhilarating experience. I know that I know nothing. Buddhists call this state of not-knowing enlightenment.

The stronger the lever, the more the system will resist change - which is why society so often gets rid of those who try to move it forward.

Chapter 7

People who grew up in the industrialized world sometimes, on discovering systems thinking, go to extremes in their enthusiasm and make a big mistake: they believe that the key to predicting and controlling the future lies in systems analysis, in interlinking masses of parameters, in accounting for complex influences and in the use of powerful computers. The error stems from the industrial-world paradigm, which assumes that prediction and control are what matter... Self-organizing, nonlinear feedback systems are inherently unpredictable. They cannot be controlled. It is impossible to predict the future precisely and prepare for it completely.

For those accustomed to considering themselves rulers of the world, the uncertainty inherent in systems thinking is hard to accept. If you cannot understand, predict and take control, what is left to do? Systems thinking leads to another conclusion - obvious and simple, suggesting itself as soon as the illusion of control is abandoned. You can do a great deal, and this "doing" takes many forms. The future cannot be predicted, but it can be imagined and lovingly brought into being. Systems cannot be controlled, but they can be designed and redesigned. We cannot charge ahead into a perfectly predictable world without surprises, but we can expect surprises, learn from them and even profit from them. We cannot impose our will on a system. But we can listen to what the system itself tells us and find a way for its properties and our qualities together to bring something better into the world than our will alone could create.

Feel the rhythm of the system. Start by studying the system's behavior, because this focuses your attention on facts rather than theories. Otherwise you may fall victim to your own or other people's delusions and misconceptions - and it is amazing how many there can be. It is especially revealing to watch how the various elements of the system change - in concert or not. Direct observation lets you discard many superficial assumptions at once. Studying a system's behavior forces you into dynamic rather than static analysis. A chronology of changes in several variables clarifies not only which elements the system contains but also how they may be interconnected. Finally, studying a system's history breaks the bad habit of defining a problem not by the system's actual behavior but by the absence of our favorite remedy. Usually that "remedy" is to predict, take control, impose - and no one pays attention to what the system is actually doing, or asks why.

Bring your mental models out into the open. You do not have to put them on paper as block diagrams and equations, although that would help. It is enough to state their essence in words, in pictures, or with arrows showing what, in your opinion, is connected with what. The more you practice this, whatever form you choose, the clearer and more flexible your thinking will become, the faster you will correct mistakes, and the easier it will be to live with uncertainty. Flexibility of thinking is the willingness and ability to push boundaries, to notice that a system has changed its behavior, to change the structure of a system. In a world of flexible systems, flexible thinking is indispensable.

Respect, value and share information. I suspect that most problems in systems are caused precisely by distorted, delayed or missing information. You cannot imagine how much better a system can work if you give it more complete, accurate and timely information. Information is power, and anyone hungry for power absorbs this truth immediately.

Use the right language and enrich it with system concepts. Fred Kofman wrote: "We don't really talk about what we see, but we only see what we can talk about." A society that constantly talks about productivity, but barely understands (and even less uses) the concepts of flexibility and sustainability, will become productive, but not flexible and not sustainable.

Pay attention to everything that matters, not just what can be counted. It seems that everything that can be calculated is much more important for us than what cannot be calculated. This means that quantity is more important to us than quality. If you pretend that a phenomenon does not exist just because it is difficult to quantify it, then the models will be completely wrong. People are endowed not only with the gift of mathematical calculation, but also with the ability to evaluate quality. If something is bad, don't be silent. And don't let the excuse that "if you can't define and measure it, then it doesn't deserve your attention" stop you.

Use feedback strategies in feedback systems. President Jimmy Carter had the rare ability to think in terms of feedback and to design policies around it. At a time when US oil imports were very high, Carter proposed a fuel tax proportional to the share of imported oil in total fuel consumption. Clearly, a dynamic, adaptive feedback system cannot be governed by a static, inflexible policy. It is much easier and more effective (and often an order of magnitude cheaper) to design strategies that change with the state of the system.
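
As a rough illustration only (this is not from the book; the 10% and 25% rates and the name import_share are invented), here is a minimal Python sketch contrasting a static tax with a feedback tax whose rate tracks the state of the system:

```python
# Illustrative sketch: a static tax vs. a feedback tax tied to the state
# of the system (the share of imported oil). All numbers are invented.

def static_tax(fuel_price: float) -> float:
    """A fixed surcharge that ignores what the system is doing."""
    return 0.10 * fuel_price

def feedback_tax(fuel_price: float, import_share: float) -> float:
    """A surcharge whose rate rises and falls with the import share,
    so the policy adapts as the system itself changes."""
    return 0.25 * import_share * fuel_price

for import_share in (0.1, 0.4, 0.7):
    price = 1.0  # arbitrary unit price of fuel
    print(f"import share {import_share:.0%}: "
          f"static tax {static_tax(price):.2f}, "
          f"feedback tax {feedback_tax(price, import_share):.2f}")
```

The point is only that the second policy strengthens automatically as the problem grows and relaxes as it recedes, without anyone having to re-legislate the rate.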

Strive for the good of the system as a whole. Aim to improve the properties of the entire system - growth, stability, diversity, resilience, self-maintenance - whether or not those properties are easy to measure numerically.

Listen to the wisdom of the system itself. Help and encourage the forces and structures that let the system run itself; note that many of them sit at the lower levels of the hierarchy. Do not intervene in the system thoughtlessly, and do not destroy its internal self-maintenance mechanisms.

Determine who is responsible for what in the system. Pay special attention to the events that act as triggers in the system, and to the external influences that determine which of its possible types of behavior the system chooses. Sometimes people get so caught up in assigning blame or trying to control external factors that they miss a much simpler solution: increasing accountability inside the system itself. Internal accountability means that, through feedback, the system delivers the consequences of decisions immediately and directly, in a form that decision makers cannot ignore. Internal accountability might mean, for example, requiring every town and industry that discharges polluted effluent to place the outlet pipe upstream of its own water intake. It might mean that neither insurance companies nor the state budget pay the medical expenses caused by smoking, or the cost of treating a motorcyclist who rode without a helmet or a driver who did not wear a seat belt after an accident.

Don't stop there. Keep learning! Systems thinking has taught me to trust intuition more and apparent rationality less. It has also shown that, however much we study, we must still be constantly prepared for the unexpected. If you do not know something, do not deceive yourself and do not stop there - keep studying. Making use of mistakes is a necessary condition for learning.

Long live complexity! Admittedly, the universe is messy and disorderly. It is dynamic and nonlinear. Only one part of our nature, and a recently emerged one at that, builds concrete boxes for houses and favors perfectly straight lines and flat surfaces. The other, more ancient part intuitively knows that nature prefers fractals, in which every scale, from the microscopic to the macroscopic, reveals a detailed and fascinating picture of the world. We can admire complexity and encourage self-organization, creative disorder, heterogeneity and diversity.

Expand your time horizons. The wider the chosen time horizon, the greater the chance of survival. Generally speaking, from a strictly systemic point of view, there is no division into short-term and long-term perspectives. Phenomena with different time scales are simply nested into each other.

Don't limit yourself to your profession. Follow the system wherever it leads, no matter what your specialty is, what your textbooks say, or what you think you are good at. The system will inevitably cross the boundaries between fields of knowledge.

Don't be indifferent. We need to expand the boundaries of what we care about. Don't pass by.

Strive for the best. The most destructive of the system archetypes is the drift to low performance: the bar keeps being lowered and idealism is ridiculed. We know what to do when a system starts drifting toward the worst: do not let bad news weigh more heavily than good news, and hold the bar high. Systems thinking can only advise us to do this - it will not do it for us.


    Published with permission from Chelsea Green Publishing

    Scientific editor Alvina Savkina




    © 2008 by Sustainability Institute. All rights reserved

    Translation © 2018 by Mann, Ivanov and Ferber All rights reserved.

    © Translation into Russian, edition in Russian, design. LLC "Mann, Ivanov and Ferber", 2018

    * * *

    For Dana (1941–2001) and for all those who would learn from her

    If a factory is torn down but the rationality that produced it is left standing, that rationality will simply produce another factory. If a revolution destroys a systematic government, but the systematic patterns of thought that produced that government are left intact, those patterns will repeat themselves in the succeeding government. There is so much talk about the system. And so little understanding.

    Robert Pirsig, Zen and the Art of Motorcycle Maintenance

    Author's Preface

    At the heart of this book lies the distilled wisdom, common sense, deep scientific knowledge and experience of many people who have devoted more than thirty years to teaching and to modeling systems. Many of them worked in the System Dynamics Research Group at the Massachusetts Institute of Technology. First among them is Jay Forrester, who created the group. My teachers (and the students who became my teachers) were Ed Roberts, Jack Pugh, Dennis Meadows, Hartmut Bossel, Barry Richmond, Peter Senge, John Sterman and Peter Allen. The book contains thoughts, examples, quotations and information from the works of many members of this intellectual community. I express my admiration and gratitude to all of them.

    I have also learned a great deal from eminent scientists and thinkers who, as far as I know, never ran computer simulations but were natural, intuitive systems thinkers: Gregory Bateson, Kenneth Boulding, Herman Daly, Albert Einstein, Garrett Hardin, Vaclav Havel, Lewis Mumford, Gunnar Myrdal, E.F. Schumacher, a number of modern corporate executives, and many unnamed ancient sages, from Native Americans to the Sufis of the Middle East. A strange company, isn't it? But systems thinking transcends disciplines, cultures and historical eras.

    Systems analysts use broad interdisciplinary concepts, but since all researchers are people with their own particular leanings, different schools of thought and directions have emerged. This book uses the terms and notation of system dynamics, the field closest to me, and presents the main principles of systems theory rather than the latest scientific discoveries.

    I use analysis only when it helps to solve real problems, and do not resort to describing abstract theories. But if someday similar theories can be applied to the same purpose, someone might write a new paper.

    I must warn you that this book, however, like all others, cannot be impartial and exhaustive. Much more is known in the field of systems thinking than is presented here, but my main goal is to keep you interested. And I would also like you to understand the complex systems that we all deal with all the time, even if your introduction to systems and learning begins and ends with this book.

    Donella Meadows, 1993

    Foreword by the editor of the English edition

    In 1993, Donella (Dana) Meadows completed the first draft of the book you now hold in your hands. The manuscript was not published at the time, but was distributed unofficially for many years. Dana died unexpectedly in 2001 without finishing the book. Over the years that have passed since then, it has become clear that her work can still be useful and interesting to a wide range of readers. Dana was a scientist, writer and one of the best guides to the world of systems modeling.

    In 1972 the bestseller The Limits to Growth1 was published. Translated into many languages, it had Dana as one of its principal authors.
    Meadows D. H., Meadows D. L., Randers J., Behrens W. The Limits to Growth. Moscow: Moscow State University Press, 1991.

    Its authors warned of the damage that suboptimal scenarios of human development could do to the whole world if they were not halted in time. They showed that constant growth of population and consumption disrupts the ecological and social systems that support human life on Earth, and explained why the pursuit of unlimited economic growth will eventually destroy many local, regional and global systems. The authors' warning has been recognized as justified, and their forecast as one of the most accurate. The conclusions of this book and its sequels land on the front pages of newspapers whenever oil prices rise, the climate shifts dramatically, or we face any of the other problems that 6.6 billion people create for themselves.

    Dana helped people understand why the principles and methods we use to study the world and its systems need serious revision, and why we must start acting differently. Today many recognize that systems thinking is a vital tool for tackling the environmental, political, social and economic problems that constantly confront society. Systems, large or small, can behave in very similar ways, and understanding that behavior is our only hope of changing them, at many levels, for the long term. Dana wrote this book hoping to carry her message to a wider audience, which is why my colleagues at the Sustainability Institute and I decided it was time to publish her manuscript posthumously.

    Can this book really help our world and every reader? I think so. Perhaps you are an employee of a company (or its owner) looking for a way to improve the world through your business or your organization. Or you are a politician who cannot get good ideas and good intentions put into practice. Perhaps you make decisions on important issues in a company and constantly run into problems. If you push for change in systems such as society or the family, and in the moral values they uphold, you know that just a couple of rash actions can undo years of steady improvement. You may be saddened by how hard it is to change our society for the better.

    If your situation even partly resembles any of these, I think this book will help you. Although other works on systems modeling and systems thinking exist, many people need an accessible and inspiring book about systems and about us: about why we sometimes find them so confusing, and how to learn to manage and change them better.

    Shortly before writing the first version of this book, Dana completed work on the twenty-year sequel to The Limits to Growth, published under the title Beyond the Limits.2
    Meadows D. H., Meadows D. L., Randers J. Beyond Growth. Moscow: Progress; Pangea, 1994.

    Dana was part of a research group on nature conservation and the environment, served on the research committee of the National Geographic Society, and taught system dynamics and ethics and lectured on the environment at Dartmouth College. She always immersed herself in the events of the day and viewed them as the result of the behavior of often quite complex systems.

    Dana's original manuscript has been revised and restructured several times, but many of the examples in this book come from the first draft of 1993. They may seem a little dated, but while editing I decided to keep them because they are still relevant and instructive. The early 1990s were the time of the collapse of the Soviet Union and of serious changes in the other socialist countries. The North American Free Trade Agreement was signed. The Iraqi army invaded Kuwait and then retreated, setting oil fields on fire along the way. Nelson Mandela was released from prison and apartheid was abolished in South Africa. Trade union leader Lech Walesa was elected president of Poland, and the writer Vaclav Havel president of Czechoslovakia. The Intergovernmental Panel on Climate Change published its first report, stating that "emissions from human activity significantly increase the concentration of greenhouse gases in the atmosphere, which intensifies the greenhouse effect, that is, raises the global temperature at the Earth's surface." The United Nations Conference on Environment and Development was held in Rio de Janeiro.

    During one of her trips to the conference, Dana was reading an issue of the newspaper International Herald Tribune. In the materials published within one week, she found many examples of systems that need either better management or a complete reorganization. She read about them in a regular newspaper, because such systems are everywhere. As you begin to treat daily events as part of a general trend, which in turn reveals the internal structure of the system, you will see new ways to manage your life. It is my hope that this edition of Donella's book will give readers the ability to understand systems, reason about them, and change them for the better.

    I would like this short and accessible story to become a useful tool for you in a world that is in such need of change. This is a simple book for a complex world. It is for those who want to shape a better future on their own.

    Diana Wright, 2008

    Introduction. Looking Through the Lens of Systems Theory

    Managers do not face problems that are independent of each other, but constantly evolving situations that consist of complex sets of changing problems interacting with one another. I call such situations messes... Managers do not solve problems, they manage messes.

    Russell Ackoff 3
    Russell Ackoff, The Future of Operational Research Is Past, Journal of the Operational Research Society 30, no. 2 (February 1979): 93–104.

    Management Specialist


    For one of my first classes on systems I usually bring a Slinky. If you have forgotten what that is, let me remind you: a Slinky is a long, loose spring that can oscillate, compressing and stretching up and down, "flow" from one hand to the other, or "walk" down a flight of stairs.

    I place the Slinky on my open palm. With the fingers of the other hand I grasp the top coil of the spring, and then I pull the lower hand away. The spring drops and stretches, then contracts, stretches again and contracts again, several times over.

    "Why is the Slinky acting like this?" I ask students.

    “Because of the hand. You removed your hand,” they say.

    I take the box the toy came in and put the spring inside. Then I place the box on my palm in the same way, holding it from above with the fingers of the other hand. And with the most dramatic gesture I can manage, I pull the lower hand away.

    Of course, nothing happens, the box just freezes.

    “Now again: why did the toy swing up and down?”

    Obviously, the answer must be sought in the toy itself. This behavior is inherent in the Slinky; the hands merely suppress it or allow it to show itself.

    This is important for understanding systems theory.

    If we understand how a system's structure and behavior are related, we can work out how it operates, why its behavior produces particular results, and how to use it more effectively. As the world keeps changing rapidly and becoming more complex, systems thinking will teach us to see the full range of possibilities, to manage them and to use them. Only this approach allows us to identify the root causes of problems and to find new solutions to them.

    So what is a system? A system is a set of elements (they can be anything: people, cells, molecules) interconnected in such a way that, over time, their interactions determine the system's behavior. A system can be subjected to shocks, constraints, triggers and other external forces, but its response - which in the real world is never simple - is characteristic of the system itself.

    For the Slinky it is easy to see what causes its behavior and how that behavior shows itself. When it comes to people, companies, cities or countries, the idea that a system largely causes its own behavior sounds like nonsense! An external event may trigger a certain behavior in a system, but the same event, acting on a different system, is likely to produce a different result.
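
    A minimal sketch of that last point - entirely my own illustration, not anything from the book, with an arbitrary target of 100, a shock of -40 and an arbitrary adjustment fraction - shows the same external shock hitting two tiny "systems", one with a balancing feedback loop and one without:

```python
# Two tiny "systems" receive the identical external shock at step 5.
# The first contains a balancing feedback loop that pulls its stock back
# toward a target; the second has no feedback. Same event, different result.

TARGET = 100.0
ADJUST = 0.3     # fraction of the gap to the target closed each step
SHOCK = -40.0    # identical external disturbance

def simulate(with_feedback: bool, steps: int = 15) -> list[float]:
    stock = TARGET
    history = []
    for t in range(steps):
        if t == 5:
            stock += SHOCK                      # the external event
        if with_feedback:
            stock += ADJUST * (TARGET - stock)  # balancing feedback loop
        history.append(round(stock, 1))
    return history

print("with feedback:   ", simulate(True))
print("without feedback:", simulate(False))
```

    The first stock climbs back toward its target after the shock, while the second simply stays displaced: the response is a property of the structure, not of the event.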

    Think about what follows from this:


    Political leaders do not single-handedly cause recessions or economic booms. Ups and downs are built into the very structure of a market economy.

    Companies are not losing market share because of competitors. Naturally, competitors will take advantage, but the losing side's losses are due, at least in part, to its own business policies.

    Oil exporting countries are not solely responsible for rising oil prices. Their actions alone would not have been able to provoke such a dramatic change in prices and chaos in the economy, if the policy of oil consumption, pricing and investment of oil-importing countries did not lead to the creation of an economy so sensitive to supply delays.

    You are not attacked by the influenza virus, you yourself create favorable conditions for it in the body.

    Drug addiction is not the weakness of a single individual, and no individual, however loving, can cure an addict - not even the addict himself. Addiction can be dealt with only by recognizing it as the consequence of a larger set of influences and social problems.


    Such statements will confuse some people and strike others as plain common sense. I believe that these two reactions - rejection of systems principles or acceptance of them - come from two different kinds of human experience, and both are familiar to everyone.

    On the one hand, we have all been taught to analyze, to use rational methods, to trace the direct link between cause and effect, to absorb an unfamiliar field of knowledge in small, digestible portions, to solve problems by acting on the world around us. And it is precisely this approach that leads us to blame presidents, rival companies, OPEC, the flu and drugs for our problems.

    On the other hand, we were all dealing with complex systems long before we learned to think rationally. After all, we ourselves are complex systems: our own bodies are superb examples of complex, interconnected, self-sustaining systems. Every person, every organization, every animal, tree, garden or forest is a complex system. We grasp this intuitively, without analysis, without putting it into words. In essence, it is a practical understanding of how systems work and how to interact with them.

    Because modern systems theory is all about the use of computers and computation, we often simply don't notice that everyone understands its basic concepts at some level. Many postulates of systems theory can almost always be translated into the language of "folk wisdom".


    Because of feedback delays in complex systems, by the time a problem becomes apparent it is often much harder to solve.

    - A spoon is dear when lunchtime is near (help that comes too late is no help).


    According to the principle of competitive exclusion, if the reward for winning a competition becomes the means of winning the next one - a reinforcing feedback loop - then over time almost all competitors will be driven out (a minimal sketch follows after this list).

    – For whoever has, to him it will be given, and whoever does not have, even what he has will be taken away from him (Gospel of Mark, 4:25).

    Money clings to money.


    Systems with a great diversity of elements, many possible development paths and built-in redundancy are more stable and less vulnerable to external shocks than homogeneous systems with little diversity.

    Don't put all your eggs in one basket.
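
    The reinforcing loop behind the competitive-exclusion principle mentioned above can be sketched in a few lines of Python. This is my own toy illustration, not a model from the book; the starting strengths and the reinvestment rate are invented.

```python
# "Success to the successful": two competitors earn shares of a prize in
# proportion to their current strength, and each round's winnings multiply
# next round's strength. The reinforcing loop lets a tiny initial advantage
# compound until one player dominates the field.

a, b = 1.05, 1.00  # nearly equal starting strength (arbitrary units)
RATE = 1.0         # how strongly this round's share boosts next round's strength

for round_no in range(1, 31):
    share_a = a / (a + b)          # share of the prize won this round
    a *= 1 + RATE * share_a        # reinvested winnings compound the advantage
    b *= 1 + RATE * (1 - share_a)
    if round_no % 10 == 0:
        print(f"round {round_no:2d}: A holds {share_a:.1%} of the market")
```

    An initial edge of only five percent compounds round after round until the stronger competitor holds essentially the whole market.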


    Since the Industrial Revolution, Western society has been oriented toward science, logic and reductionism, while neglecting intuition and the holistic worldview. Psychologically, it is more comfortable to locate the problem outside ourselves: to blame something or someone and so shed responsibility. In that case, eliminating the problem only requires finding the right control lever, a technical fix, a suitable pill, and so on.

    Difficult tasks - preventing smallpox, increasing food production, moving heavy loads and large numbers of people faster and over longer distances - were indeed usually solved by acting on external causes. But since all of these are parts of larger systems, some of the solutions only created additional problems. And the problems rooted in the internal structure of complex systems have proved practically insoluble by such means.

    Hunger, poverty, pollution, economic instability, unemployment, chronic disease, drug addiction, war. People have tried, without success, to eradicate all of these with analytical and technical advances. Nobody creates these problems deliberately, nobody wants them to persist, yet they persist - because they are systemic problems. The cause of a system's undesirable behavior often lies within the system itself. The problems can be solved only when we use our intuition, stop blaming everyone around us, begin to see the system as the source of our problems, and find the courage and wisdom to restructure it.

    This is obvious, but unfamiliar. It is comforting that the solutions are in our own hands, even though the need to act - or at least to look at things and make sense of them differently from the way we are used to - can be unsettling.

    This book is about learning to see the difference between what you observe and how you make sense of it. It is intended for those who are skeptical of the notion of a "system" and of systems analysis (even though we all use it in everyday life). I have left out a great deal of technical detail because I want to show that systems can be understood without resorting to mathematical formulas or the help of a computer.

    I often use diagrams and graphs in this book, because it is difficult to talk about systems in just words. Words and sentences should follow each other in a linear logical sequence. Events in systems develop non-linearly, not in one direction, but in many at once. To properly describe and study them, we need a language that has the same properties as the phenomena we study.

    Graphs and diagrams are more informative than words, because all parts of the image can be seen at the same time. I'll start with the very simple ones and gradually build up the complexity. I am sure that you will easily understand this visual language.

    First, you will get acquainted with the basics: what a system is and what it consists of. We will look at the elements separately, from a reductionist rather than a holistic position, and then bring them together again to show the basis of self-regulation and development in systems - the feedback loop.

    After that, you will find yourself in the “systems zoo”, where a collection of common and interesting types of systems is presented. You will see how some of them behave and get to know their habitat. They are everywhere and even within you.

    Using several examples, I will show how systems can work beautifully and at the same time surprise and confound us. You will learn why the coordinated and rational actions of individual elements of a system - or even of most of them - can produce a result that is nothing like what was expected; why results can appear much earlier or much later than planned; why something that always worked before suddenly, to your great disappointment, no longer works; and why the behavior of the system changes in unpredictable ways.

    Discussing these "whys" will let us examine questions that systems thinkers face again and again - problems arising in corporations, government structures, economics, ecosystems, physiology and psychology. We will look at the distribution of water resources among communities and of financial resources among educational institutions and conclude that these are special cases of the tragedy of the commons.4
    The tragedy of the commons is a phenomenon arising from the conflict between personal interest and the common good. For example, farmers in a community share a common pasture. If a few herders increase their livestock, the fertility of the field hardly changes; but if everyone does so, the pasture becomes depleted and all members of the community suffer losses. If everyone reduced their herds, the fertility of the field would recover, but each farmer's personal gain from cutting back would be less than the income he gives up. So it remains in the individual farmer's interest to keep enlarging the herd. Ed. note.

    We will explore the business rules and incentives that help or hinder the development of new technologies, and consider why decisions of the authorities and traditional relationships in the family, community or country meet resistance. We will see that addiction to caffeine, alcohol, nicotine and drugs has many more causes than appear at first glance.
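
    The pasture example from the footnote above can be turned into a few lines of simulation. This is my own toy sketch, not a model from the book; the number of farmers, the grazing rate and the regrowth numbers are all invented.

```python
# Toy "tragedy of the commons": each farmer keeps adding animals because the
# personal gain from one more animal exceeds that farmer's share of the shared
# damage, yet the collective result is an overgrazed, collapsing pasture.

FARMERS = 10
REGROWTH = 0.25           # the pasture regrows a fraction of its grass per season
CAPACITY = 1000.0         # maximum grass the pasture can hold
GRAZING_PER_ANIMAL = 2.0  # grass eaten by one animal per season

pasture = CAPACITY
herd_per_farmer = 10

for season in range(1, 21):
    herd = FARMERS * herd_per_farmer
    pasture = max(pasture - herd * GRAZING_PER_ANIMAL, 0.0)   # grazing drains the stock
    pasture += REGROWTH * pasture * (1 - pasture / CAPACITY)  # logistic regrowth
    # Each farmer's private incentive: one more animal is pure personal gain,
    # while the loss of grass is shared among everyone.
    herd_per_farmer += 1
    if season % 5 == 0:
        print(f"season {season:2d}: herd {herd:4d}, grass left {pasture:7.1f}")
```

    Each extra animal is individually rational, yet after only a few seasons the shared grass stock collapses and every herd goes hungry.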

    Systems specialists call these recurring structures, which show themselves in characteristic behavior, archetypes. In the first drafts of the book I called them "system traps," and later added "...and opportunities," because even the archetypes responsible for seemingly insoluble and potentially dangerous problems can be transformed to produce the desired result. It is enough to understand a little about how systems function.

    After that, I will move on to actions for restructuring the systems we live in. We will look for leverage points - places in a system where intervention can change its behavior.

    The final part of the book contains a series of general conclusions about systems drawn by many system modelers I know, together with materials that will let you dive deeper into the topic of systems thinking.