The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is. —G. K. Chesterton, 20th-century writer
It’s one thing to understand how to fix a system and quite another to wade in and fix it.
Social systems are the external manifestations of cultural thinking patterns and of profound human needs, emotions, strengths, and weaknesses. Changing them is not as simple as saying “now all change,” or trusting that he who knows the good shall do the good.
Our systems insights helped us understand many things we hadn’t understood before, but they didn’t help us understand everything.
A systems insight . . . can raise more questions!
The tool of systems thinking, born out of engineering and mathematics, implemented in computers, drawn from a mechanistic mind-set and a quest for prediction and control, leads its practitioners, inexorably I believe, to confront the most deeply human mysteries.
Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way.
The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable.
The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best.
For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize.
Why do people actively sort and screen information the way they do? How do they determine what to let in and what to let bounce off, what to reckon with and what to ignore or disparage? How is it that, exposed to the same information, different people absorb different messages, and draw different conclusions?
And what of values? Are they universal, or culturally determined? What causes a person or a society to give up on attaining “real values” and to settle for cheap substitutes? How can you key a feedback loop to qualities you can’t measure, rather than to quantities you can?
Systems thinking says that there is plenty to do, of a different sort of “doing.” The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
We can’t control systems or figure them out. But we can dance with them!
Get the Beat of the System
Before you disturb the system in any way, watch how it behaves.
Learn its history. Ask people who’ve been around a long time to tell you what has happened.
Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others.
It’s especially interesting to watch how the various elements in the system do or do not vary together.
Starting with the behavior of the system directs one’s thoughts to dynamic, not static, analysis—not only to “What’s wrong?” but also to “How did we get there?” “What other behavior modes are possible?” “If we don’t change direction, where are we going to end up?” And looking to the strengths of the system, one can ask “What’s working well here?”
Starting with the history of several variables plotted together begins to suggest not only what elements are in the system, but how they might be interconnected.
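The co-variation worth watching for can be checked numerically as well as by eye. A minimal sketch, with invented data (in practice these would be measured histories of real system variables): a correlation coefficient summarizes whether two plotted histories move together or in opposition.

```python
# Hypothetical illustration: do two system variables vary together?
# The data below are invented for the sketch, not taken from any real system.
def correlation(xs, ys):
    """Pearson correlation of two equal-length series, in [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

population = [10, 12, 15, 19, 24, 30]   # grows, ever faster
resource   = [90, 88, 84, 78, 70, 60]   # falls, ever faster

# A strongly negative value suggests the two are coupled:
# a hint about structure, drawn from behavior alone.
print(correlation(population, resource))
```

A number near −1 or +1 does not prove a causal link, but it tells you which pairs of elements deserve a place in your diagram of interconnections.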
Starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution.
Expose Your Mental Models to the Light of Day
When we draw structural diagrams and then write equations, we are forced to make our assumptions visible and to express them with rigor.
Our models have to be complete, and they have to add up, and they have to be consistent.
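Even the simplest stock-and-flow model, once written as equations or code, forces every assumption into the open: what the stock is, which flows change it, and what each flow depends on. The sketch below is a hypothetical illustration with invented rates, not a model from the text.

```python
# A minimal stock-and-flow model: a population stock with two flows.
# Every assumption is explicit and must add up: the stock changes
# only through the flows we have written down.
def simulate(stock, birth_rate, death_rate, steps):
    history = [stock]
    for _ in range(steps):
        births = birth_rate * stock   # inflow, proportional to the stock
        deaths = death_rate * stock   # outflow, proportional to the stock
        stock = stock + births - deaths
        history.append(stock)
    return history

# With births exceeding deaths, the model must show growth;
# if it doesn't, some assumption fails to be consistent.
history = simulate(stock=100.0, birth_rate=0.03, death_rate=0.01, steps=10)
print(history[-1])
```

The point is not the arithmetic but the discipline: a vague mental model can hide a contradiction indefinitely, while a written one cannot.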
Mental flexibility—the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure—is a necessity when you live in a world of flexible systems.
Everything you know, and everything everyone knows, is only a model.
Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them to be plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption that may become entangled with your own identity.
Getting models out into the light of day, making them as rigorous as possible, testing them against the evidence, and being willing to scuttle them if they are no longer supported is nothing more than practicing the scientific method.
Honor, Respect, and Distribute Information
Decision makers can’t respond to information they don’t have, can’t respond accurately to information that is inaccurate, and can’t respond in a timely way to information that is late. I would guess that most of what goes wrong in systems goes wrong because of biased, late, or missing information.
You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.
Information is power. Anyone interested in power grasps that idea very quickly.
Use Language with Care and Enrich It with Systems Concepts
Honoring information means, above all, avoiding language pollution—making the cleanest use of language we can. Second, it means expanding our language so we can talk about complexity.
Language can serve as a medium through which we create new understandings and new realities as we begin to talk about them. In fact, we don’t talk about what we see; we see only what we can talk about.
Our perspectives on the world depend on the interaction of our nervous system and our language—both act as filters through which we perceive our world.
The language and information systems of an organization are not an objective means of describing an outside reality—they fundamentally structure the perceptions and actions of its members.
To reshape the measurement and communication systems of a [society] is to reshape all potential interactions at the most fundamental level.
A society that talks incessantly about “productivity” but that hardly understands, much less uses, the word “resilience” is going to become productive and not resilient. A society that doesn’t understand or use the term “carrying capacity” will exceed its carrying capacity.
The first step in respecting language is keeping it as concrete, meaningful, and truthful as possible—part of the job of keeping information streams clear. The second step is to enlarge language to make it consistent with our enlarged understanding of systems.
The industrial society is just beginning to have and use words for systems, because it is only beginning to pay attention to and use complexity. Carrying capacity, structure, diversity, and even system are old words that are coming to have richer and more precise meanings.
Pay Attention to What Is Important, Not Just What Is Quantifiable
If quantity forms the goals of our feedback loops, if quantity is the center of our attention and language and institutions, if we motivate ourselves, rate ourselves, and reward ourselves on our ability to produce quantity, then quantity will be the result.
Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models.
Human beings have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector.
Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy. No one can define or measure justice, democracy, security, freedom, truth, or love.
Make Feedback Policies for Feedback Systems
It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops—loops that alter, correct, and expand loops. These are policies that design learning into the management process.
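A feedback policy can be sketched as a rule that reads the state of the system before acting; a meta-feedback loop is a rule that adjusts that rule in turn. The thermostat-style controller below is an invented illustration of the idea, with made-up numbers, not an implementation from the text.

```python
# Feedback policy: correct toward a target in proportion to the error.
# Meta-feedback loop: if the error keeps failing to shrink fast enough,
# strengthen the correction -- the policy learns about its own adequacy.
def run(state, target, gain, steps):
    for _ in range(steps):
        error = target - state
        state += gain * error              # feedback: act on the state
        if abs(target - state) > 0.5 * abs(error):
            gain = min(1.0, gain * 1.5)    # meta-feedback: adjust the loop itself
    return state, gain

# A policy that starts too weak (gain=0.1) tunes itself up and still
# reaches the target, without anyone predicting the right gain in advance.
state, gain = run(state=0.0, target=10.0, gain=0.1, steps=20)
```

The fixed-gain version of this policy would be a conventional rule; the line that modifies `gain` is the “loop that alters loops,” designing learning into the management process.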
Go for the Good of the Whole
Remember that hierarchies exist to serve the bottom layers, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole.
Don’t, as Kenneth Boulding once said, go to great trouble to optimize something that never should be done at all.
Aim to enhance total systems properties, such as growth, stability, diversity, resilience, and sustainability—whether they are easily measured or not.
Listen to the Wisdom of the System
Aid and encourage the forces and structures that help the system run itself.
Don’t be an unthinking intervenor and destroy the system’s own self-maintenance capacities.
Before you charge in to make things better, pay attention to the value of what’s already there.
Locate Responsibility in the System
This is a guideline for both analysis and design.
In analysis, it means looking for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another.
Sometimes those outside events can be controlled but sometimes they can’t. Sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system.
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers. Because the pilot of a plane rides in the front of the plane, that pilot is intrinsically responsible. He or she will experience directly the consequences of his or her decisions.
Designing a system for intrinsic responsibility could mean, for example, requiring all towns or companies that emit wastewater into a stream to place their intake pipes downstream from their outflow pipe.
A great deal of responsibility was lost when rulers who declared war were no longer expected to lead the troops into battle. Warfare became even more irresponsible when it became possible to push a button and cause tremendous damage at such a distance that the person pushing the button never even sees the damage.
Notice how little our current culture looks for responsibility within the system that generates an action, and how poorly we design systems to experience the consequences of their actions.
Stay Humble—Stay a Learner
Working with systems, on the computer, in nature, among people, in organizations, constantly reminds me of how incomplete my mental models are, how complex the world is, and how much I don’t know.
The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn.
In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives.
What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.
It means making mistakes and, worse, admitting them. It means what psychologist Don Michael calls “error-embracing.” It takes a lot of courage to embrace your errors.
Error-embracing is the condition for learning. It means seeking and using—and sharing—information about what went wrong with what you expected or hoped would go right. Both error-embracing and living with high levels of uncertainty emphasize our personal as well as societal vulnerability. Typically we hide our vulnerabilities from ourselves as well as from others. But . . . to be the kind of person who truly accepts his responsibility . . . requires knowledge of and access to self far beyond that possessed by most people in this society.
The universe is messy. It is nonlinear, turbulent, and dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity and uniformity. That’s what makes the world interesting, that’s what makes it beautiful, and that’s what makes it work.
We can, and some of us do, celebrate and encourage self-organization, disorder, variety, and diversity. Some of us even make a moral code of doing so, as Aldo Leopold did with his land ethic: “A thing is right when it tends to preserve the integrity, stability, and beauty of the biotic community. It is wrong when it tends otherwise.”
Expand Time Horizons
One of the worst ideas humanity ever had was the interest rate, which led to the further ideas of payback periods and discount rates, all of which provide a rational, quantitative excuse for ignoring the long term. The official time horizon of industrial society doesn’t extend beyond what will happen after the next election or beyond the payback period of current investments.
The time horizon of most families still extends farther than that—through the lifetimes of children or grandchildren.
The longer the operant time horizon, the better the chances for survival.
Phenomena at different time-scales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.
The couplings between very fast processes and very slow ones are sometimes strong, sometimes weak. When the slow ones dominate, nothing seems to be happening; when the fast ones take over, things happen with breathtaking speed. Systems are always coupling and uncoupling the large and the small, the fast and the slow.
When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool just to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term—the whole system.
Defy the Disciplines
Seeing systems whole requires more than being “interdisciplinary,” if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other.
Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct. They will have to go into learning mode. They will have to admit ignorance and be willing to be taught, by each other and by the system.
Expand the Boundary of Caring
Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all, it means expanding the horizons of caring.
The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem.
It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
Don’t Erode the Goal of Goodness
The most damaging example of the systems archetype called “drift to low performance” is the process by which modern industrial culture has eroded the goal of morality.
Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. This is just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are “not news.” They are exceptions. Must have been a saint. Can’t expect everyone to behave like that. And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love.
We know what to do about drift to low performance. Don’t weigh the bad news more heavily than the good. And keep standards absolute.
Systems thinking can only tell us to do that. It can’t do it. We’re back to the gap between understanding and implementation. Systems thinking by itself cannot bridge that gap, but it can lead us to the edge of what analysis can do and then point beyond—to what can and must be done by the human spirit.