QUOTE 13 from:
Lucas, C. (2001a) 'Complexity Theory: Actions for a Better World' Available from: http://www.calresco.org/action.htm#pro; accessed on 28/2/05
So What Is Complexity Theory Anyway?
This scientific theory relates to what happens when rich interactions within a system (a dynamic collection of interconnected parts) allow it to self-organize, i.e. to 'do its own thing'. When this occurs, complex structures are predicted to form which will continually evolve over time into new structures. Generally these processes are expected to generate hierarchies of novel, system-maintained properties that cannot be predicted from studying the parts alone. For this theory to apply, we usually expect the system to contain modular parts with autonomy, and to be critically connected, i.e. neither static (with disconnected parts) nor chaotic (with over-connected parts) but arriving at a state in between (called the 'edge of chaos'). The evolution can occur through random mutation, through internal learning, or through selection by trial-and-error interaction with the environment (often all three). The system is then expected to generate many semi-stable states called attractors; the current system state is defined by which attractor is active, and future states by the possible trajectories between available attractors. A considerable body of research backs up this new science (we overview this in our Introductions, Online, Related and Offline papers, Links, Tutorial and Applications pages). There are several thousand academic researchers worldwide, from all disciplines, informally pooling the systemic knowledge upon which the developing suggestions on this page are based. A Glossary is available for those unfamiliar with complex system terms.
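The attractor idea can be made concrete with a standard toy model that is not drawn from this page: the logistic map, whose long-run behaviour settles onto different attractors (a fixed point, a cycle, or chaos) as its control parameter r varies. A minimal Python sketch:

```python
# Logistic map x -> r*x*(1-x): a standard toy model of attractor behaviour.
def logistic_attractor(r, x0=0.4, burn_in=1000, sample=8):
    """Iterate past the transients, then report the states the system revisits."""
    x = x0
    for _ in range(burn_in):          # let the transient behaviour die away
        x = r * x * (1 - x)
    seen = []
    for _ in range(sample):           # now sample the attractor itself
        x = r * x * (1 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

# r=2.5: a single fixed-point attractor; r=3.2: a period-2 cycle;
# r=4.0: chaotic - the sampled states never settle or repeat.
print(logistic_attractor(2.5))   # one value
print(logistic_attractor(3.2))   # two alternating values
print(logistic_attractor(4.0))   # many distinct values
```

The 'edge of chaos' idea corresponds to the narrow parameter ranges between the orderly (fixed-point) and fully chaotic regimes of such models.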
How Does Complexity Theory Suggest We Act?
To better approach this question let us start by outlining our understanding of the 11 basic laws of systems thinking proposed by Peter Senge in his book The Fifth Discipline <http://www.ejmings.com/sengeweb1.html> (1990), followed by some complexity science additions of our own:
1. Today's problems come from yesterday's solutions
When we implement a simple one-dimensional solution (taking the 'obvious' answer), we usually create related problems in the other dimensions that are being ignored. All complex systems have multiple dimensions or attributes which interrelate and cannot be treated in isolation. The problem resurfaces in a different guise because we don't understand the real systemic cause and have only tried to 'cure' one resultant effect: a 'symptom'-based approach to systemic 'medicine'.
2. The harder you push, the harder the system pushes back
This relates to feedback paths, a necessary feature of all complex systems, which exhibit a multi-path circular causality rather than the linear, separated cause/effect of conventional thinking. Thus, if we act, the effect will come back upon us: the stronger our action, the more we disturb all the system variables and the more unpredicted effects there are to return to us! We can't act as if we are isolated outsiders, since we ourselves are part of the system being changed.
3. Behaviour grows better before it grows worse
If we only measure the one dimension that we change, then we miss the growing problems caused by our inadequate 'solution'. We need to monitor all dimensions of the problem in order to detect interdependencies of which we are currently unaware. Failure to do this is the 'ostrich syndrome': burying your head in the sand may seem to make things better, but only for a while!
4. The cure can be worse than the disease
Believing in our 'cure' we can mistakenly continue escalating the 'dose' if it seems not to be working, when we are by that act escalating the problem instead. This is a form of positive feedback and results from holding dogmatic beliefs in untested assumptions (based upon simpler systems) which actually do not apply in these complex systemic cases.
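Law 4's escalation trap can be sketched with a hypothetical toy model (the dose and side-effect numbers are assumptions for illustration, not taken from this page): the visible benefit is linear in the 'dose', but an ignored side effect grows with its square, so beyond a threshold each stronger 'cure' feeds the problem instead.

```python
# Hedged sketch of law 4: a 'cure' whose benefit is linear in the dose
# but whose ignored side effect grows with the square of the dose.
def apply_dose(problem, dose, k=0.1):
    benefit = dose                 # the effect we believe in
    side_effect = k * dose ** 2    # the positive feedback we ignore
    return problem - benefit + side_effect

p_small = p_large = 100.0
for _ in range(5):
    p_small = apply_dose(p_small, dose=4.0)    # net change -2.4 per step
    p_large = apply_dose(p_large, dose=15.0)   # net change +7.5 per step

print(p_small)   # the problem shrinks (toward 88)
print(p_large)   # the escalated 'cure' grows the problem (toward 137.5)
```

Escalating the dose past the break-even point (dose = 1/k here) is exactly the positive feedback the law warns about.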
5. The easy way out usually leads back in
Failing to understand the system interconnections will allow unexpected side-effects to appear. Eliminating an obvious apparent 'problem' can also eliminate necessary features of the system which depended upon some aspect of the parts removed, thus causing the system to become even worse!
6. Faster is slower
Trying to force the issue will not permit the system to settle to an attractor in its normal way; chaos will develop as unexpected parts of the system break down under the strain. Only a certain rate of change can be absorbed by humans, and we can only monitor a restricted number of variables effectively. Instant solutions just create more instant problems...
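'Faster is slower' has a simple numerical analogue (our illustration, not the page's): push a variable toward its target at a modest rate and it settles onto the target; force the pace beyond what the system can absorb and every step overshoots further, so the 'faster' approach never arrives at all.

```python
# Hedged sketch of law 6: correcting toward a target at a given rate.
def settle(rate, target=10.0, x=0.0, steps=30):
    for _ in range(steps):
        x += rate * (target - x)   # push toward the target at the given rate
    return x

print(settle(rate=0.5))   # settles essentially at the target (the attractor)
print(settle(rate=2.1))   # overcorrection: the error grows on every step
```

Each step multiplies the remaining error by (1 - rate), so any rate above 2 makes the error grow instead of shrink: forcing the pace is what prevents arrival.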
7. Cause and effect are not closely related in time and space
Systemic changes percolate in space throughout the system. This takes time, sometimes considerable time. Impatience prevents the consequences of the change being understood - or even detected once the 'perpetrator' moves on. The circular causality is not one-to-one, but involves multi-step chains of often gradually acting communications before the system arrives at its revised steady state.
8. Small changes can produce big results - but the areas of highest leverage are often the least obvious
The butterfly effect ensures that well targeted changes can escalate, so massive interactions are unnecessary. We just need to determine which variable(s) to change and leave time for the effects to occur. This requires knowing how the system is interconnected, and often proves counter-intuitive. As System Dynamics researcher Jay Forrester says:
"The intuitively obvious 'solutions' to social problems are apt to fall into one of several traps set by the character of complex systems. First, an attempt to relieve one set of symptoms may only create a new mode of system behavior that also has unpleasant consequences. Second, the attempt to produce short-term improvement often sets the stage for a long-term degradation. Third, the local goals of a part of a system often conflict with the objectives of the larger system. Fourth, people are often led to intervene at points in a system where little leverage exists and where effort and money have but slight effect."
World Dynamics <http://dieoff.org/page23.htm> (1971)
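The butterfly effect invoked above can be illustrated with the usual textbook demonstration (an assumed toy model, not this page's own example): in a chaotic map, two trajectories starting a billionth apart diverge until the gap is of order one - a well-targeted small change really can escalate.

```python
# Sensitive dependence on initial conditions in the chaotic logistic map.
def diverge(x0, delta=1e-9, steps=60):
    """Track the largest gap between two trajectories started delta apart."""
    a, b = x0, x0 + delta
    max_gap = 0.0
    for _ in range(steps):
        a = 4.0 * a * (1.0 - a)   # logistic map in its chaotic regime
        b = 4.0 * b * (1.0 - b)
        max_gap = max(max_gap, abs(a - b))
    return max_gap

# A 'butterfly flap' of 0.000000001 grows to an order-one separation.
print(diverge(0.3))
```

The flip side, as Forrester notes, is that most intervention points have no such leverage: sensitivity is concentrated in particular variables, which is why finding them requires knowing the interconnections.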
9. You can have your cake and eat it too - but not at once
Most good solutions are not of an either/or dualist sort; instead there are many solutions that can employ the 'both' (cooperative) or 'neither' (diversity) options which also exist in state space. Creative thinking, considering all aspects of the system (them as well as us), can help identify these wider-ranging solutions, which often solve several problems at the same time.
10. Dividing an elephant in half does not produce two small elephants
Artificial divisions discard important interconnections; in complex systems the whole cannot be reduced to independent simplified pieces - only in simple aggregates. Attempting to do so destroys the integrity of the system, leading to massive losses of those synergistic properties (value-add) that the system's associations have previously generated.
11. There is no blame
There is no single 'cause' to blame, since all participants coevolve and are each both cause and effect. Blaming others blames yourself also. Constructive action, not vengeance, is required - there is only ONE system, the whole, which must be treated in its entirety if any 'solution' is to be a viable long-term improvement.
Now our own additions:
12. Participants act autonomously
Humans have their own goals. Trying to force conformity by assuming all parts are identical destroys the value-add inherent in the 'system' concept. This 'control' stagnates the system and makes it ineffective and unresponsive to problems. Self-interest will always work to undermine such badly conceived systems.
13. As ye sow, so shall ye reap
Actions can have negative or positive effects on different system attributes. Each effect will grow with feedback, taking the whole system down the resultant paths. If the sum of these is negative then the system will disintegrate over time (die); improvement can only result when feedback incorporates overall positive actions (e.g. love, not hate).
14. Two rights make a wrong
The nonlinear nature of complex interactions under feedback loops means that we cannot linearly add effects. At some stage each 'good' becomes a 'bad' as we exceed tolerance boundaries and start to degrade the fitness of the whole (e.g. overeating). Compromise solutions are essential, rather than maximising any variable in isolation - which proves sub-optimal in complex systems.
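The tolerance boundary of law 14 can be sketched with an assumed single-peaked fitness curve (our illustration, not the page's): an input helps up to the boundary and degrades the whole beyond it, so two individually beneficial 'rights', added linearly, land past the peak and combine into a 'wrong'.

```python
# Hedged sketch of law 14: fitness peaks at the tolerance boundary and
# degrades beyond it, so effects cannot be added linearly.
def fitness(x, tolerance=10.0):
    return x * (2 * tolerance - x)   # peaks at x == tolerance, falls beyond

alone = fitness(8.0)             # one 'right': clearly beneficial
combined = fitness(8.0 + 8.0)    # two 'rights' added linearly: past the peak
print(alone, combined)           # the combination is worse than either alone
```

Maximising x in isolation is exactly the trap: the optimum lies at the compromise value, not at 'more of a good thing'.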
15. One plus one equals five
The mathematics of zero-sum (1+1=2) does not apply in complex systems. Here we combine parts explicitly to gain advantages due to cooperative efficiency (division of labour). It is failure to recognize these existing emergent properties that invalidates reductionist 'part-based' solutions, which always ignore these inherent advantages since normal mathematics is inadequate to incorporate them!
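The '1 + 1 = 5' arithmetic can be mimicked with a toy model in which a synergy term rewards cooperative pairings; the term and its weight are assumptions of the sketch, not measured quantities:

```python
# Hedged sketch of law 15: output grows superlinearly with cooperating parts.
def output(workers, synergy=3.0):
    solo = workers * 1.0                    # each part's isolated output
    links = workers * (workers - 1) / 2     # cooperative pairings among parts
    return solo + synergy * links           # whole = parts + emergent value-add

print(output(1))   # 1.0: one part alone
print(output(2))   # 5.0: 'one plus one equals five'
```

A 'part-based' accounting sums only the solo terms and so always undercounts the whole - which is the reductionist error the law describes.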
16. The system controls us
The boundaries that constrain our actions are self-generated by our collective beliefs. We create the systems by which we live. We set the rules by which any social norms arise. If they fail to achieve the desired results, then we can change them by simple belief alterations, by how we use words. The material world is irrelevant to this process, which is based upon abstract teleological considerations.
17. Change is always expected
Any complex system experiences fluctuations, both large and small. These are only problematical if our thinking is static, expecting a 'fixed world'. Dynamic systems require dynamic thinking, which accepts diversity and freedom as essential elements of a system that can absorb change without damage - and even benefit from it.
18. All levels drive the result
Considering the whole means not only including all the objectives of a particular level (e.g. company) but all the nested levels that create and sustain that system. These include material, environmental, primal, social and abstract needs. Each has many variables that can be affected by bad decisions, and as these levels are interdependent they are all necessary for the ongoing support of the whole and of the many interacting sub-systems that make it up.
19. It's a can of worms
Because many autonomous variables exist, attempts to centrally control systems by manipulating a few selected variables are doomed to failure. The only way to prevent some 'worms' escaping is suppression by force - closing and sealing the 'can'. In other words, simple imposed answers to complex changes are not viable survival strategies. We must decentralise control instead.