
Self-Organising Complex-Adaptive Systems

Large Group Metaphor Process (at the Findhorn Community)

13-16 January 2003

James Lawley

This paper was written in preparation for the facilitation by the Clean Team* of a Large Group Metaphor Process at The Findhorn Community during 13-16 January, 2003. Details of the process follow.

* James Lawley, Penny Tompkins, Steve Callaghan, Wendy Sullivan, Phil Swallow, Caitlin Walker and Marian Way.

The aim of the paper is to raise awareness of:
  • the key features of self-organising systems
  • how these might manifest in a community setting
  • how this knowledge can inform the way we facilitate.

We recommend you simply read the following several times and absorb the background intelligence to be found, as David Grove would say, between the lines. Finally, remember that we are pointing to patterns of group interactions, NOT patterns in the behaviour of individuals.


General concepts

We want to approach the Findhorn project by thinking in terms of emergence, a fundamental feature of self-organising systems:

"They are bottom-up systems, not top-down. They get their smarts from below. In a more technical language, they are complex adaptive systems that display emergent behaviour. In these systems agents residing on one scale start producing behaviour that lies one scale above them: ants create colonies; urbanites create neighbourhoods; simple pattern-recognition software learns how to recommend new books. The movement from low-level rules to higher-level sophistication is what we call emergence." (Johnson, p. 18)
Many systems are complex. They have multiple agents dynamically interacting in multiple ways, following local rules and oblivious to any higher-level instructions. But they cannot be considered emergent until those local interactions result in some kind of discernible macro behaviour. Murray Gell-Mann has a good phrase for it: surface complexity arising out of deep simplicity.
"Turbulent flow in a liquid is a complex system, but it can't be called adaptive. There's information in the system, no question. But it doesn't produce a schema, a compression of information with which it can predict the environment. In biological evolution, experience of the past is compressed in the genetic message encoded in DNA. In the case of human societies, the schemata are institutions, customs, traditions, and myths. They are,in effect, kinds of cultural DNA. Complex adaptive systems are pattern seekers. They interact with the environment, 'learn' from the experience, and adapt as a result." (Gell-Mann, quoted in Johnson, p. 15)
The fundamental model of emergence and self-organisation can be represented as:



This process creates a recurring pattern and shape: a network of self-organisation, of disparate agents that unwittingly create a higher-level order. This is an ongoing dynamic process in that the system rarely settles on a single, frozen shape; rather it forms patterns in time as well as space.

When we see repeated structure emerging out of apparent chaos, our first impulse is to build a centralised model to explain that behaviour. It's as if we can't help looking for non-existent 'decision-makers', 'controllers' or 'pacemakers' [cells that order the behaviour of other cells]. To not make sense of the world this way requires us to consciously attend to a different set of patterns.

'Success' [from the viewpoint of the self-organising system] is the ability to survive repeated selective pressures. The new worldview implicit in this theory is that solutions to a problem are an emergent property of the problem. In other words, obtaining solutions is inherently complex. Evolution, especially in its focussed, purposeful form of 'trial and learning', is the primary way these systems change.

Most complex systems exhibit what mathematicians call attractors, states to which the system eventually settles, depending on the properties of the system. Attractors resist perturbation: mutations don't propagate far. But when they do change, their options are limited to nearby attractors. Attractors are not static. They will change dynamically as the environment changes. The effect of the environment is important. It may diminish the stability of some attractors, and improve the stability of others.
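A standard textbook illustration of attractors (not from this paper) is the logistic map: the same simple update rule settles to a fixed point, a repeating cycle, or chaos, depending purely on one parameter of the system. A minimal sketch:

```python
# Logistic map x -> r*x*(1-x): a minimal illustration of attractors.
# Depending on the parameter r ("the properties of the system"), the
# identical rule settles to a fixed point, a periodic cycle, or chaos.

def trajectory(r, x=0.2, warmup=1000, keep=4):
    """Iterate the map past its transient, then sample the attractor."""
    for _ in range(warmup):
        x = r * x * (1 - x)
    samples = []
    for _ in range(keep):
        x = r * x * (1 - x)
        samples.append(round(x, 4))
    return samples

print(trajectory(2.8))   # fixed point: all samples identical
print(trajectory(3.2))   # periodic: alternates between two values
print(trajectory(3.9))   # chaotic: no repeating pattern
```

Notice that 'resisting perturbation' is visible here too: small changes to the starting value x make no difference to where the first two trajectories end up.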

All systems exhibit one of four classes of behaviour (Wolfram):

Class 1. Fixed Point
Almost all initial conditions lead to exactly the same uniform final state.
Class 2. Periodic
There are many different possible final states, but all of them consist of a set of simple structures that remain the same or repeat.
Class 3. Chaotic
Seems random, although small scale structures are always seen at some level.
Class 4. Edge of Chaos
A mixture of order and randomness, with localised structures which on their own are fairly simple but which move around and interact with one another in complicated ways.
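The four classes can be seen in miniature in one-dimensional cellular automata, Wolfram's own testbed. A sketch, using the standard example rules (254 for class 1, 4 for class 2, 30 for class 3, 110 for class 4):

```python
# Elementary cellular automata: each rule number encodes the next state
# for the 8 possible three-cell neighbourhoods. Starting from a random
# row of cells, different rules exhibit Wolfram's four classes.

import random

def step(cells, rule):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(rule, width=64, steps=32, seed=1):
    random.seed(seed)
    cells = [random.randint(0, 1) for _ in range(width)]
    for _ in range(steps):
        cells = step(cells, rule)
    return cells

print("rule 254 (class 1):", run(254))  # uniform: every cell ends up 1
print("rule   4 (class 2):", run(4))    # only isolated cells survive, then nothing changes
print("rule  30 (class 3):", run(30))   # looks random at every step
print("rule 110 (class 4):", run(110))  # localised structures that interact
```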

Class 4 is intermediate between chaotic and periodic behaviour. It’s the most interesting because you get the manipulation of complex information there. It is where the system moves from one region to another, crossing a no-man’s-land where chaos and stability pull in opposite directions. The switch from ordered to chaotic regimes in dynamical systems is analogous to phase transitions in physical systems, the switch from one state to another. Cell membranes, for example, are poised between a solid and a liquid state.

But without doubt the crowning achievement of a complex adaptive system with class 4 behaviour is not only that it moves towards the edge of chaos, but that it also hones the efficiency of its rules as it goes. By tuning their interactions, species effectively are honing their ability to evolve. In the midst of all this activity, they improve their evolvability. As if by an invisible hand they achieve maximum average fitness dynamically. The collective good is ensured over time. Sometimes this is referred to as the ‘arrow of evolution’.

Goals and purposes play a different role depending on the class of behaviour exhibited by the system:

Class 1. Goals are irrelevant as the system always ends at the same fixed point.

Class 2. Goals are useful and desirable because the system is predictable.

Class 3. No goals are possible because the system is completely unpredictable.

Class 4. Rather than fixed goals, it is better to have dynamic reference points which adjust to the ever-changing circumstances of living on the edge of chaos.

The co-evolutionary model builds in connections between species in the ecosystem. The system gets tuned as it moves itself toward the edge of chaos. Connectedness is required if the ecosystem is to work as a whole, not just as independent entities. And it is required if perturbations are to cascade through the system, producing avalanches of new speciation and extinctions.

"Co-evolving communities act in concert as a result of the dynamics of the system; they do so as a result of individuals within the community myopically optimizing their own ends and not as collective agreement towards a common goal; and the communities really do come to know their world in a way that was quite unpredictable before the science of Complexity began to illuminate that world." (Lewin p.188)

"The old view of the world of nature was that it hovered around simple equilibria. The science of Complexity says that’s not true. Biological systems are dynamical, not easily predicted, and are creative in many ways. The science of Complexity makes you view the world as creative." (Lewin p.190)

“Given the usual richness of our complex world, you might well expect that altering the rules would have some kind of effects on the way this game works, but in fact there seems to be just one. Everything that physicists have discovered indicates that no matter how you bend the rules, there is always a sharp tipping point. [This determines] how far an ‘infection’ tends to spread, if just above the tipping point, and how quickly it tends to die out, if just below that point. It is as if the details of how influences spread have no ultimate effect on how things work. Consequently, even though we know very little, perhaps even next to nothing at all about the psychology and sociology of ideas, mathematical physics guarantees that there is a tipping point. All the details that we do not know are irrelevant to this question.” (Buchanan p. 168)
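Buchanan's tipping point can be sketched with a toy branching-process model of an 'infection' (not from Buchanan; all parameters here are invented for illustration): each affected person passes an idea to a fixed number of contacts with some probability p, and the critical point sits at p = 1 divided by the number of contacts.

```python
# Toy model of a tipping point: each 'infected' individual exposes
# `contacts` others, each of whom catches the idea with probability p.
# With 4 contacts, the tipping point is p = 1/4: just below it
# outbreaks fizzle; just above it a fraction of them cascade.

import random

def outbreak_size(p, contacts=4, cap=5000, rng=None):
    """Simulate one outbreak; stop counting once it reaches `cap`."""
    rng = rng or random.Random()
    active, total = 1, 1
    while active and total < cap:
        new = sum(1 for _ in range(active * contacts) if rng.random() < p)
        total += new
        active = new
    return total

def mean_size(p, trials=500, seed=42):
    rng = random.Random(seed)
    return sum(outbreak_size(p, rng=rng) for _ in range(trials)) / trials

print(mean_size(0.15))  # below 1/4: small outbreaks that die out
print(mean_size(0.35))  # above 1/4: many runs explode to the cap
```

As the quotation suggests, the details of the spreading mechanism barely matter: change the number of contacts and the threshold moves, but the sharp transition itself remains.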

Stephen Jay Gould says the central principle of all history is contingency.

"The historical explanation does not rest on direct deductions from the laws of nature, but on an unpredictable sequence of antecedent states, where any major change in any step of the sequence would have altered the final result. This final result is therefore dependent, or contingent, upon everything that came before — the unreasonable and determining signature of history." (quoted in Buchanan, p. 91)

If the system is complex, adaptive and self-organising, 'contingency' does not imply ‘cause and effect’. It is more of a ‘condition and influence’ process.

How to notice emergence as a pattern (i.e. to model a self-organising system)

Do not think of a system as a purely representational entity, the way you think about a book or a movie. It is partly representational: The medium is still part of the message — it's just that there is another level that our critical vocabularies are only now finding metaphors for. The difference is that the medium, the message, and the audience exist alongside a set of rules or logic that govern the way the messages flow through the system. To understand how these systems work, you have to analyse the message, the medium, and the rules. It's an algorithmic problem, then, and not a representational one. However, you can’t observe emergent behaviour just by looking at the formal instructions and rules. You have to make it live before you can understand how it works. It's the difference between playing a game of Monopoly and hanging a Monopoly board on your wall.
  • To see emergence as a pattern you need to encounter it in several contexts. The more examples the better: studying a few ants will never lead to an understanding of the global behaviour of the colony.

  • The fundamental law of emergence is that the behaviour of individual agents is less important than the behaviour of the system as a whole. In fact, low-level ignorance is useful.

  • Think in terms of processes of learning/adapting. This makes possible the persistence of the whole over time — the global behaviour outlasts any of its component parts. Lose a few ants and it doesn't make much difference.
  • Notice patterns in the signs: Ants respond to the frequency of ant encounters and the gradient of pheromone trails, not to messages from individual ants (ants do not communicate individual-to-individual).

  • Notice how the system responds to random encounters: Individual ants will stumble across a new resource which increases the adaptiveness of the whole. This reduces the possibility of getting stuck on a 'false peak'. Randomness is evolutionarily adaptive behaviour. While both negative and positive feedback pathways encourage exploitation of particular sources, randomness encourages exploration of multiple sources.

  • Recognise system development is a decentralised process. Learning emerges without anyone needing to be aware of it. Information management and pattern preservation are the latent purposes of a community, because the members are driven by other overt motives, such as self-development, safety, commerce.

  • Look for network effects: Components pay most attention to their neighbours; they think locally and act locally, but their collective action produces global behaviour. Local rules lead to global structure. Swarm logic leads to global wisdom, though not in a way you would necessarily predict from the local rules.

  • Think in terms of a dynamic equilibrium between ‘negative feedback pathways’ (which keep the system stable despite unpredictable and changing external conditions) and ‘positive feedback pathways’ (which propel the system to discover new ways of behaving and lead to new emergent properties). There are downsides to both: ‘negative feedback’ risks ossification and missing future opportunities; ‘positive feedback’ risks chaos and losing past achievements. Ultimately, an inappropriate balance between the two risks a loss of coherence or identity. [We can expect to see escalating interactions until a threshold is approached which triggers a dampening of the behaviour, e.g. arms race, danger of war, arms limitation.]

  • Seek 'unaverage' clues involving very small quantities, which reveal the way larger and more 'average' quantities are operating. At each scale, the laws of emergence hold true. It is the fractal nature of self-organising systems that means a large scale, long-term pattern can be replicated in the detail of a momentary interaction. The part stands for the whole, metonymically.

  • “We may wish for easier, all-purpose analyses, and for simpler, magical, all-purpose cures, but wishing cannot change some problems into simpler matters than organized complexity, no matter how much we try to evade the realities and to handle them as something different.” (Johnson, p. 51)

  • Work inductively (bottom-up), reasoning from particulars to the general rather than the reverse.
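The balance described above between feedback-driven exploitation and random exploration can be sketched as an 'epsilon-greedy' forager, a standard toy model (the sources and their yields below are invented for illustration):

```python
# Exploitation vs exploration: a forager keeps returning to the best
# source found so far (a feedback pathway), but with probability eps
# it samples a random source instead. Pure exploitation can get stuck
# on a 'false peak'; a little randomness finds the richer source.

import random

def forage(yields, eps, steps=1000, seed=0):
    rng = random.Random(seed)
    totals = [0.0] * len(yields)   # payoff gathered per source
    visits = [1] * len(yields)     # visit counts (start at 1 to seed the averages)
    gathered = 0.0
    for _ in range(steps):
        if rng.random() < eps:
            i = rng.randrange(len(yields))               # explore at random
        else:
            i = max(range(len(yields)),
                    key=lambda j: totals[j] / visits[j]) # exploit best-so-far
        payoff = yields[i] + rng.gauss(0, 0.1)           # noisy payoff
        totals[i] += payoff
        visits[i] += 1
        gathered += payoff
    return gathered

yields = [0.2, 0.5, 0.9]           # the third source is the richest
print(forage(yields, eps=0.0))     # locks on to whichever source looks good first
print(forage(yields, eps=0.1))     # random encounters discover the richest source
```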

Paradox of modelling emergent properties

De Landa (A Thousand Years of Non-Linear History, 1997) says that analysing a whole into parts and then attempting to model it by adding up the components will fail to capture any property that emerged from complex interactions. However, it is not possible to analyse a whole directly (Bateson even wondered if anyone had ever perceived a whole!). And Wilber says, differentiation is necessary before integration. So there is no option but to observe parts of the system (lower level of organisation) interacting/relating in order to create a sense of the whole pattern (higher level of organisation).

We believe that bottom-up, inductive thinking is a prerequisite for conscious modelling. Yet modelling involves creating a representation of the way a system works that is congruent/isomorphic with that system. If the exemplar (in this case the Findhorn group) thinks top-down, then our model of their thinking will also need to be top-down. But (a) the interactions of individuals using top-down thinking will still generate emergent group properties; and (b) our tools for creating our model need not be top-down. In both cases we can learn to distinguish the product from its production process, and the particular production process from processes of production in general.

Making use of emergence


The perceptual world of any individual is limited. There are no bird's eye views, no ways to perceive the overall system but there are ways to increase knowledge of how the system works, to deepen understanding of the natural developmental processes, and to build awareness of how each individual plays a role in maintaining the system. We contribute to emergent intelligence, but it is difficult for us to perceive that contribution, because our lives unfold on the wrong scale. Metaphor can compress those scales so that we can perceive some of our contribution to the whole.

The likelihood of a feedback pathway correlates directly to the general interconnectedness of the system. Most of the time, making an emergent system more adaptive entails tinkering with the process of feedback: Different messages, different medium (means of travelling along the pathways), or different pathways.

A simple analogy:

Medium = The Internet (the wires and computers)

Messages = The WWW pages (information stored on the Internet)

Pathways = Temporary circuits across the Internet that carry the WWW pages (including via the people using the system).

To begin to understand these mechanisms, you need additional feedback pathways which operate at a higher level.

Negative feedback is a way of indirectly pushing a fluid, changeable system toward a goal, or steady-state. It is, in other words, a way of transforming a complex system into a complex adaptive system. At its most schematic, negative feedback entails comparing the current state of a system to the desired state, and pushing the system in a direction that minimises the difference between the two states.
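This schematic can be written down directly as a thermostat-style proportional controller (the numbers are purely illustrative):

```python
# Negative feedback at its most schematic: measure the gap between the
# current state and the desired state, and push the system so as to
# shrink that gap, while the environment continually pulls it away.

def regulate(current, target, gain=0.3, steps=30, drift=-0.5):
    """Each step the environment pulls the state away (drift);
    the controller pushes back in proportion to the error."""
    history = [current]
    for _ in range(steps):
        error = target - current          # compare: desired state vs actual state
        current += gain * error + drift   # push to minimise the difference
        history.append(round(current, 2))
    return history

print(regulate(current=15.0, target=20.0))
# The state climbs toward the target and settles just below it: a steady
# offset is the classic signature of purely proportional control.
```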

A feedback pathway exists only as long as (the part of the system designated) the originator continually adapts to the messages received from (the part of the system designated) the recipient or the environment. That is, feedback pathways are always "circular chains of causation" (Bateson) or ‘circuits of contingency’.

Understanding emergence is about: giving up control; being more tolerant of that exploratory phase where the rules don't make sense and where few goals have been clearly defined; letting the system govern itself as much as possible; and letting it learn from its own footprints.

You can conquer gridlock by making the grid itself smart. The way you do that is with:
  • Autonomous agents able to make independent decisions within a framework of relatively simple rules;
  • Moderately dense network and web connections among the agents (that is, the organisation's parts);
  • Vigorous experimentation by agents, disciplined by responding to feedback based on actual outcomes.
You might think that strong social links would be the crucial ones holding a network together. But when it comes to interconnectedness they aren’t; in fact, they are hardly important at all. The crucial links are the bridges formed from "weak links" that act as ties that sew the network together. These are the shortcuts that, if eliminated, would cause the network to fall to pieces. What’s more, we find here an explanation not only for why the world is small, but also for why we are continually surprised by it. After all, the weak long-distance shortcuts that make the world small are mostly invisible in our ordinary social lives.
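The effect of a few weak long-distance bridges can be sketched with a toy small-world network, a simplified version of the Watts-Strogatz model (all numbers here are illustrative): a ring of people who each know only their nearest neighbours, plus a handful of random bridges.

```python
# A ring of n people who each know their 4 nearest neighbours, plus a
# few random long-range 'bridges'. Breadth-first search measures how
# many introductions separate people on average.

import random
from collections import deque

def avg_path_length(n=200, shortcuts=0, seed=3):
    rng = random.Random(seed)
    nbrs = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n}
            for i in range(n)}
    for _ in range(shortcuts):                 # add random bridges
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            nbrs[a].add(b)
            nbrs[b].add(a)
    total = count = 0
    for src in range(0, n, 10):                # sample sources for speed
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in nbrs[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        count += n - 1
    return total / count

print(avg_path_length(shortcuts=0))    # pure ring: paths are long (about 25 steps)
print(avg_path_length(shortcuts=20))   # a few bridges: the average drops sharply
```

Removing a strong local tie barely changes these averages; removing the bridges restores the long paths, which is the sense in which the weak links hold the network together.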

There are drawbacks to too much clustering. To live within a cluster is to be protected from differing norms, and also from truly novel ways of thinking, patterns of behaviour, or pieces of information. The absence of weak ties to outside the cluster can have damaging consequences.

Social capital is the ability of a team to work as a team on its own, willingly, without participation being managed by legally binding rules and regulations; the need for such rules is already a signal of inefficiency.

A linear increase in awareness can produce a non-linear change in a system (when it crosses a tipping point), a change that will be difficult to predict in advance.

D’Arcy Thompson noted that each form of organisation has an upper limit of size, beyond which it will not function.

Complexity scientists have identified a few simple rules by which complex adaptive systems operate. These rules are presented here with a translation of what they mean in a business context:
  • The source of emergence is the interaction among agents who mutually affect each other.
  • Small changes can lead to large effects.
  • Emergence is certain but there is no certainty as to what it will be.
  • Greater diversity of agents in a system leads to a richer emergent pattern — up to a point.
Therefore managers who want to think in terms of self-organisation need to:
  • Attend to relationships characterised by mutuality among people, teams and companies in order for novelty to emerge.
  • Seek to lead change through many small experiments, which search the landscape of possibilities.
  • Create conditions for constructive emergence rather than to plan a strategic goal in detail. Evolve solutions, don’t design them.
  • Seek diversity of people, cultures, expertise, ages, personalities, gender so that when they interact in teams, creativity has the potential of being enhanced.
And as facilitators we want to design activities where:
  • Diversity is valued. A minimum number of agents is needed with a variety of motives and behaviour.
  • Ignorance is useful. Low level simplicity ensures longevity of the system in the event of a component failure.
  • Random encounters are encouraged. They are required for stumbling across new resources or adapting to new conditions.
  • Patterns in the signs are noticeable.
  • Attention is paid to the interaction between neighbours. Local information can lead to global wisdom.
  • Weak long-distance bridges are cultivated. They tie the network together and allow innovations to propagate.

And remember ...

Language plays tricks on us. When people talk about the dynamics of complex systems, they use the language of purpose, of goal-seeking behaviour, e.g. 'a co-evolving system gets itself to the edge of chaos which is a favoured place to be'. The language of purposefulness is hard to avoid.

Emergence is multifaceted, and if you try to be too precise, you will lose what you are after. You can’t draw an easy border around it.

The Findhorn Foundation is a manifestation of the patterns of community members’ behaviour and decision-making that have been etched into the structure of the buildings and the organisation of the community. These patterns have fed back to the residents themselves, altering their subsequent decisions. A community is a kind of pattern-amplifying mechanism: its organisation is a way of measuring and expressing the repeated behaviour of larger collectives, capturing information about group behaviour and sharing that information with the group. Findhorn didn't have a mass of regulations and planners deliberately creating the organisation. All it needed was 40 years, hundreds of individuals, and a few simple rules of interaction.

In conclusion, our aim is to create conditions whereby the persistent patterns of group interaction are amplified until they become apparent to group consciousness. Awareness of these patterns (through the use of metaphor and space) will feed back to the community. Then small shifts in behaviour can escalate into larger changes of organisation.

References
Gregory Bateson, Steps to an Ecology of Mind (1972)

Gregory Bateson, Mind and Nature, (1988)

Gregory Bateson & Mary Bateson, Angels Fear (1988)

Mark Buchanan, Nexus (2002)

Fritjof Capra, The Web of Life (1996)

Jack Cohen & Ian Stewart, The Collapse of Chaos (1995)

Manuel De Landa, A Thousand Years of Non-Linear History (1997)

David Deutsch, The Fabric of Reality (1997)

Murray Gell-Mann, The Quark and the Jaguar (1994)

James Gleick, Chaos: Making a New Science (1987)

Malcolm Gladwell, The Tipping Point (2000)

Brian Goodwin, How the Leopard Changed Its Spots (1997)

Steven Johnson, Emergence (2001)

Roger Lewin, Complexity (2001)

Lynn Margulis, The Symbiotic Planet (1999)

Humberto Maturana & Francisco Varela, The Tree of Knowledge (1992)

Peter Senge, The Fifth Discipline (1990)

Ken Wilber, Sex, Ecology & Spirituality (1995)

Stephen Wolfram, A New Kind of Science (2002)

James Lawley

James Lawley is a UKCP registered psychotherapist, coach in business, certified NLP trainer and professional modeller. He is a co-developer of Symbolic Modelling and co-author (with Penny Tompkins) of Metaphors in Mind: Transformation through Symbolic Modelling.

 