Editor's Section




Chaos

Chaos. Yes. Hmmm. Well, you might have expected to find a definition of the term when we begin to discuss it, but we are not going to start that way. As we will see in the future article on language, definitions are Aristotelian and Cartesian and passé. We will in that article suggest supplanting definitions with attractors of application modeled after a Hénon attractor. The last two sentences will not make much sense at this point to readers without any chaos background! But be patient, because all will be revealed. Let's just say that the word 'chaos' is perhaps unfortunate and makes the subject seem more difficult to understand than it should be. We'll move into it this way:

Most of us who learned some geometry in high school learned to make a graph by plotting points in two dimensions with two co-ordinates. We might even have tried to use three co-ordinates but found that more difficult unless we happened to be good at drawing. So let's think of just two co-ordinates, belonging to the X and Y axes. This allowed us to plot the results of equations like x + y = 1. And when we drew the straight line we might have been proud of our result - and probably a little relieved! This is an example of a linear system.

For those of us who continued in geometry we would subsequently have played with a variety of equations and drawn circles, parabolas, hyperbolas and so on. We would be dealing with x² or even y³. We could do this without even knowing what, if anything, was being represented by the graph. Note that systems with anything like the x² in them are non-linear. (If you didn't take geometry then we will tell you that x² is read as 'x squared' and y³ is read as 'y cubed'. x² is short for 'x times x', and y³ is short for 'y times y times y', or y multiplied by itself three times. So, if x is 2 then x² equals 4; and if y is 3 then y³ equals 27.) That's all the mathematical jargon we are going to use here. Other websites will satisfy your ravenous mathematical hunger!

Mathematicians and scientists have known for a long time how to represent the state of a system, not just the position of a thing over time. The state of a system can be quite simple: velocity, for example, which combines speed and direction, plotted against time. This is a way of putting more information into a point on a graph. It can show us patterns that help us understand systems.

Each system has its own set of variables. To represent the state of a system you select some variables (sometimes intuitively, sometimes by trial and error) and you describe the system in terms of how these variables interact with each other. Clearly it is advantageous to keep it simple! The space defined by the selected variables is the phase space, and the pattern that emerges when the state of the system is graphed in that space is an attractor.
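
To make these two terms concrete, here is a minimal sketch in Python (our choice of language; the article uses none), assuming a damped pendulum as the example system. Its state is described by two variables, angle and angular velocity; whatever angle it starts from, its state works its way in toward the same resting point, the simplest kind of attractor.

```python
# A minimal sketch, assuming a damped pendulum as the example system.
# Its two state variables, angle and angular velocity, span the phase space.
# Every starting state ends up near the same resting point (angle 0,
# velocity 0), which is the simplest kind of attractor.
import math

def step(angle, velocity, dt=0.01, damping=0.3, gravity=9.8, length=1.0):
    """Advance the pendulum's state by one small time step (Euler method)."""
    acceleration = -damping * velocity - (gravity / length) * math.sin(angle)
    return angle + velocity * dt, velocity + acceleration * dt

for start_angle in (0.5, 1.5, 2.5):            # three different starting states
    angle, velocity = start_angle, 0.0
    for _ in range(5000):                       # run the state forward in time
        angle, velocity = step(angle, velocity)
    print(f"start {start_angle}: angle={angle:.4f}, velocity={velocity:.4f}")
```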

The really interesting systems are the ones that keep running by putting the result of one set of values back into the equation and starting over again. So, for example, if you had an equation that described the growth of a population of people or animals, and you were looking at the growth one year at a time, then you would use the total number of people or animals at the end of last year as the number you put into the equation this year. You would have to, in order to predict what the population would be at the end of this year. Or next year. Or any year in the future. This is like the feedback you hear when a microphone is not set properly. The result of the amplification feeds back into the microphone and quickly builds to an irritating and sometimes painful noise. This process is called iteration.
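
As a small illustration of iteration, here is a sketch that feeds each year's population back into the same growth rule to get the next year's population. The particular rule, the well-known logistic map, is our assumption for the example; the article does not specify one.

```python
# A minimal sketch of iteration: each year's result is fed back into the
# same rule to produce the next year's. The logistic map used here is an
# illustrative assumption, not a rule the article specifies.
def next_population(p, growth_rate=2.8):
    """One year of growth: p is the population as a fraction of the maximum."""
    return growth_rate * p * (1.0 - p)

population = 0.10                  # starting population, 10% of the maximum
for year in range(1, 11):
    population = next_population(population)   # feed the result back in
    print(f"year {year:2d}: population = {population:.4f}")
```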

With non-linear systems (remember from your algebra that these have equations with at least one variable raised to a power higher than 1, i.e., where you will find x²) the iterative process could not be carried out for a large number of iterations until computers became common and reasonably powerful. This was simply because of the time it took to do the calculations manually or with an old-fashioned hand calculator.

So, when iterations of non-linear systems were done on computers and the results represented graphically on the monitor, new information became available and new things were learned. Some of the new things did not fit well with the intuition scientists had developed within their inherited scientific cultural meme. In fact, some of the results were downright bizarre.

For some stable systems the attractor settles down into a non-repeating pattern of quasi-periodicity. The attractor is not predictable, yet it exhibits a pattern. This kind of attractor is called a strange attractor. Strange attractors in nature replace the old 'balance of nature' metaphor derived from geometry. Nature is, in fact, not balanced. The example often used is that of predator and prey. As prey increase, a larger population of predators can be supported. More predators eat more prey and the number of prey decreases. Then, with less to eat, the number of predators also decreases. So the prey increase. And so on. Yet the numbers never attain a 'balance'. But they do exhibit a pattern, and that pattern is a strange attractor for that predator/prey system. (Thought experiment: What happens to the attractor if an additional factor impacts the predator/prey system? Think of pollution as an additional factor, for example.)
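
The predator/prey cycling described above can be sketched with the classic Lotka-Volterra equations. This is our choice of model, not one the article names, and it is only a rough illustration of the idea: the two populations rise and fall in an unending rhythm without ever settling into a fixed 'balance'.

```python
# A rough sketch of predator/prey cycling, assuming the classic
# Lotka-Volterra equations (an illustrative choice, not the article's model).
# The two populations rise and fall endlessly rather than balancing out.
def step(prey, predators, dt=0.001):
    """Advance both populations by a small time step (Euler method)."""
    d_prey = prey * (1.0 - predators)          # prey grow, but are eaten
    d_pred = predators * (prey - 1.0)          # predators thrive only when prey are plentiful
    return prey + d_prey * dt, predators + d_pred * dt

prey, predators = 2.0, 1.0
for i in range(1, 20001):
    prey, predators = step(prey, predators)
    if i % 2000 == 0:                           # sample the ongoing cycle
        print(f"t={i * 0.001:5.1f}  prey={prey:.3f}  predators={predators:.3f}")
```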

Strange attractors are being used more and more in fields such as economics and medicine. In time we can expect to see software that produces strange attractors automatically from a company's database, or from a record of a student's study habits.

We recommend an internet search on chaos to see some of the results and the often beautiful images that represent them. But for our purposes, we can say that with computers scientists learned that their inherited intuition needed some refinement. It went against intuition to learn that even fairly simple systems did not behave the way we might expect. Some systems that seemed stable would in fact become unstable of their own accord, even though nothing in the scientists' experience suggested that this might be the case. That's one conclusion from chaos studies: Seemingly stable systems may have an inbuilt instability that is not apparent in the equations.
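
To see how an apparently well-behaved rule can harbour instability, here is a small sketch reusing the logistic growth rule from the earlier example (again our illustrative assumption): with a modest growth rate the population settles to a steady value, while with a higher growth rate it never settles at all.

```python
# A small sketch of inbuilt instability, assuming the logistic growth rule
# from the earlier example. A modest growth rate settles down; a higher one
# never does, even though the rule itself looks equally innocent.
def next_population(p, growth_rate):
    return growth_rate * p * (1.0 - p)

for growth_rate in (2.8, 3.9):
    p = 0.10
    for _ in range(200):                        # let the early transients die away
        p = next_population(p, growth_rate)
    tail = []
    for _ in range(5):                          # then look at five successive years
        p = next_population(p, growth_rate)
        tail.append(round(p, 4))
    print(f"growth rate {growth_rate}: {tail}")
```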

Another significant result was that some systems were found to behave in a surprising way if the initial information was changed even slightly. In some cases this would happen just by dropping a few decimal places from the initial values. That's a second conclusion: Some systems are extremely sensitive to their initial conditions. This is called sensitive dependence.
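
Here is a minimal sketch of sensitive dependence, once more using the logistic rule as a stand-in for 'some system': two runs whose starting values differ only in the fifth decimal place soon disagree completely.

```python
# A minimal sketch of sensitive dependence, assuming the logistic rule as a
# stand-in system. Two runs start almost identically and soon diverge.
def next_population(p, growth_rate=3.9):
    return growth_rate * p * (1.0 - p)

a, b = 0.10000, 0.10001            # nearly identical initial conditions
for step in range(1, 41):
    a, b = next_population(a), next_population(b)
    if step % 10 == 0:
        print(f"step {step:2d}: run A = {a:.5f}, run B = {b:.5f}, gap = {abs(a - b):.5f}")
```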

Systems that behave this way are called complex systems or complex dynamical systems. We will extract from chaos studies in the hard sciences tools that have useful application in human systems. By human systems we mean the full range of systems, from an individual person, through small- to medium-sized systems such as local churches or national denominations, up to global trends over centuries. And this leads into another concept that chaos has taught us.

Self-similarity is a potent tool indeed. The word 'self' is used as a qualifier because the phrase refers to occurrences that belong to the same system. As we say in the glossary, self-similarity can be quickly grasped by thinking of branching on a tree. From the big branching where the largest branches leave the trunk to the tiniest twigs at the very top there is a similarity of form and behaviour. By magnifying the smallest ones until they are about the same size as the larger ones we can see how similar they are. In some systems they are identical. Identical branching is evident in some mathematical systems and their graphical representations, and structures that repeat identically across scales are called fractals. Fractals can also be seen in nature, although in nature they are usually less than identical. They are, however, recognizable as belonging to the same system.
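
A toy sketch of the perfectly identical, mathematical kind of branching: a 'tree' in which every branch carries two smaller copies of itself, so that any sub-branch, suitably rescaled, has exactly the same form as the whole. The two-way branching and the 60% scaling factor are arbitrary choices made for the example.

```python
# A toy sketch of self-similarity: every branch carries two smaller copies
# of itself, so a rescaled sub-branch has exactly the same form as the whole.
# The branching factor and 60% scaling are illustrative assumptions.
def tree(length, depth):
    """A branch of a given length with two 60%-scale copies of itself attached."""
    if depth == 0:
        return length
    return (length, tree(length * 0.6, depth - 1), tree(length * 0.6, depth - 1))

whole = tree(10.0, 3)
sub_branch = whole[1]                  # one of the two branches off the trunk
magnified = tree(10.0 * 0.6, 2)        # the same rule applied at the smaller scale
print(sub_branch == magnified)         # True: the part has the same form as the whole
```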

In nature, branching seems to be self-limiting. Trees do not branch forever. They stop when mature. Mathematical constructs, however, can branch and branch until the cows come home. The beautiful images of the Mandelbrot and Julia sets are examples.
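
For a taste of how a mathematical construct can go on branching until the cows come home, here is a minimal escape-time sketch of the Mandelbrot set: each point c is iterated through z → z² + c, and points whose iterates never fly off to infinity belong to the set. Shrinking the ranges below and looking ever closer at the boundary reveals detail at every scale.

```python
# A minimal escape-time sketch of the Mandelbrot set, printed as rough
# ASCII art. The grid spacing and iteration limit are illustrative choices.
def escape_time(c, max_iter=50):
    """How many iterations before z leaves the circle of radius 2 (if ever)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter                             # treated as "inside" the set

# A coarse picture of the set between -2..1 (real) and -1..1 (imaginary).
for row in range(21):
    imag = 1.0 - row * 0.1
    line = ""
    for col in range(61):
        real = -2.0 + col * 0.05
        line += "#" if escape_time(complex(real, imag)) == 50 else " "
    print(line)
```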

Early in the study of chaos it was learned that some patterns were to be found in totally unconnected systems. If we think only of the pattern of non-mathematical branching, we can see a similarity between the way a tree branches from the time it bursts from its seed until its maturity and the way the rules of a game develop and branch until the game has matured. Think of the development of the rules of baseball, tennis, football and so on. Once the system has matured only minor branching will be seen. The ubiquity of these patterns allows us to compare different systems. Comparing different systems by their patterns is what this website calls thinking in phase space.

We will turn self-similarity around and use it as a tool to enhance our self-critical capacity and hence our ability to zero-base our plans. These plans may range from personal plans to large scale organizational plans. As a sociological and theological tool, self-similarity can be very revealing.

As this website develops we will replace the outmoded geometrical metaphors of language with metaphors from chaos. This is not easy, because not only are the concepts still developing but the vocabulary is still in its infancy, even among specialists. So we hope a useful byproduct of the website will be an increased facility with chaos metaphors and a reduction in the use of geometric metaphors.

When we incorporate the concepts and vocabulary of chaos into our thinking we will have to revise the way we reach conclusions on a number of matters. Seemingly stable systems can become chaotic either with or without external influences. So, for example, when we look at origins (cosmological, geological, biological, archaeological, historical) we need to be aware that what we are looking at may not, in the past, have been as it now appears.

The conclusions I draw from Chaos are several: