One thing that I wanted to do when writing about chaos is to take a bit of time to really home in on each of its basic properties, and look more closely at what they actually mean.
To refresh your memory, for a dynamical system to be chaotic, it needs to have three basic properties:
- Sensitivity to initial conditions,
- Dense periodic orbits, and
- Topological mixing.
The phrase “sensitivity to initial conditions” is actually a fairly poor description of what we really want to say about chaotic systems. Lots of things are sensitive to initial conditions, but are definitely not chaotic.
Before I get into it, I want to explain why I’m obsessing over this condition. It is, in many ways, the least important condition of chaos! But here I am obsessing over it.
As I said in the first post in the series, it’s the most widely known property of chaos. But I hate the way that it’s usually described. It’s just wrong. What chaos means by sensitivity to initial conditions is really quite different from the more general concept of sensitivity to initial conditions.
To illustrate, I need to get a bit formal, and really define “sensitivity to initial conditions”.
To start, we’ve got a dynamical system, which we’ll call f. To give us a way of talking about “differences”, we’ll establish a measure on f, which we’ll call m. Without going into full detail, a measure is a function which maps each point x in the phase space of f to a real number m(x), and which has the property that points that are close together in f have measure values which are close together.
Given two points x and y in the phase space of f, the distance between those points is the absolute value of the difference of their measures: d(x, y) = |m(x) − m(y)|.
So, we’ve got our dynamical system, with a measure over it for defining distances. One more bit of notation, and we’ll be ready to get to the important part. When we start our system at an initial point x, we’ll write the state that it reaches at time t as f_t(x).
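To make the notation concrete, here’s a minimal Python sketch of the setup – purely illustrative, with discrete time steps and the identity function standing in as a toy measure:

```python
def trajectory(step, x0, t):
    """f_t(x0): run the system for t discrete steps, starting from x0."""
    x = x0
    for _ in range(t):
        x = step(x)
    return x


def m(x):
    """A toy measure: the identity map on a one-dimensional phase space."""
    return x


def distance(x, y):
    """The distance between two points: |m(x) - m(y)|."""
    return abs(m(x) - m(y))
```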
What sensitivity to initial conditions means is that no matter how close together two initial points x and y are, if you run the system for long enough starting at each of them, the results will be separated by as large a value as you want. Phrased informally like that, it’s confusing; but when you formalize it, it actually gets simpler to understand:
Take the system f with measure m. Then f is sensitive to initial conditions if and only if:
- Select any two points x and y such that x ≠ y – no matter how close together they are.
- Let diff(t) = |m(f_t(x)) − m(f_t(y))|. (Let diff(t) be the distance between f_t(x) and f_t(y) at time t.)
- For every G > 0, there is a time T such that diff(T) > G. (No matter what value you choose for G, at some point in time T, diff(T) will be larger than G.)
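In code, the sensitivity condition looks something like the following sketch, building on the helpers above. The exceeds search and its t_max cutoff are illustrative scaffolding of my own – a real proof has to handle every G over unbounded time:

```python
def diff(step, x, y, t):
    """diff(t): the distance between the two trajectories at time t."""
    return distance(trajectory(step, x, t), trajectory(step, y, t))


def exceeds(step, x, y, G, t_max=100_000):
    """Search (up to t_max steps) for a time T where diff(T) > G.

    This probes a single G empirically; sensitivity demands that such
    a T exists for every G, however large.
    """
    for t in range(t_max):
        if diff(step, x, y, t) > G:
            return t
    return None  # no witness found within the cutoff
```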
Now – reading that, a naive understanding would be that diff(T) increases monotonically as T increases – that is, that for any two times t1 and t2, if t1 < t2, then diff(t1) < diff(t2). But that’s not what the definition says – and that gap is exactly where chaos lives. To see the difference, let’s compare a chaotic system and a non-chaotic one. For our chaotic system, we’ll use the logistic map, with measure m(x) = x. And for our non-chaotic system, we’ll use f(x) = x², with m(x) = x.
Think about arbitrarily small differences in starting values. In the quadratic system, even if you start off with a minuscule difference – starting at v0=1.00001 and v1=1.00002 – you’ll get a divergence. They’ll start off very close together – after 10 steps, they only differ by about 0.01. But they rapidly start to diverge. After 15 steps, they differ by about 0.5. By 16 steps, they differ by about 1.8; by 20 steps, they differ by about 1.2×10⁹! That’s clearly a huge sensitivity to initial conditions – an initial difference of 1×10⁻⁵, and in just 20 steps, their difference is measured in billions. Pick any arbitrarily large number that you want, and if you scan far enough out, you’ll get a difference bigger than it. But there’s nothing chaotic about it – it’s just an incredibly rapidly growing curve!
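Here’s that divergence as a quick computation, reusing the diff helper from the sketch above. The printed values are subject to floating-point rounding, but they land right around the numbers quoted:

```python
def quadratic(x):
    """The non-chaotic comparison system: f(x) = x^2."""
    return x * x


v0, v1 = 1.00001, 1.00002
for t in (10, 15, 16, 20):
    print(f"after {t:2d} steps, difference = {diff(quadratic, v0, v1, t):.3g}")

# The gap grows monotonically and without bound - enormous sensitivity,
# but nothing chaotic about it.
```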
In contrast, the logistic map is amazing. Look far enough out, and you can find a point in time where the difference in measure between the systems starting at 0.00001 and 0.00002 is as large as you could possibly want; but look far enough out past that divergence point, and you’ll find a point in time where the difference is as small as you could possibly want! The measure values of systems starting at x and y will sometimes be close together, and sometimes far apart. They’ll continually vary – sometimes getting closer together, sometimes getting farther apart. At some points in time, they’ll be arbitrarily far apart. At other times, they’ll be arbitrarily close together.
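The same experiment on the logistic map shows the contrast. I’m using λ = 4 here, the standard fully chaotic parameter value – any parameter in the chaotic regime behaves the same way:

```python
def logistic(x):
    """The logistic map f(x) = 4x(1 - x) at the fully chaotic parameter."""
    return 4 * x * (1 - x)


x0, y0 = 0.00001, 0.00002
for t in range(0, 101, 10):
    print(f"t = {t:3d}: diff = {diff(logistic, x0, y0, t):.6f}")

# Unlike the quadratic system, the gap doesn't grow monotonically: once the
# trajectories diverge, diff bounces around between nearly 0 and nearly 1 -
# close together at some times, far apart at others.
```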
That’s a major hallmark of chaos. It’s not just that arbitrarily close starting points will eventually end up far apart – that by itself isn’t chaotic. It’s that they’ll be far apart at some times, and close together at other times.
Chaos encompasses the so-called butterfly effect: a butterfly flapping its wings in the Amazon could cause an ice age a thousand years later. But it also encompasses the sterile elephant effect: a herd of a million rampaging giant elephants crushing a forest could end up having virtually no effect at all a thousand years later.
That’s the fascination of chaotic systems. They’re completely deterministic, and yet completely unpredictable. What makes them so amazing is how they’re a combination of incredible simplicity and incredible complexity. How many systems can you think of that are simpler to define than the logistic map? But how many have outcomes that are harder to predict?