Iterated Function Systems and Attractors

Most of the fractals that I’ve written about so far – including all of the L-system fractals – are
examples of something called iterated function systems. Speaking informally, an iterated function
system is one where you have a transformation function which you apply repeatedly. Most iterated
function systems work in a contracting mode, where the function is repeatedly applied to smaller
and smaller pieces of the original set.

There’s another very interesting way of describing these fractals, and I find it very surprising
that it’s equivalent. It’s the idea of an attractor. An attractor is a dynamical system which, no matter what its starting point, will always evolve towards a particular shape. Even if you
perturb the dynamical system, up to some point, the perturbation will fade away over time, and the system
will continue to evolve toward the same target.

The classic example of this is something called the chaos game. In the chaos
game, you take a sheet of paper and draw three points, A, B, and C. Then you pick a point p0, somewhere inside the triangle formed by A, B, and C. Then you do the following repeatedly to generate each point pi+1:

  1. Randomly select one of A, B, and C.
  2. Draw a line segment between the most recent point pi and the selected point. Find
    the midpoint of that segment. That is point pi+1. Draw a dot at pi+1.

[Image: sier-attract.png – the first few iterations of the chaos game]

Keep doing that, over and over again. If you try this yourself for a few iterations, you’ll see
why it’s called the chaos game: the point jumps around seemingly at random within the triangle. The
set of points that results from this seemingly random process is the attractor for the
dynamical system described by the chaos game. For example, in the figure above, I started with p0, and then did iterations with line segments from p0 to A, from p1 to C, from p2 to C, from p3 to B, and from p4 back to C.

So what’s the result? What does the attractor for the chaos game look like?

[Image: sier-shear.png – the chaos game’s attractor: a sheared Sierpinski gasket]

It’s the Sierpinski gasket, distorted so that the triangles are shaped the same as the
triangle defined by A, B, and C.
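
To make that concrete, here’s a minimal sketch of the chaos game in Python. The vertex coordinates, the iteration count, and the chaos_game name are illustrative choices of mine, not anything from the post itself:

    import random

    def chaos_game(vertices, iterations=50000):
        """Play the chaos game: repeatedly jump halfway toward a randomly chosen vertex."""
        # Start somewhere inside the triangle; the centroid is a convenient choice.
        x = sum(vx for vx, vy in vertices) / len(vertices)
        y = sum(vy for vx, vy in vertices) / len(vertices)
        points = []
        for _ in range(iterations):
            vx, vy = random.choice(vertices)   # step 1: pick A, B, or C at random
            x, y = (x + vx) / 2, (y + vy) / 2  # step 2: move to the midpoint of pi and the chosen vertex
            points.append((x, y))
        return points

    # Plotting the returned points reveals the (possibly sheared) Sierpinski gasket.
    points = chaos_game([(0.0, 0.0), (1.0, 0.0), (0.5, 0.9)])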

For any of the L-system fractals, there’s a dynamical system for which the fractal is
the attractor.

Let’s step back just for a moment. What’s going on here? The starting point is the dynamical system.
Just what is a dynamical system?

A dynamical system is basically a kind of differential equation – it’s a rule that specifies a future
state of a system in terms of a previous state of the system. The states in this system are really points
within some kind of topological manifold. Every dynamical system has something called a state space
– that is, the space which includes the set of points within the manifold that are reachable by some
sequence of iterations of the system.

On some level, the basic study of differential equations – what most of us math geeks studied in our
diffeq courses in college – is largely about trying to find closed-form equivalents of a given
dynamical system: that is, a solution for the system where you can find the value of the state of the system
at a specified point in time in one step, without needing to iterate through its previous states.
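
To give a trivial, purely illustrative example of what that means: the one-dimensional system given by the rule x(n+1) = x(n)/2 has the closed form x(n) = x(0)/2^n, so you can jump straight to the state at step 1000 without stepping through the 999 states before it. The chaos game, by contrast, offers no such shortcut: each step p(i+1) = (p(i) + V)/2 depends on a fresh random choice of the vertex V, so to know where the point is after a thousand steps, you have to play out all thousand of them.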

The attractor is the set defined by a dynamical system. What makes it an attractor is the somewhat
odd property that, instead of varying wildly, the system always gets pushed back toward that set: its state space
is strongly centered on some basic set of points, and it will always drift back towards those points. Fractal attractors (also called strange attractors) are attractors with a fractal
structure. The chaos game contains embedded within it the self-similarity property of the
Sierpinski gasket, so its attractor set has that structure.

For another fascinating example of this, taken from this excellent explanation of
iterated function systems, here’s a more complex dynamical system. The way that it works
is that there are four possible rules for getting the next point, each of which
has an assigned probability. At each step, the next rule is selected randomly, according to the specified
probability distribution.

Rule   Probability   Next X             Next Y
1      1/100         0x + 0y            0.16y
2      85/100        0.85x + 0.04y      -0.04x + 0.85y + 1.6
3      7/100         0.2x - 0.26y       0.23x + 0.22y + 1.6
4      7/100         -0.15x + 0.28y     0.26x + 0.24y + 0.44

This is the classic Barnsley Fern fractal:

[Image: fb.gif – the Barnsley fern]

That strange set of equations contains the structure of a fern.
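
Here’s a minimal sketch, in Python, of how those four rules generate the fern. The coefficients and probabilities are the ones from the table above; the barnsley_fern name, the iteration count, and the decision to return raw points rather than render them are my own choices:

    import random

    # Each rule: (probability, a, b, c, d, e, f) for the affine map
    #   next x = a*x + b*y + e
    #   next y = c*x + d*y + f
    RULES = [
        (0.01,  0.00,  0.00,  0.00, 0.16, 0.0, 0.00),
        (0.85,  0.85,  0.04, -0.04, 0.85, 0.0, 1.60),
        (0.07,  0.20, -0.26,  0.23, 0.22, 0.0, 1.60),
        (0.07, -0.15,  0.28,  0.26, 0.24, 0.0, 0.44),
    ]
    WEIGHTS = [rule[0] for rule in RULES]

    def barnsley_fern(iterations=100000):
        """Iterate the four affine maps, picking one at random by its probability."""
        x, y = 0.0, 0.0
        points = []
        for _ in range(iterations):
            _, a, b, c, d, e, f = random.choices(RULES, weights=WEIGHTS)[0]
            x, y = a * x + b * y + e, c * x + d * y + f
            points.append((x, y))
        return points  # plot these (x, y) pairs and the fern appears

    fern_points = barnsley_fern()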

One of the interesting things about the notion of fractal attractors relates to a common objection
to fractals: that they’re too irregular, too strange. Mathematicians tend to like things to be clean and simple. They like continuous, differentiable functions; they like their diffEQs to have closed forms. The idea that most functions are irregular, and that the functions we’ve spent so much time studying are less than a grain of sand on the beach is a depressing one to most people in math. One
of the things that originally caused some resistance to the idea of fractals is that they’re not from the world of clean functions, the world of regularity and continuity that we so adore.
But the fact that attractors include fractals – that out of the universe of iterative systems, nearly
all of them escape into erratic, scattered chaos, except for the tiny speck of dust that we call attractors – tells us that fractals are part of the world of things that we can analyze and understand, even if they seem strange.

We can understand what these iterated function systems do – how they generate the fractal attractors – by looking at them as affine transformations, and applying techniques from conventional Euclidean geometry to them. In the next post, I’ll show you why the chaos game is just another way of defining the
Sierpinski gasket.

14 thoughts on “Iterated Function Systems and Attractors”

  1. Blake Stacey

    I’ve always heard “attractor” used to refer to the region in phase space to which the system’s trajectory is “attracted”, rather than the entirety of the dynamical system.

  2. Mark C. Chu-Carroll

    Blake:
    I’ve always heard it used for both the attracted-to region of the phase space, and for a dynamical system that attracts-to a particular region.

  3. Torbjörn Larsson, OM

    An attractor is a dynamical system which, no matter what its starting point, will always evolve towards a particular shape. Even if you perturb the dynamical system, up to some point, the perturbation will fade away over time, and the system will continue to evolve toward the same target. … A dynamic system is basically a kind of differential equation – it’s a rule that specifies a future state of a system in terms of a previous state of the system.

    More generally, a dynamical system is often described by a differential equation, but it can just as well be a difference equation for non-continuous systems, or a functional description. The ubiquity of differential/difference equations comes from the fact that they often describe the general and local physics, while the specific global solution of course depends on initial and boundary conditions.
    Btw, AFAIK the robustness of attractors doesn’t stop with dynamical robustness of evolving to a similarly featured attractor from a perturbed state; it is also a topological robustness of evolving to a similarly featured attractor from a perturbation in the system (model parameters).
    Terence Tao, who among other things is interested in the properties of dynamical systems, happens – in keeping with the synchronicity of the web – to have posted about “Knots and dynamics”:

    This attractor arises from a fairly simple dynamical system in R^3, namely the Lorenz equations, … This phenomenon is in fact quite robust; any small perturbation of the above system will also possess a strange attractor with similar features.

  4. Torbjörn Larsson, OM

    Sorry about the many typos in my previous comment.
    Also, the quote from Tao is [R^3] really, I can’t reproduce his fancy fonts.

  5. Doug

    [It’s the idea of an attractor. An attractor is a dynamical system which, no matter what its starting point, will always evolve towards a particular shape. Even if you perturb the dynamical system, up to some point, the perturbation will fade away over time, and the system will continue to evolve toward the same target.]
    Geometrically I do find that interesting. But, in numerical terms I’d guess this would seem “trivial” or “more trivial”. At least to me, since in numerical terms I’d draw an analogy like this: A “dynamical system” corresponds to a convergent sequence like (1/2, 1/3, 1/4… 1/n). No matter where you start such a sequence, it converges to 0. In terms of “perturbing the system” one draws an analogy by deleting terms of the sequence. It’ll still converge to 0. A convergent sequence still “evolves” or “converges” to its limit, even if you “perturb” it; the difference between its values and the limit of the sequence goes to 0, thus indicating the numbers “evolve” towards the same “target” (limit). I don’t know how deeply that analogy holds, and of course sequences work out as far more significant than I’ve made them out to be. Thoughts?
    [Then you pick a point P0, something where inside the triangle formed by A,B, and C.]
    This didn’t seem quite clear. Do you mean a point inside the area of the triangle? A point Po, somewhere inside the triangle ABC?
    Extending the numerical analogy, we have a set of points such that several subsequences converge to members of the set. For instance the sequence
    (2, 1, 3/2, 1/2, … 1+1/(2^(n-1)), 1/2^(n-1)) has two subsequences, each of which converges to a member of {1, 0}. Or perhaps we don’t start with point numbers. We start with intervals which converge to another interval, or a sequence which has subsequences which converge to a set of intervals. Such as {[1, 2], [4, 6], [1/2, 3/2], [3, 5], …
    [1-1/2-1/4-…, 2-1/2-1/4-…] [4-1-1/2-1/4-…, 6-1-1/2-1/4-…]} which has two subsequences which converge to the members of the set of interval numbers
    {[0, 1], [2, 4]}. Defining rectangles by areas, one could even extend the analogy further to sequences which have subsequences which converge to rectangles. I’ll suggest one could even do this numerically for circles, or any shape describable in analytic terms.
    [ So what’s the result? What does the attractor for the chaos game look like?
    It’s the Sierpinski gasket, distorted so that the triangles are shaped the same as the triangle defined by A, B, and C]
    I guess my attempted analogy breaks down here, or I missed something crucial. I suppose convergence works too linearly, even if you consider convergence of subsequences involving intervals or shapes? Do you know of a proof that demonstrates that the chaos game attracts to the Sierpinski gasket?
    [The idea that most functions are irregular, and that the functions we’ve spent so much time studying are less than a grain of sand on the beach is a depressing one to most people in math.]
    Sounds more like getting rid of a philosophical illusion to me. I applaud anyone who actually does such wholeheartedly. I really don’t get why people would feel “depression” here myself. We need not have anything close to omniscience; we need not know it all.
    If there exist so many more functions than our “grains of sand on the beach”, there exist so many more possibilities for mathematical development. There exist so many more choices of mathematical problems to study. There exists something like, if not actually, an “infinite quest,” something like how some spiritual people view understanding their deity and other people’s conceptions of such as an infinite quest. Or maybe how an artist views that there exist so many more possible paintings, or how a philosophical musician considers that there exist so many more melodies to play or sing. This implies that mathematics consists of a hugely open book, and that there exist multiple avenues for creativity.

  6. Cale Gibbard

    I’d say physicists seem to like their functions to be really regular more than mathematicians.
    Modern mathematicians love strange functions with properties that clarify the meanings of the definitions. A fairly recent example was the construction of a function which is everywhere differentiable, but nowhere monotone on R. That is, the sets of points on which its derivative is negative, positive and zero are all dense in the real line.
    Personally, I rather like hearing about that sort of thing — you think you know something about how functions behave, but it just turns out that you just hadn’t seen any counterexamples yet. 🙂
    The paper “Lineability and spaceability of sets of functions on R” discusses this and a few other rather “strange” sorts of functions.
    As another example from that very paper, consider the set of “everywhere surjective” functions, that is, functions which send any open interval, no matter how small, to the whole real line. This set contains a linear subspace of dimension 2^c, which is of course the largest possible dimension for a set of functions R -> R.

  7. Torbjörn Larsson, OM

    I’d say physicists seem to like their functions to be really regular more than mathematicians.

    True, up to a point. Seems both nature and physicists like to keep it simple when it comes to phenomena and useful theories.
    But in the borderland of technology/experiments and odd phenomena, chaos, noise, and attractors aren’t strangers to physics either. For example, AFAIK you can use chaos for encryption (locking systems over a ‘scrambled’ channel), or use (pseudo)noise to enhance oscillator robustness and signal capture, et cetera.

    Modern mathematicians love strange functions with properties that clarify the meanings of the definitions.

    That seems like an excellent idea. Physicists stress test theories and definitions too, and pushing models to limits reveals a lot about the soundness of the constructions. But the boundaries of applicable knowledge are narrower, especially if physical intuition fills in gaps that a full formalism hasn’t bridged yet.
    [Joke about waiting for mathematicians proving theoretical physics formalism caved in when applied to string theory absence of non-perturbed solutions. :-P]

  8. Monte Davis

    Mathematicians tend to like things to be clean and simple.
    Maybe less a matter of their temperament than like the man searching for his keys under the streetlight, not because he lost them there but because that’s where he could see. Continuous, differentiable, closed-form were “popular” because the alternatives led you so quickly into a thicket of expansions and approximations and dicey assumptions.
    the functions we’ve spent so much time studying are less than a grain of sand on the beach…
    I had friends at the Courant as Abraham poked a stick into dynamical systems, and the early Mandelbrot papers and books were coming out. I saw a little bit of distaste for the “pathological” (a term Mandelbrot highlights for its rhetorical sting), but a lot more excited pleasure at the advent of new flashlights to take into the thicket.

  9. Monte Davis

    Cale: you think you know something about how functions behave, but it just turns out that you just hadn’t seen any counterexamples yet. 🙂
    In astronomy, the first decade of data about extra-solar planets has blown away 250 years of theorizing about how you’d always see small rocky planets close in, gas giants farther out. “Sample size one” science is the ultimate hostage to fortune.
    The real fun of finding extraterrestrial life will be learning how wrong we were about which characteristics really are dictated by chemistry/physics, and which are local and contingent.

  10. Jonathan Vos Post

    Recent thread on Terry Tao’s blog:
    2006 ICM: Étienne Ghys, “Knots and dynamics”
    2006 International Congress of Mathematicians
    http://terrytao.wordpress.com/2007/08/03/2006-icm-etienne-ghys-knots-and-dynamics/#more-168
    “… the paradigmatic example of a strange attractor from chaos theory, namely the Lorenz attractor discovered in 1963…. This attractor (also known, for obvious reasons, as the Lorenz butterfly) arises from a fairly simple dynamical system… It turns out that a typical trajectory in this dynamical system will eventually be attracted to the above fractal set (which has dimension slightly larger than 2). This phenomenon is in fact quite robust; any small perturbation of the above system will also possess a strange attractor with similar features.”
    “The topology of this attractor was clarified by Birman and Williams, who had the idea of analysing the attractor via its periodic orbits, which they interpreted individually as knots (and collectively as links) in R^3; the philosophy being that the attractor itself was a kind of “infinite limit” of these knots (somewhat analogously to how the unit circle R/Z could be viewed as a limit of its finite subgroups (1/N)Z/Z). There are many such periodic orbits in the Lorenz system, but it turns out that they have substantial combinatorial structure, and so the knot and link types of these orbits are actually very special, and classified by Birman and Williams (together with a more recent paper of Tucker)…”

  11. Paul Carpenter

    I don’t have anything clever to say right now. But I am enjoying this series of blogs. I’ve been inspired to read up on dynamical systems.

  12. Torbjörn Larsson, OM

    Jonathan Vos Post:
    Ahem. Consider yourself scooped by a full day. (Comment #3.) 😛

  13. papin

    My usage of the term attractor agrees with Blake in Comment 1:
    an attractor is a subset of the points that represent the states of a dynamical system. Intuitively, it is the set of points that a dynamical system will end up in after a long time. Defining an attractor precisely is rather difficult because of some additional properties one may want to give it: unique, stable against small perturbations, etc…

    The most common way in textbooks to define a dynamical system is by using differential equations. Before Poincaré, solving a dynamical system meant finding an easy-to-compute function that solved the differential equation. But Poincaré’s work led the way in concluding that such a function was too much to ask for and, more importantly, not even relevant to understanding the dynamical system. Today, solving a dynamical system means understanding the class of possible solutions. That knowledge is usually sufficient to write great and efficient numerical code.

  14. Jonathan Vos Post

    Torbjörn Larsson:
    Whoops! You win. My only excuse is that on August 3, 2007, I gave the 12-page Algebra 1 Final Exam to my impoverished teenage summer school classes. Then I literally stayed up all night grading, carefully, all 48 students’ responses to all 75 questions, and correlating that with how they did on the midterm exam and the homework assignments. On August 4, 2007, I told all the students what their grades were, handed out the 12-page answer sheets, and entered all final grades into the dreadful online web-based district grading and attendance system. Chalk it up to severe sleep deprivation. Oh, wait. No chalk anymore. Whiteboard it up, then.
    Friday, after final grades were given out, there was still a day where students could come to class. Why? Because the school collected $3.14 per registered student per classroom hour. $3.14, suspiciously close to pi bucks an hour. The quarter of my class that came got what I bought out-of-pocket: an 18-inch vegetarian Special pizza from Domenico’s (since 1960), soft drinks, chips, fresh grapes, a chance to play Dragon-Ball Z on a school TV/DVD system wheeled into the classroom, and Chess for those who liked chess. And friendly conversation.
    A teacher is never really the students’ friend. But the teacher need not be the enemy.
    The Principal wants me to email him a short essay on what I did right, what I did wrong, what I learned, and what I’d do differently. Then he’ll decide whether or not to hire me as a full-time Science teacher for Chemistry, Earth Sciences, and Biotechnology.
    So I was both exhausted and on tenterhooks.
    But thanks for correcting me.

