Monthly Archives: September 2006

Friday Random Ten, Sept 29

1. **Steve Reich, “Explanations Come To An End Somewhere”**: one movement from one of Steve Reich’s recent works, “You Are (Variations)”. I think it’s some of the best stuff he’s ever written.
2. **Fiddlers Four, “Pickin’ the Devil’s Eye”**. Another Darol Anger project, and as usual, it’s very cool. This is basic old-time country fiddling, full of energy and fire.
3. **Seamus Egan, “To An Old Rose”**. Seamus Egan is a brilliant Irish musician; he’s one of the finest Irish flutists in the world, and he also plays tenor guitar, regular guitar, banjo, keyboards, lap steel, and who knows what else. This is a tenor guitar waltz, very mellow, very beautiful.
4. **Hamster Theatre, “La Sacre D’Merde”**. Strange but brilliant band. They’re a Rock-in-Opposition spinoff of Thinking Plague with very heavy early music influences. I also love their titles… “La Sacre D’Merde” is basically “The Rite of Shit”; the title of the album is “The Public Execution of Mr. Personality”.
5. **Rachel’s, “Old Road”**. “Rachel’s” is another of those post-rock ensembles that I’m so wild about. They’re a very classical one; the closest comparison would be to the Clogs, but Rachel’s is *better*. In fact, they’re by far the best of the PREs that I’ve found so far. There just aren’t words for this kind of music; it’s too good. And I discovered them entirely by accident! I was looking for something else, and just happened to notice this thing tagged “PRE”.
6. **Dirty Three, “Stellar”**. Yes, more post-rock. Dirty Three is wonderful.
7. **John Corigliano, “Elegy”, 2nd movement of the Clarinet Concerto.** Corigliano is one of the finest composers in the world today. As a clarinetist, I’m particularly enamoured of his clarinet concerto. This is played by Stanley Drucker. I used to think that Drucker wasn’t such a great clarinetist, based on his recordings of the Mozart and Weber concertos. This recording completely changed my mind; Drucker is the very finest performer of modern compositions on the clarinet. It’s obvious that he’s just not as *interested* in the older stuff; the modern stuff is what he loves.
8. **The Andy Statman Klezmer Orchestra, “Galitzianer Chusud”**. There was also a piece from this group in last week’s FRT. I’m particularly fond of this one, because my father is a Galitzianer. (Galitzianers are Jews from the Galicia region of what was Russia when they lived there. My mother’s family are Litvaks (Latvians). The Litvaks and the Galitzianers were traditionally rivals; the old joke among Litvaks is that if you’re not careful, your children will grow up to marry a Galitzianer.)
9. **Rachel’s, “And Keep Smiling”**. Two Rachel’s tracks in one FRT! Hurrah!
10. **Psychograss, “Stroll of the Mudbug”**. Psychograss is yet another Darol Anger project, also featuring my former banjo teacher, Tony Trischka, not to mention David Grier, Mike Marshall, and Todd Phillips. A group of musicians with more skill than this gang is damned hard to find, and they have a *great* chemistry as a group. It’s newgrass with a good bit of jazz.

Programming without Control: Smith

Joy of joys, [Cat’s Eye Technologies](http://catseye.mine.nu:8080/projects/), the home of Chris Pressey, one of the most prolific designers of thoroughly bizarre languages, is back up! And so, today, we’re going to take a look at one of his masterpieces, the language *Smith*. This is one of my personal all-time favorite pathological languages; it’s positively brilliantly twisted.
Smith is, mostly, a pretty typical memory-oriented machine language. So what makes it pathological? It’s got *no* jumps, *no* branches, *no* control flow whatsoever. The program counter starts at zero, and at every step, it increments by one *and only one*.
So, you might quite sensibly ask, how on earth could this beast do anything useful, much less be Turing complete? The answer is: self-modifying code; in particular, the program can *copy* its own instructions.
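To get a feel for how that can work at all, here’s a rough sketch of my own – written in Python, and emphatically *not* in Smith’s actual syntax – of getting loop-like behavior out of nothing but a forward-marching program counter plus an instruction that copies other instructions to positions ahead of the PC. (The conditional copy is my own simplification for the sake of the illustration; it isn’t a real Smith instruction.)

```python
# Toy illustration (NOT Smith syntax): the program counter only ever moves
# forward by one, but an instruction may copy a block of instructions to a
# spot just ahead of the PC, so a "loop" keeps re-creating its own body in
# front of itself until its condition runs out.

def run(program, max_steps=100):
    mem = {"x": 3, "out": []}           # toy data memory
    prog = list(program)                # instruction memory (mutable!)
    pc = 0
    while pc < len(prog) and max_steps > 0:
        op, *args = prog[pc]
        if op == "PRINT":               # record a register's current value
            mem["out"].append(mem[args[0]])
        elif op == "DEC":               # decrement a register
            mem[args[0]] -= 1
        elif op == "COPY_IF_NONZERO":   # if reg != 0, copy prog[src:src+n]
            reg, src, n = args          # to just past the current PC
            if mem[reg] != 0:
                prog[pc + 1:pc + 1] = prog[src:src + n]
        pc += 1                         # the ONLY way the PC ever changes
        max_steps -= 1
    return mem["out"]

# A "loop" that prints and decrements x until it hits zero: the last
# instruction re-copies the whole three-instruction body ahead of itself.
body = [("PRINT", "x"), ("DEC", "x"), ("COPY_IF_NONZERO", "x", 0, 3)]
print(run(body))   # [3, 2, 1]
```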

Continue reading

Computing Square Roots on Paper

Doing a square root on an abacus means partitioning the abacus into sections, and using them to work through the classic paper algorithm for square roots. The catch is that most people don’t even *remember* how to do square roots on paper, if they ever learned it at all. (In fact, in school, *I* didn’t learn the classical paper algorithm; we never really did roots on paper; the closest we came was using square root as an example of Newton’s method. Like so much of my basic math, I learned this from my father.)
So, for your entertainment and edification, today, I’ll describe the classical algorithm for computing square roots on paper. I’ll also show you just why it works.
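As a compact preview of where this is going, here’s the digit-by-digit paper procedure written out in Python. This is my own sketch, not code from the post; the function name and the integer-only restriction are just for illustration.

```python
# The classical pencil-and-paper square root: take the digits of the
# radicand in pairs; at each step, find the largest digit d such that
# (20 * root_so_far + d) * d still fits in the current remainder.

def paper_sqrt(n: int) -> int:
    """Integer square root of n, computed one pair of digits at a time."""
    digits = str(n)
    if len(digits) % 2:                  # pad so the digits group into pairs
        digits = "0" + digits
    root, remainder = 0, 0
    for i in range(0, len(digits), 2):
        remainder = remainder * 100 + int(digits[i:i + 2])
        d = 9
        while (20 * root + d) * d > remainder:
            d -= 1
        remainder -= (20 * root + d) * d
        root = root * 10 + d
    return root

assert paper_sqrt(152399025) == 12345
assert paper_sqrt(2) == 1                # i.e. the integer part of sqrt(2)
```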

Continue reading

Connectedness

Next stop on our tour of topology is the idea of *connectedness*. It’s an important concept that defines a lot of useful and interesting properties of topological spaces.
The basic idea of connectedness is very simple and intuitive. If you think of a topology on a metric space like ℜ³, connectedness means, quite literally, connectedness in the physical sense: a space is connected if it doesn’t consist of two or more pieces that never touch.
Being more formal, there are several equivalent definitions:
* The most common one is the definition in terms of open and closed sets. It’s precise, concise, and formal; but it doesn’t have a huge amount of intuitive value. A topological space **T** is connected if/f the only two sets in **T** that are both open *and* closed are **T** and ∅.
* The most intuitive one is the simplest set-based definition: a topological space **T** is connected if/f **T** is *not* the union of two disjoint non-empty closed sets.
* One that’s clever, in that it’s got both formality and intuition: **T** is connected if/f the only sets in **T** with empty boundaries are **T** and ∅. (There’s a concrete example of a disconnected space right after this list.)
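To make those definitions concrete, here’s a small worked example of my own (it isn’t from the original post): a subspace of the real line that fails all three tests at once.

```latex
% Take the subspace of the real line made of two separate intervals:
\[
  X = [0,1] \cup [2,3] \subset \mathbb{R}, \qquad A = [0,1], \quad B = [2,3].
\]
% In the subspace topology on X, A = X \cap (-1, 3/2) is open in X, and
% A = X \cap [0,1] is closed in X. So A is a non-empty proper subset of X
% that is both open and closed, and X is not connected. Equivalently,
% X = A \cup B is the union of two disjoint, non-empty, closed sets, and
% A has an empty boundary in X even though A is neither X nor the empty set.
```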
Closely related to the idea of connectedness is separation. A topological space is *separated* if/f it’s not connected. (Profound, huh?)
Separation becomes important when we talk about *subspaces*, because it’s much easier to define when subspaces are *separated*; and they’re connected if they’re not separated.
If A and B are subspaces of a topological space **T**, then they’re *separated in **T*** if and only if each is disjoint from the other’s closure. An important thing to understand here is that we are *not* saying that their *closures* are disjoint. We’re saying that A and B* are disjoint, and B and A* are disjoint, not that A* and B* are disjoint.
The distinction is much clearer with an example. Let’s look at the topological space ℜ². We can have *A* and *B* be *open* circles. Let’s say that *A* is the open circle centered on (-1,0) with radius one; so it’s every point whose distance from (-1,0) is *less than* 1. And let’s say that *B* is the open circle centered on (1,0), with radius 1. So the two sets are what you see in the image below. *A* is the shaded part of the green circle. The outline is the *boundary* of *A*, which is part of *A**, but not part of *A* itself. *B* is the shaded part of the red circle; the outline is the boundary. Neither *A* nor *B* includes the point (0,0). But both the *closure* of *A* and the *closure* of *B* contain (0,0). So *A* is disjoint from *B**; and *B* is disjoint from *A**. But *A** is *not* disjoint from *B**: they overlap at (0,0). *A* and *B* are separated, even though their closures overlap.
*(Figure: circles.jpg – the two tangent open circles, A in green and B in red, whose closures meet at the origin.)*
There’s also an even stronger notion of connectedness for a topological space (or for subspaces): *path*-connectedness. A space **T** is *path connected* if/f for any two points x,y ∈ **T**, there is a *continuous path* between x and y. Of course, we need to be a bit more formal than that; what’s a continuous path?
There is a continuous path from x to y in **T** if there is a continuous function *f* from the closed interval [0,1] to **T**, where *f(0)=x*, and *f(1)=y*.
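A quick example of my own (not from the post): any convex subset of ℜⁿ – in particular, ℜⁿ itself – is path connected, because the straight-line segment between two points is a continuous path.

```latex
% The straight-line path from x to y in R^n:
\[
  f : [0,1] \to \mathbb{R}^n, \qquad f(t) = (1 - t)\,x + t\,y,
  \qquad f(0) = x, \quad f(1) = y.
\]
% f is continuous (it is affine in t), so it is a continuous path from x to y.
```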

De-Debunking Evolutionary Algorithms

Just for fun, I’ve been doing a bit of poking around lately in evolutionary algorithms. It’s really fascinating to experiment, and see what pops out – the results can be really surprising.
There is one fascinating example for which, alas, I’ve lost the reference, but here’s the summary. Several groups have been looking at using evolutionary algorithm techniques for hardware design. (A good example of this is [Alexander Nicholson’s work](http://citeseer.ist.psu.edu/nicholson00evolution.html).) A year or so ago, I saw a talk given by a group which was doing some experiments with EA for hardware design. One of the most interesting outcomes of their work was that for one of the solutions that their system generated, they were *completely* unable to comprehend how it worked. There was just no logical way for it to work. It included two *disconnected* sets of components – one of which wasn’t wired in *at all*. When they tried getting rid of the disconnected set, the circuit stopped working. It turned out that the evolutionary process had discovered and exploited a previously unknown bug/behavior in the FPGA they were using. You can read about this work [here](http://www.cogs.susx.ac.uk/users/adrianth/ices96/paper.html); thanks to commenters for helping me find the link!
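If you’ve never played with an evolutionary algorithm, the basic loop is tiny. Here’s a toy sketch of my own – the target string, fitness function, and parameters are all just illustrative, and have nothing to do with the hardware-design systems mentioned above: evolve a bit string toward a target purely by random mutation plus selection.

```python
# A minimal evolutionary algorithm: random mutation + truncation selection.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def fitness(genome):
    # Number of positions that match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=50, generations=200):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for gen in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            return gen, pop[0]                    # perfect match found
        survivors = pop[: pop_size // 5]          # keep the top 20%
        pop = [mutate(random.choice(survivors)) for _ in range(pop_size)]
    pop.sort(key=fitness, reverse=True)
    return generations, pop[0]

print(evolve())   # usually finds the target within a few dozen generations
```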
Anyway, the point here isn’t to talk in detail about evolutionary algorithms; that’s a fascinating topic for another time. My goal for this evening is to show you yet another example of how creationists like to distort math in order to make bad arguments. The specific target this time around is an article by Eric Anderson called [“Bits, Bytes and Biology: What Evolutionary Algorithms (Don’t) Teach Us About Biology”](http://www.iscid.org/pcid/2005/4/2/anderson_bits_bytes_biology.php). This article looks at some of the work on evolutionary algorithms that came out of [the Avida project](http://dllab.caltech.edu/avida/), and tries to argue that Avida’s demonstration of how evolution can produce “irreducibly complex” results is invalid.

Continue reading

Topological Products Redux: Categories to the rescue!

This is going to be a short but sweet post on topology. Remember way back when I started writing about category theory? I said that the reason for doing that was because it’s such a useful tool for talking about other things. Well, today, I’m going to show you a great example of that.
Last Friday, I went through a fairly traditional approach to describing the topological product. The traditional approach isn’t *very* difficult, but it’s not particularly easy to follow either. The construction isn’t really that hard to carry out, but it’s not easy to work out just what it all really means.
There is another approach to presenting it using category theory, and to me at least, it makes it a *whole* lot easier to grasp. To make the diagrams easier to draw, I’ll adopt one shorthand: instead of writing (T,τ) for topological spaces, I’ll use a single symbol, like **X**, with the understanding that **X** represents the *pair* of the set and the topology that form the topological space.
Suppose we have a set of topological spaces, **E**₁, **E**₂, …, **E**ₙ. The product **P** = **E**₁ × **E**₂ × … × **E**ₙ is (up to isomorphism) the *only* topological space with projection functions pᵢ : **P** → **E**ᵢ, such that
for any other topological space **S**, if **S** has continuous functions fᵢ : **S** → **E**ᵢ to each of the factors of the product, then there is *exactly one* continuous function g : **S** → **P** such that the following diagram commutes:

*(Figure: product.jpg – the commutative diagram for the product, with the projections pᵢ and the unique map g.)*
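Spelled out in symbols, the commuting condition that the diagram expresses is just this:

```latex
% The universal property of the product in Top: for every space S with
% continuous maps f_i into the factors, there is exactly one continuous
% map g into the product that makes every triangle commute.
\[
  g : \mathbf{S} \to \mathbf{P} = \prod_{i=1}^{n} \mathbf{E}_i,
  \qquad
  p_i \circ g = f_i \quad \text{for every } i \in \{1, \dots, n\}.
\]
```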

That’s really just a repetition of the definition of categorical product, just made specific to the category **Top**. Everything I said in Friday’s post about what forms the open sets of the topological product space is directly implied by this categorical definition. The fact that the open sets of the product topology form the coarsest topology that keeps all of the projection functions continuous – that, too, is implied by the categorical description.
To me, this is the real beauty of category theory, and the whole reason why I spent all that time explaining it. Being able to describe structures in the language of category theory makes things much easier to understand.

Division on the Abacus

Now we’re going to try something challenging on the abacus: *division*. Like multiplication, abacus division is close to the way you’d do it on paper. But just as paper division is trickier than paper multiplication, abacus division is trickier than abacus multiplication. The technique used to do division on the abacus is a fundamental one: it’s what makes it possible to use the abacus for more advanced operations, like roots.
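The bead-level mechanics are in the full post; for reference, here’s the paper procedure that the abacus method mirrors, sketched in Python (my sketch; the function name and example numbers are just illustrative).

```python
# Pencil-and-paper long division: bring down one digit of the dividend at
# a time, find the largest quotient digit that fits, and carry the
# remainder forward. Abacus division follows the same rhythm.

def long_division(dividend: int, divisor: int):
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)   # "bring down" a digit
        q = remainder // divisor                  # largest digit that fits
        quotient_digits.append(str(q))
        remainder -= q * divisor                  # carry the rest forward
    return int("".join(quotient_digits)), remainder

print(long_division(7829, 6))   # (1304, 5), since 6 * 1304 + 5 = 7829
```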

Continue reading

Friday Random Ten, Sept 22

It’s Friday again, so in addition to a bizarre programming language, you get a random ten.
1. *Transatlantic, “Mystery Train”.* A very cool neo-prog rock track.
2. *Darol Anger and the Republic of Strings, “Dzinomwa Muna Save”.* Darol Anger is one of the most creative artists of our generation. He’s a violinist who is constantly out pushing his limits. He’s played classical, jazz, bluegrass, folk, rock, and stuff that just can’t be classified. This tune is his take on a traditional African song, performed by his latest band. Brilliant, amazing, fascinating, and beautiful.
3. *Bach, “Erkenne Mich, Mein Hueter” from the “St. Matthew Passion”*. One and one half minutes of sheer perfection. Bach is, in my opinion, the greatest composer of all time, and the St. Matthew Passion is one of his finest works.
4. *The Andy Statman Klezmer Orchestra, “Golden Wedding”.* Andy Statman is an amazing musician who comes from the same family of musicians as Darol Anger. A few years ago, he rediscovered his Jewish roots, and ended up becoming an Orthodox Jew. As part of that exploration of his roots, he started playing Klezmer. It’s frankly *shocking* to see how well he can play klezmer after such a short time.
5. *Hamster Theatre, “Litost”*. Strange, strange stuff. HT is a RIO offshoot of “Thinking Plague”. They describe themselves as “straddling the edges of folk music, avant-garde, world music, early 20th century French composers, such as Erik Satie and Maurice Ravel, contemporary composition and many other musical forms, bringing together elements of all these styles while never sounding ‘just like’ any one of them.” I’d say that’s a pretty darned good description.
6. *The Clogs, “Compass”.* Post-rock from one of the best classical-leaning post-rock ensembles. I really *love* post-rock, and there’s no one who does it better than the Clogs.
7. *Godspeed You Black Emperor, “Antennas To Heaven: Moya Sings “Baby-O” / Edgyswingsetacid / Glockenspeil / “Attention… Monami… Fa-Lala-Lala-La-La / She Dreamt She Was A Bulldozer, She Dreamt She Was In An Empty Field / Deathcamp Drone / Antennas To Heaven”*. My but that’s a whopper of a name. More post-rock, but GYBE is more on the electric side of the genre.
8. *Thinking Plague, “Marching as to War”.* Cousin to this week’s number 5. A deeply strange band, very clearly influenced by King Crimson. They’re part of the “Rock in Opposition” movement; very similar to my beloved post-rock, but with a bit more atonality.
9. *Frank Zappa, “Valley Girl”.* One of Zappa’s sillier tracks. Not one of my favorites, frankly.
10. *Phish, “Rift”*. I don’t know why so many people hate Phish. Sure, they had some pretty damned annoying fans. But they wrote and played really great music. I particularly love this album.

World’s Greatest Pathological Language: TECO

I’ve got a real treat for you pathological programming fans!
Today, we’re going to take a quick look at the world’s most *useful* pathological programming language: TECO.
TECO is one of the most influential pieces of software ever written. If, by chance, you’ve ever heard of a little editor called “emacs”, well, that was originally a set of editor macros for TECO (EMACS = Editor MACroS).
As a language, it’s both wonderful and awful. On the good side, the central concept of the language is wonderful: it’s a powerful language for processing text, which works by repeatedly finding text that matches some kind of pattern, taking some kind of action when it finds it, and then selecting the next pattern to look for. That’s a very natural, easy-to-understand way of writing programs to do text processing. On the bad side, it’s got the most god-awful hideous syntax ever imagined.
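To make that central concept concrete, here’s a sketch of the find-a-pattern, act, pick-the-next-pattern loop. It’s written in Python, *not* in TECO – TECO’s own syntax is exactly the awful part – and the rules shown are just illustrative.

```python
# An illustration (in Python, NOT TECO) of the "find a match, act on it,
# then look for the next pattern" loop that TECO is built around.
import re

def edit(text, rules):
    """Apply (pattern, action) rules repeatedly until none of them match."""
    changed = True
    while changed:
        changed = False
        for pattern, action in rules:
            match = re.search(pattern, text)
            if match:
                # Replace the matched span with whatever the action produces.
                text = text[:match.start()] + action(match) + text[match.end():]
                changed = True
                break   # go back and re-scan, starting from the first rule
    return text

rules = [
    (r"teh", lambda m: "the"),    # fix a common typo wherever it appears
    (r"  +", lambda m: " "),      # collapse runs of spaces to a single space
]
print(edit("teh  quick  brown  fox", rules))   # "the quick brown fox"
```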

Continue reading

Using the Abacus, part 2: Multiplication

Once you can add on an abacus, the next thing to learn is multiplication. Like addition, it follows pretty closely on the old pencil-and-paper method. But it’s worth taking the time to look closely and see it step by step, because it’s an important subroutine (to use a programming term) that will be useful in more complicated stuff.
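As a reminder of the pencil-and-paper layout that the abacus method shadows, here’s a quick sketch of my own (the paper procedure, not the bead movements themselves): multiply by one digit of the second factor at a time, shifting each partial product by its place value, then add them up.

```python
# Pencil-and-paper multiplication: one partial product per digit of the
# second factor, each shifted by that digit's place value, then summed.

def paper_multiply(a: int, b: int) -> int:
    total = 0
    for shift, digit in enumerate(reversed(str(b))):
        partial = a * int(digit)           # one row of the paper layout
        total += partial * 10 ** shift     # shifted by its place value
    return total

assert paper_multiply(234, 57) == 234 * 57   # 13338 = 1638 + 11700
```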

Continue reading