ISCID and the Definition of Specified Complexity

A while ago, I wrote about Dembski’s definition of specified complexity, arguing that it was a nonsensical pile of rubbish, because “specified complexity” presents itself as a combination of two distinct concepts: specification and complexity. In various places, Dembski has been fairly clear that his complexity is equivalent to Kolmogorov-Chaitin information complexity, meaning that a complex entity has *high* K-C information content; and in [my debunking of a paper where Dembski tried to define specification][debunk-spec], I argue that his definition of specification basically describes an entity with *low* K-C information content. Put those two together, and an entity with specified complexity is “an entity with simultaneously high and low K-C information content.”
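To make the two extremes concrete: the K-C complexity of a string is the length of the shortest program that produces it. That quantity is uncomputable in general, but the size of a string’s compressed form gives a computable *upper bound*, which is enough to illustrate the two poles Dembski’s definitions invoke. Here’s a minimal sketch (my own illustration, not anything from Dembski; the function name and the zlib stand-in are mine):

```python
import os
import zlib

def kc_upper_bound(s: bytes) -> int:
    """Compressed size of s: a computable upper bound on its
    Kolmogorov-Chaitin complexity (the true value is uncomputable)."""
    return len(zlib.compress(s, 9))

# A highly regular string: a tiny program ("repeat 'ab' 5000 times")
# describes it, so its K-C complexity is low -- and it compresses well.
structured = b"ab" * 5000

# Random bytes: with overwhelming probability there is no description
# shorter than the bytes themselves, so K-C complexity is high.
random_bytes = os.urandom(10000)

print(kc_upper_bound(structured))    # tiny compared to 10000
print(kc_upper_bound(random_bytes))  # roughly 10000: incompressible
```

A string lands near one pole or the other (or in between), but no string can sit at *both* extremes at once, and that’s exactly the problem with demanding high and low K-C complexity simultaneously.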
In the comments on that post, and in some rather abusive emails, some Dembski backers took me to task, alleging that I was misrepresenting the IDists’ view of specified complexity: that the definition of specification used by IDists was *not* low K-C complexity, and that therefore the definition of specified complexity was *not* self-contradictory.
Well, I was just doing a web-search to try to find some article where Dembski makes his case for a fourth law of thermodynamics, and in the search results, I came across a [very interesting discussion thread][iscid-thread] at ISCID (the “International Center for Complexity, Information, and Design”, an alleged professional society of which William Dembski is a fellow in mathematics). In this thread, Salvador Cordova is trying to make an argument that Dembski’s “Fourth Law” actually subsumes the second law. In the course of it, he attempts to define “Specified Complexity”. This thread started back in the spring of 2005, and continues to this day.
>The definition of Specified Complexity you gave is closer to Irreducible Complexity.
>Specified Complexity has this thing that is called “Probabilistic Complexity” which means simply
>that it’s improbable.
>These defintions are understandably confusing at first, but surmountable.
>We have many complexities involved, and seriously each one should be explored, but I’ll have to go
>into the details later:
>* Probabilistic Complexity
>* Specificational Complexity
>* Specified Complexity
>* Irreducible Complexity
>* Kolmogorov Complexity
>All of these are in Dembski’s book, and should be treated with care, lest one becomes totally
>confused. The diagram addresses 3 of the 5 complexities listed above explicitly.
>Probabilistic and Specificational Complexity require a separate discussion.
>Specified complexity has these features (per Design Revolution, page 84)
>1. Low Specificational Complexity
>2. High Probabilistic Complexity
Sal attempts to wiggle around it, but by his own words, an entity with low specificational complexity is one whose description has *low* K-C complexity. “Probabilistic complexity” is, as far as I know, Sal’s addition. In Dembski’s writings, he’s been fairly clear that complexity is complexity in the information-theory sense – see the Dembski paper linked above, which quotes him explaining why “probabilistic complexity” isn’t sufficient and introducing K-C complexity to fix it.
So – one of Dembski’s associates, in an extensive message thread quoting Dembski’s books, says that specification is *low* K-C complexity; and tries to wiggle around the fact that the complexity part of “specified complexity”, when examined in terms of information theory, means that the K-C complexity is high. Ergo, specified complexity as described by Dembski *is* the property of simultaneously having both high and low K-C complexity.
Sad, isn’t it?

13 thoughts on “ISCID and the Definition of Specified Complexity”

  1. Noodle

    You say:
    “Sal attempts to wiggle around, but low specificational complexity is, in his own words, means that an entity that has low specification complexity is one whose description has low K-C complexity”
    Ok, one of the bolded words has to go or the sentence needs to be re-written please. Otherwise, keep up the good work. Math has never been my strong point so I appreciate the various takedowns of bad math. Not to mention that, as a programmer, I really love the pathological languages.

  2. steve s

    ‘Course, Dembski has an easy out here.
    “Salvador Cordova is a complete retard. Have you read any of his posts? He’s certifiable. I should never be held to anything he has said, or will say in the future.”

  3. Corkscrew

    I wouldn’t say he’s a retard, but AFAICT he doesn’t have any grasp of advanced mathematics beyond the absolute basics of information theory (and possibly some applied stuff, I wouldn’t know). I remember having a long, fairly amicable, discussion with him at one point in which he took about a dozen posts to define Shannon entropy.
    “Yeah, I took that course last week. Your point is?”

  4. Joe Shelby

    Ergo, specified complexity as described by Dembski, is the property of simultaneously having both high and low K-C complexity.
    Hence, proof of an Intelligent Designer – only some being with far more intelligence than us mere humans could have created a mathematical contradiction that obviously still exists before our eyes, right?
    Pardon me while I go back to drinking my no tea.

  5. Blake Stacey

    Ah, the ISCID — such a paragon of scholarship. This post reminds me of one of the sillier abuses of mathematical jargon I’ve stumbled across in my Net voyages, the “Cognitive-Theoretic Model of the Universe” by Christopher Michael Langan. (You might remember him: he’s that guy from TV. You know, the bar bouncer with the genius IQ, like the real-life version of Good Will Hunting? Yeah, that guy.) He contributed an essay on “causality and teleologic evolution” to Uncommon Dissent: Intellectuals who find Darwinism Unconvincing, and the “CTMU” was published in — wait for it — Progress in Complexity, Information and Design (September 2002), the world-renowned journal of ISCID itself.
    Insofar as mere mortals can comprehend, the CTMU resembles poorly digested Platonism; by beginning with “logical tautologies” — or what he calls tautologies — Langan proceeds to derive the whole structure of the Cosmos, subsuming Hubble and Goedel alike in a mishmash of pseudophysics and pseudophilosophy. It’s really a barrel of laughs, and a fine illustration of MarkCC’s maxim that the worst math is no math at all.
    Proving, if we ever needed more proof, that IQ is as IQ does.

  6. Daniel

    I wouldn’t say he’s a retard, but AFAICT he doesn’t have any grasp of advanced mathematics beyond the absolute basics of information theory

    No, no more of a retard than any other young earth creationist, which he has admitted to being, nor are flat-earthers retarded either. He’s just very, very assertive on topics that he’s quite ignorant about.

  7. Coin

    You’d think we could just ask Dembski to clarify. He seems to spend an enormous amount of time lately posting on his blog. But somehow he doesn’t seem terribly interested in answering relatively simple questions about what the words he uses even mean…

  8. Torbjörn Larsson

    This is the one argument of Mark’s that I don’t find entirely convincing, since “simultaneously high and low K-C information content” could mean searching for an intermediate optimum.
    That is also similar to how one would expect such complexities to behave. For instance, neural complexity is such a measure. It reminds me of glassy states, with order on all distances, situated between ordered crystals and unordered amorphous states.
    At this point I would need some more convincing, besides the obvious fact that Sal and Dembski don’t know what they are saying, and try to weasel around by constantly changing definitions. Perhaps it could work. (That isn’t enough, though: there are subtractively irreducibly complex systems. It’s just that evolution has no problem making them.)
    BTW, Sal is up to the usual quotemining. The collection “The Mind’s I” is a more than 20 year old collection of essays “divided into six sections, each focusing on a particular aspect of the problem of self”. The quotation from Morowitz was probably tendentious already then, mentioning the old anthropomorphic quantum-mechanical observer. I think it is fair to say that the work on decoherence has made it possible to see any interacting classical object as the observer.
    That eliminates the quantum woo interpretations that were expressed by the quotation from Wigner, which supposedly was due to his late interest in Vedanta philosophy.
    (I don’t like Wigner’s philosophies and how they are treated. The essay “The Unreasonable Effectiveness of Mathematics in the Natural Sciences”, which is provocative, with good questions but few answers besides a similar appeal to consciousness as a separate phenomenon, is used, by Platonists especially, without regard for the fact that it doesn’t treat counterarguments.)

  9. Mark Chu-Carroll

    The problem with specified complexity is that Dembski claims that you can precisely identify certain designed entities by the fact that they have this property of specified complexity. Further, the property of “specified complexity” is actually two properties which can be recognized *separately*: complexity and specification. Finally, IDists assert that *only* things that are designed will possess these two properties simultaneously.
    And that’s the problem: the requirement that these two properties be *separately identifiable*. Specified complexity doesn’t say “K-C complexity at a local maximum/minimum”. It says “complexity = high K-C complexity” – that is, an entity/string has the complexity property of SC if it has high K-C complexity. It also says “specification = low K-C complexity” – an entity/string has the specification property if it can be precisely described using very little information. So it’s saying that something with specified complexity is something which has lots of incompressible information (high K-C complexity), but which doesn’t have much information (low K-C complexity).

  10. Torbjörn Larsson

    Ah, now I think I see your point! Thanks!
    Clearly I have conceptual difficulties with this. This made me disregard the requirement of separately identifiable.
    Perhaps if I make an analogy: An object may be hot (high temperature) or cold (low temperature). It can also be both hot and cold at the same time (medium temperature). That should not be a problem.
    But an IDist is trying to say: “I can clearly see that it is hot. And I can clearly see that it is cold…” Because those are the only alternatives they claim they can identify.
    At least I think that is the point.
    Maybe they will come up with a way to measure ‘temperature’, but I wouldn’t hold my breath. And even if they did, it should be like IC: if it is a real biological property, evolution works and it will have produced it. So I have no worries there.

  11. PiGuy

    This is awesome! I followed the original post back in the summer (although I apparently stopped before the hecklers-who-aren’t-mathematicians-but-know-Dembski’s-math-is-correct showed up) and had some difficulty but I now get exactly what you’re getting at.
    Armed with my new understanding of IT I plan to spend the rest of the afternoon reading scholarly math articles on Claude Shannon et al. Hey – what’s federal funding for?

  12. Coin

    This is the one argument of Mark that I don’t find entirely convincing, since “simultaneously high and low K-C information content” could mean searching an intermediate optimum.

    But even if this were what the IDers thought they were trying to say, though, would it make any sense? CSI is frequently spoken of as being not just a property which is present or not present, it’s treated as something which is a measurable quantity. IDers constantly make reference to there being a certain number of “bits” of CSI, and not only does Dembski offer what he seems to think is a formula for calculating the quantity of CSI in something, but his imaginary “fourth law” refers to increases and decreases in this quantity.
    If CSI actually is a measurement of high and low K-complexity at once– and MarkCC does seem to be making a pretty good case that it is, whether intentionally or not– then does it make any sense at all to talk about how many “bits” of CSI we have? Meanwhile if this “both compressible and uncompressible” definition does accurately describe CSI yet CSI is still a measurable quantity at all, then doesn’t this imply there actually would be some specific optimal K-complexity at which CSI as defined by Dembski finds a maximum? I’m unconvinced that this is a desirable or remotely sane quality for a mathematical measurement of “designed-ness” (as Dembski’s CSI purports to be) to have; but nevertheless, I’d be quite curious to plug in Dembski’s CSI-finding equation and find out what this optimal K-complexity would be. (After all, if we did this, we would be able to calculate the exact identity of The Most Designed Object In The Universe. I predict that this will turn out to be a copy of the source code to VMS.)
    Unfortunately as far as I’m aware there’s only one practical example ever given by Dembski of actually applying his CSI-measuring equation, and that one example (his calculation of the CSI in a flagellum) seems to incorporate several extremely subjective and non-mathematical concepts (like, what is the information content of the English word “propeller”), so how to actually use Dembski’s equation seems to be something of a mystery…

  13. Torbjörn Larsson

    I’m also at a loss how ID’s different ideas connect.
    For example, IC doesn’t imply ID, since natural systems achieve it. In fact, there is a recent Panda’s Thumb article revealing that a similar concept was thought up by a biologist in the early 20th century and introduced as “interlocking complexity” in 1939, with forerunners back to the mid-1600s.
    Interlocking complexity follows from predictions about how evolution behaves, and is used to suggest such things as that individuals must be quite complex (DNA & cellular machinery) even as embryos. Quite the opposite of what its relative, IC, was meant to do.
    And IC is falsifiable (and has been proven false), which is something ID otherwise tries to avoid. It also has no connection to other ideas of ID that I know of, except perhaps as imagined by Dembski in connection with his CSI-growth fabulations. IC must have been an early attempt.
    Neural complexity by Tononi et al that I mentioned above may or may not be measurable in situ – but they have used it in neural-net models to show that it suggestively controls how the network behaves. How and if they will proceed I haven’t checked.
    But your description shows that Dembski et al aren’t up to such simple tasks. How amazing…
