# Second Law Silliness from Sewell

So, via Panda’s Thumb, I hear that Granville Sewell is up to his old hijinks. Sewell is a classic creationist crackpot, who’s known for two things.

First, he’s known for chronically recycling the old “second law of thermodynamics” garbage. And second, he’s known for building arguments based on “thought experiments” – where instead of doing experiments, he just makes up the experiments and the results.

The second-law crankery is really annoying. It’s one of the oldest creationist pseudo-scientific schticks around, and it’s such a terrible argument. It’s also a sort-of pet peeve of mine, because I hate the way that people generally respond to it. It’s not that the common response is wrong – but rather that the common responses focus on one error, while neglecting to point out that there are many deeper issues with it.

In case you’ve been hiding under a rock, the creationist argument is basically:

1. The second law of thermodynamics says that disorder always increases.
2. Evolution produces highly-ordered complexity via a natural process.
3. Therefore, evolution must be impossible, because you can’t create order.

The first problem with this argument is very simple. The second law of thermodynamics does not say that disorder always increases. It’s a classic example of my old maxim: the worst math is no math. The second law of thermodynamics doesn’t say anything as fuzzy as “you can’t create order”. It’s a precise, mathematical statement. The second law of thermodynamics says that in a closed system:

$\Delta S \geq \int \frac{\delta Q}{T}$

where:

1. $S$ is the entropy in a system,
2. $Q$ is the amount of heat transferred in an interaction, and
3. $T$ is the temperature of the system.

Translated into English, that basically says that in any interaction that involves the transfer of heat, the entropy of the system cannot possibly be reduced. Other ways of saying it include “There is no possible process whose sole result is the transfer of heat from a cooler body to a warmer one”; or “No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work.”
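To see how the inequality plays out, here's a tiny numeric sketch in Python. The numbers are completely arbitrary, purely for illustration:

```python
# Entropy bookkeeping for a simple heat transfer.
# Q joules of heat flow from a hot reservoir (T_hot) to a cold one (T_cold).
Q = 1000.0       # joules transferred (arbitrary illustrative number)
T_hot = 400.0    # kelvin
T_cold = 300.0   # kelvin

# The hot reservoir loses the heat; the cold one gains it.
dS_hot = -Q / T_hot    # -2.5 J/K
dS_cold = +Q / T_cold  # about +3.33 J/K

dS_total = dS_hot + dS_cold
print(dS_total)  # about +0.833 J/K -- positive, as the second law requires
```

Run the same transfer in reverse (cold to hot) and the total flips sign: it would be negative, which is exactly what the second law forbids.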

Note well – there is no mention of “chaos” or “disorder” in these statements: The second law is a statement about the way that energy can be used. It basically says that when
you try to use energy, some of that energy is inevitably lost in the process of using it.

Talking about “chaos”, “order”, “disorder” – those are all metaphors. Entropy is a difficult concept. It doesn’t really have a particularly good intuitive meaning. It means something like “energy lost into forms that can’t be used to do work” – but that’s still a poor attempt to capture it in metaphor. The reason that people use order and disorder comes from a way of thinking about energy: if I can extract energy from burning gasoline to spin the wheels of my car, the process of spinning my wheels is very organized – it’s something that I can see as a structured application of energy – or, stretching the metaphor a bit, the energy that spins the wheels is structured. On the other hand, the “waste” from burning the gas – the heating of the engine parts, the energy caught in the warmth of the exhaust – that’s just random and useless. It’s “chaotic”.
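That waste is quantifiable. Even a perfect, frictionless engine can only convert a fraction of the heat it takes in into work – the Carnot limit. A quick sketch, with made-up ballpark temperatures (not measurements of any real engine):

```python
# Carnot efficiency: the best *any* heat engine can possibly do,
# operating between a hot source at T_hot and a cold sink at T_cold (kelvin).
def carnot_efficiency(T_hot, T_cold):
    return 1.0 - T_cold / T_hot

# Rough ballpark for a gasoline engine: combustion gases around 1000 K,
# exhaust/ambient around 300 K (illustrative numbers only).
eta = carnot_efficiency(1000.0, 300.0)
print(eta)  # 0.7 -- and real engines fall far short of even this ideal
```

The point of the formula is that the lost 30% isn’t an engineering failure; it’s a hard thermodynamic floor on waste.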

So when a creationist says that the second law of thermodynamics says you can’t create order, they’re full of shit. The second law doesn’t say that – not in any shape or form. You don’t need to get into the whole “open system/closed system” stuff to dispute it; it simply doesn’t say what they claim it says.

But let’s not stop there. Even if you accept that the mathematical statement of the second law really did say that chaos always increases, that still has nothing to do with evolution. Look back at the equation: what it says is that in a closed system, in any interaction, the total entropy can never decrease. Even if you accept that entropy means chaos, that is still all it says.

It doesn’t say that you can’t create order. It says that the cumulative end result of any interaction must increase entropy. Want to build a house? Of course you can do it without violating the second law. But to build that house, you need to cut down trees, dig holes, lay foundations, cut wood, pour concrete, put things together. All of those things use a lot of energy. And in each minute interaction, you’re expending energy in ways that increase entropy. If the creationist interpretation of the second law were true, you couldn’t build a house, because building a house involves creating something structured – creating order.
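The same accounting applies to any machine that creates local order. A refrigerator makes its interior colder and more ordered, but only by dumping even more heat into the kitchen. A sketch with illustrative numbers:

```python
# A refrigerator pumps Q_c joules of heat out of its cold interior,
# doing W joules of compressor work, and dumps Q_c + W into the warm room.
Q_c = 100.0    # heat removed from the interior (illustrative)
W = 20.0       # work done by the compressor (illustrative)
T_in = 275.0   # interior temperature, kelvin
T_room = 300.0 # room temperature, kelvin

dS_inside = -Q_c / T_in        # local entropy DECREASE: order created
dS_room = (Q_c + W) / T_room   # entropy dumped into the surroundings

print(dS_inside)               # about -0.364 J/K
print(dS_inside + dS_room)     # about +0.036 J/K -- the total still goes up
```

Order was created inside the box, and the second law is perfectly happy, because the books balance: the entropy exported exceeds the entropy removed.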

Similarly, if you look at a living cell, it does a whole lot of highly ordered, highly structured things. In order to do those things, it uses energy. And in the process of using that energy, it creates entropy. In terms of order and chaos, the cell uses energy to create order, but in the process of doing so it creates wastes – waste heat, and waste chemicals. It converts high-energy structured molecules into lower-energy molecules, converting things with energetic structure to things without. Look at all of the waste that’s produced by a living cell, and you’ll find that it does produce a net increase in entropy. Once again, if the creationists were right, then you wouldn’t need to worry about whether evolution was possible under thermodynamics – because life wouldn’t be possible.

In fact, if the creationists were right, the existence of planets, stars, and galaxies wouldn’t be possible – because a galaxy full of stars with planets is far less chaotic than a loose cloud of hydrogen.

Once again, we don’t even need to consider the whole closed system/open system distinction, because even if we treat earth as a closed system, their arguments are wrong. Life doesn’t really defy the laws of thermodynamics – it produces entropy exactly as it should.

But the creationist second-law argument is even worse than that.

The second-law argument is that because DNA “encodes information”, and because the amount of information “encoded” in DNA increases as a result of the evolutionary process, evolution must violate the second law.

This absolutely doesn’t require bringing in any open/closed system discussions. Doing that is just a distraction which allows the creationist to sneak their real argument underneath.

The real point is: DNA is a highly structured molecule. No disagreement there. But so what? In the life of an organism, there are virtually un-countable numbers of energetic interactions, all of which result in a net increase in the amount of entropy. Why on earth would adding a bunch of links to a DNA chain completely outweigh those? In fact, changing the DNA of an organism is just another entropy increasing event. The chemical processes in the cell that create DNA strands consume energy, and use that energy to produce molecules like DNA, producing entropy along the way, just like pretty much every other chemical process in the universe.
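For a sense of scale: assembling DNA is powered by hydrolyzing molecules like ATP, and every one of those reactions dumps heat into the cell’s surroundings. Using the standard textbook figure of roughly 30.5 kJ per mole of ATP (a ballpark under standard conditions, not a measurement of any particular cell):

```python
# Rough entropy dumped into the surroundings per mole of ATP hydrolyzed,
# treating the released energy as heat at body temperature.
# 30.5 kJ/mol is the standard textbook free energy of ATP hydrolysis;
# real cellular conditions vary, so this is only a ballpark.
Q_per_mol = 30500.0  # joules per mole
T_body = 310.0       # kelvin (about 37 C)

dS_surroundings = Q_per_mol / T_body
print(dS_surroundings)  # roughly 98 J/(K*mol) of entropy, per mole of ATP
```

Every DNA strand a cell builds rides on an enormous number of these entropy-producing reactions; the “order” in the polymer is paid for many times over.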

The creationist argument relies on a bunch of sloppy handwaves: “entropy” is disorder; “you can’t create order”, “DNA is ordered”. In fact, evolution has no problem with respect to entropy: one way of viewing evolution is that it’s a process of creating ever more effective entropy-generators.

Now we can get to Sewell and his arguments, and you can see how perfectly they match what I’ve been talking about.

Imagine a high school science teacher renting a video showing a tornado sweeping through a town, turning houses and cars into rubble. When she attempts to show it to her students, she accidentally runs the video backward. As Ford predicts, the students laugh and say, the video is going backwards! The teacher doesn’t want to admit her mistake, so she says: “No, the video is not really going backward. It only looks like it is because it appears that the second law is being violated. And of course entropy is decreasing in this video, but tornados derive their power from the sun, and the increase in entropy on the sun is far greater than the decrease seen on this video, so there is no conflict with the second law.” “In fact,” the teacher continues, “meteorologists can explain everything that is happening in this video,” and she proceeds to give some long, detailed, hastily improvised scientific theories on how tornados, under the right conditions, really can construct houses and cars. At the end of the explanation, one student says, “I don’t want to argue with scientists, but wouldn’t it be a lot easier to explain if you ran the video the other way?”

Now imagine a professor describing the final project for students in his evolutionary biology class. “Here are two pictures,” he says.

“One is a drawing of what the Earth must have looked like soon after it formed. The other is a picture of New York City today, with tall buildings full of intelligent humans, computers, TV sets and telephones, with libraries full of science texts and novels, and jet airplanes flying overhead. Your assignment is to explain how we got from picture one to picture two, and why this did not violate the second law of thermodynamics. You should explain that 3 or 4 billion years ago a collection of atoms formed by pure chance that was able to duplicate itself, and these complex collections of atoms were able to pass their complex structures on to their descendants generation after generation, even correcting errors. Explain how, over a very long time, the accumulation of genetic accidents resulted in greater and greater information content in the DNA of these more and more complicated collections of atoms, and how eventually something called “intelligence” allowed some of these collections of atoms to design buildings and computers and TV sets, and write encyclopedias and science texts. But be sure to point out that while none of this would have been possible in an isolated system, the Earth is an open system, and entropy can decrease in an open system as long as the decreases are compensated by increases outside the system. Energy from the sun is what made all of this possible, and while the origin and evolution of life may have resulted in some small decrease in entropy here, the increase in entropy on the sun easily compensates this tiny decrease. The sun should play a central role in your essay.”

When one student turns in his essay some days later, he has written,

“A few years after picture one was taken, the sun exploded into a supernova, all humans and other animals died, their bodies decayed, and their cells decomposed into simple organic and inorganic compounds. Most of the buildings collapsed immediately into rubble, those that didn’t, crumbled eventually. Most of the computers and TV sets inside were smashed into scrap metal, even those that weren’t, gradually turned into piles of rust, most of the books in the libraries burned up, the rest rotted over time, and you can see the result in picture two.”

The professor says, “You have switched the pictures!” “I know,” says the student. “But it was so much easier to explain that way.”

Evolution is a movie running backward, that is what makes it so different from other phenomena in our universe, and why it demands a very different sort of explanation.

This is a perfect example of both of Sewell’s usual techniques.

First, the essential argument here is rubbish. It’s the usual “second-law means that you can’t create order”, even though that’s not what it says, followed by a rather shallow and pointless response to the open/closed system stuff.

And the second part is what makes Sewell Sewell. He can’t actually make his own arguments. No, that’s much too hard. So he creates fake people, and plays out a story using his fake people and having them make fake arguments, and then uses the people in his story to illustrate his argument. It’s a technique that I haven’t seen used so consistently since I read Ayn Rand in high school.

## 55 thoughts on “Second Law Silliness from Sewell”

1. Mike Haynes

Great post! I have never been challenged with this particular flavor of nonsense before.

I usually get the argument from the first law, “Energy can be changed from one form to another, but it cannot be created or destroyed.” from people trying to use it to “prove” that consciousness continues in a different form of “energy” after death.

Arguing the correct definition of energy and the application of the first law usually gets me accused of being a “know-it-all” who thinks he has all the answers. sigh…

2. Tim

Mark, (Mike,)

Mark, I’m up to where you said, “But the creationist second-law argument is even worse than that.” I’m gonna start my reply now, finish reading after I had said what needs saying regarding the previous. I *mostly* agree with you so far.

you said, “Note well – there is no mention of “chaos” or “disorder” in these statements: The second law is a statement about the way that energy can be used. It basically says that when you try to use energy, some of that energy is inevitably lost in the process of using it.”

RE: first sentence, pre-colon: yes, that “disorder” word has been a terrible bugaboo.

RE: first sentence, colon-aft: you too are drifting to bugaboo. The second law was derived from consideration of “the way that energy can be used”, and is descriptive of a hard constraint regarding “energy use”, but it is NOT “a statement about THE WAY that energy can be used.”

RE: sentence two: your handwaving – if I may – crashes when you say “energy is … lost”! The thermodynamics of Gibbs, as Mike points out, also has the first law, about the conservation of energy. Whether in the form of heat, or as entropy.

a bit of a backtrack & aside, when you offered two example statements of the second law:

” “There is no possible process whose sole result is the transfer of heat from a cooler body to a warmer one”; or “No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work.” ”

I wonder… this sounds like the kind of handwavy “bullshit” you rail against elsewhere, and I wonder if you would have been able to see the math if you had not 150 years worth of it to look at! I mean, Mark, when Gibbs formulated the second law, the great range of its applicability, and the hard math it suggests, simply would not have been fathomed by just anyone. anyway, …

you say, “Entropy is a difficult concept. It doesn’t really have a particularly good intuitive meaning. ”

A long time after Gibbs, the statistical mechanical derivation provided – precisely – “a particularly good intuitive meaning”!:

Entropy is (proportional to) the natural logarithm of the number of ways of arranging the system! (Do you see?)

you said, “Once again, if the creationists were right, then you wouldn’t need to worry about whether evolution was possible under thermodynamics – because life wouldn’t be possible.”

this was nice. But take heed!

you went on, “Once again, we don’t even need to consider the whole closed system/open system distinction,”

I sense that you are not being properly precise vis-a-vis “open” and “closed”. The point I have been working up to, – in this first part, – is to make you admit that any consideration of the second law involves defining a “system”. It is defined in reference to its “surroundings”. And these two are presumed to constitute a “universe”. Mark, we are talking about the TRANSFER of heat, right? … Then we get back to my – very important – fine point about “the way energy can be used”, and my BIG point about the fact that science has never been able to advance thought one about this “the way”, but only to DESCRIBE – ever more precisely – a hard boundary of REAL that is lurking – but somewhere – deep down. Science keeps having success at saying REAL cannot be (too far) outside the limits they have seen, and the models they have explained, but the hope of success is still just hope.

To your examples… like the cell. All you have is description. But the process of animation… ? It is not questioned. Or at least all questionings end in a simple faith/e: I know it DOES work – somehow. (on the CTMU thread John likes to rail against “creationists” for falling into “God of the Gaps”. But so does he. And Shadonis says “evolution works naturally”. Great: what is meant by “naturally”: handwavy bullshit is the fundament of that side [I do not say “too”].)

Mark, that you point out that the “creationists” fail, that they fall prey to materialism (that they believe in “the ‘universe’ “), is fine – even very nice! You point out that they fail at the very beginning!!! That they would deny “life”, from the foundation. But your failure is little better! Merely, you want to see a final – highly formal – picture of it. Of failure.

The “cell” cannot “work” if it has no “surroundings” to relieve itself of ITS waste. Langan goes after the dual. Very impressively. But he, too, fails to find a proper – REAL – bounding.

The bounding is infinite!

“I am” is the fundament, the infinite bounder, and the perspective/pedestal of any&all finite i’deal RELATIONS!

“I am” is conserved.

THEE i’dea is conserved!

IT is a source of infinite potential energy! And all of science cries out that that potential is REAL. That there ARE real restrictions as to *how* REAL … remains REAL. Yet can evolve!!! Co-operatively!!!

But the boundaries of the operatives is not to be found IN any such “universe”! One simply cannot point to an INTERNAL “system” and say that THAT is an “I am”.

Again, that entropy is never decreasing is evidence … for a lot! That time cannot go backwards (choices cannot be erased). That we are evolving: all this “waste” we create is not created for naught! That the “waste” we create is actually useful – from “above”!!! And that there is an infinite potential in that reserve!!!!!

And, that it is precisely that potential from “above” that is tapped by an “I am” in WILLING i’deal change (via finite restriction within ones OWN infinite “surroundings”). It is the will of individual “I am” that makes the clock tic!!! (There is an ultimate clock rate! You wanted a testable prediction, right?) But, of course, to tap that well of energy one must conform to (i’deal) LAW. “Machinery” simply has NO access to draw from that “well”!

Moving on:

you say, “In the life of an organism, there are virtually un-countable numbers of energetic interactions, all of which result in a net increase in the amount of entropy.”

But Mark, an “organism” is not alive! It is just the “machinery” for providing (an ever better) perspective. A perspective for that which is “life”, thee i’dea! “I am” animate the “machinery”!!!!! (I really do love the highly developed “machinery”, Mark, but I don’t *overvalue* IT.)

you said, “one way of viewing evolution is that it’s a process of creating ever more effective entropy-generators.”

I think this is a gem! Meditate on it, maybe. What is the nature of the first/ultimate entropy-generator!? 🙂

Tim

1. MarkCC Post author

Hey Tim:

1. Tim

Mark,

I’d kinda like to leave both! (And I think that I have gotten myself out of the prior, at least for the time being.) Anyway, I thought that you might be able to see how stupid the idea that a “system” is a real thing is with this one little thing.

I’ll take my leave then,
thanks again for hosting,
Tim

1. Tim

anyone,

For Mark’s sake, if anyone becomes interested in what I’ve said, try to get my attention in the annoying CTMU thread.

Tim

[[[[” “There is no possible process whose sole result is the transfer of heat from a cooler body to a warmer one”; or “No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work.” ”

I wonder… this sounds like the kind of handwavy “bullshit” you rail against elsewhere, and I wonder if you would have been able to see the math if you had not 150 years worth of it to look at!]]]]

Dude, Tim, you are full of shit, lmao.

It’s not handwavy bullshit. It’s physics. You can’t have a 100%-efficient engine. We can’t have a cold reservoir at 0 Kelvin.

1. Tim

oh my God, my whole point was precisely to show you that that which looks like handwavy bullshit, isn’t necessarily the math-free handwavy bullshit you think it is. If you had been around in 1850 and Gibbs had come up to you and said “no process is possible whose sole result is the transfer of heat from a cooler to a warmer region”, the likes of the crank-finders from back then would have been all a twitter with saying that’s handwavy bullshit. I said that so that maybe you would see that not all handwavy bullshit is handwavy bullshit. It doesn’t look like handwavy bullshit now because you have already seen how far that handwavyness actually can go. But the depths of my “handwavyness”, perhaps in 150 years.

again, I was merely affirming that it was NOT terrible handwaving. But, if you think it totally free of handwavy nature, perhaps you will tell us what temperature is?!

jackass,
Tim

1. Tim

the wiki page starts, “Temperature is a physical property of matter…”

precise enough for a handwave junkie, but not for me 🙂

also, Mark’s post says “energy is lost”, kinda contradicting the first law, as I said.

Tim

Okay, then go take a physics/chemistry class so you can better understand how temperature is measured and defined then.

2. Tim

I have already told you that I am an “expert” in this field.

The big question is not to stick a thermometer into a MACROSCOPIC “system” (and the science of thermodynamics is, as I’ve said before, limited to macroscopic & EQUILIBRIUM!!! systems), but to understand energy (temperature) fundamentally. From first principle. That is what I have done for you with my metaphysic! But…

Tim

2. Tim

“When a body is not in a steady state, then the notion of temperature becomes even less safe than for a body in a steady state not in thermodynamic equilibrium. This is also a matter for study in non-equilibrium thermodynamics.”

Tim

2. Tim

(real quick, Mark, thanks)

you said, “you can’t have a 100%-efficient engine. We can’t have a cold reservoir at 0 Kelvin.”

right. These I affirmed with every word I said. Any real relation creates waste, thus no 100% efficiency (and thus the need for any real system to have ITS surroundings). For the impossibility of absolute zero, I will let you hang (as if FROM your cross {tree}) until you offer me something towards “temperature”? Too much handwavyness implicit in there; you haven’t met your standards of precision.

(response should go where, Mark?)
Tim

I don’t think you understand what it means to handwave something away. Talking about “temperature” isn’t handwavy. It’s a well-understood and well-defined concept.

1. Tim

first, why do you talk only of handwaving “away”? This is the problem, you handwave something IN, but that’s cool, right?

Temperature is far from being “well-understood”! It only seems that way (TO YOU) because that’s where we get our hooks into the open question of Energy (to be sure, the natural units of temperature is energy, and the natural units of entropy is “dimensionless”), and because you aren’t honest about your limitations.

Tim

1. Tim

S is (proportional to) the natural logarithm of the number of ways of arranging “the system” (and the “universe”)

The number of ways is a dimensionless NUMBER.

The natural log of that dimensionless number is also dimensionless.

Historically, temperature (of a macroscopic and equilibrium “system”) was known well before entropy. It was measured in units of “degree”. So, for a certain continuity, temperature is kept in units of degree, and entropy is multiplied by a proportionality constant (Boltzmann’s), which has the units of energy per degree.

“tik-tok on the clock but the party don’t stop” – Ke\$ha

Tim

2. MarkCC Post author

I don’t give a damn – just not here.

I’ll sum up the problem with your whole argument – and it’s exactly why you belong over in your bullshit metaphysics thread.

You insist on playing with words, and then exploiting the inherent ambiguity in them. The problem isn’t that the concepts are ill defined – but that you’re taking vague non-mathematical definitions of precise mathematical concepts, and then pretending that the problems with the imprecise definitions are the problem of the defined quantity.

What’s the temperature of an object? It’s a statistical measure of the kinetic energy in that object. It can be defined incredibly precisely – but that precise definition is a mathematical description of the statistical measure.

Similarly, you’re playing around with the intuitive description of entropy. It’s damned hard to come up with a valid, intuitive, precise description of entropy. But it’s really, really easy to state it mathematically – as I did in the post.

Handwaving is when you refuse to actually do things like define your terms precisely in any way. That’s not a problem with the science of thermodynamics. The definitions may not be easy; they may not be intuitive; but they’re very precise.

4. david

When I hear the “second law” canard, I ask them how snowflakes (highly structured, complex forms) are created from disorganized and unstructured water vapor in a cloud, and watch them squirm.

1. idlemind

That’s easy, David. God forms each and every one with his bare hands.

(Strip away all of the pseudo-scientific BS, and that’s about the level of Sewell’s argument.)

5. Mary

Um, I couldn’t make sense of the comments above, so maybe this point was already made, but there are other (mathematical) definitions of entropy besides the classical thermodynamics definition quoted in this post. Specifically, statistical mechanics defines it as S = kB * ln(Omega) — the logarithm of the number of “microstates” that correspond to a given “macrostate” — where a macrostate is defined usually by classical thermodynamic variables like pressure, temperature, and density, and a microstate is defined by the position and momentum of all of the different particles. There are a lot of a different ways to arrange the molecules of gas in a room that lead to the same distribution of pressure and temperature. The more there are that correspond to a given distribution, the more likely that distribution will be the equilibrium state of the room.

Likewise in information theory, which I’m sure you, Mark, know more about than I do, there is a related definition of the entropy of a string that has to do with the logarithm of the probability of creating that string by selecting symbols from your symbol set at random. It defines the “information content” of a string in a sense that has nothing to do with meaning. Static on the radio would have high “information entropy”, but a single tone would have low entropy, though neither has any meaning. The static *could* encode a message, whereas the single tone can’t encode anything.

Anyway, I think this Sewell fellow is conflating the three definitions. The second law doesn’t apply to the information theory type of entropy, which is really a different mathematical quantity that just has the same name because it’s analogous.

The connection between entropy and disorder comes from the statistical mechanics definition. Usually people make the connection by pointing out that there are a lot of ways for a room to be “messy” — the pile of laundry can be here or there, but it’s still a mess — and fewer ways for it to be neat. A lot of ways for leaves to be scattered across the lawn (you could switch the position of this leaf and that and they would still be “scattered”) etc. So “disordered” macrostates tend to have higher entropy (more “microstates” that are equivalent in terms of the macro description) than “ordered” macrostates.

Disorder in this sense is created by equilibrium processes. Maybe the leaves start out densely piled against the fence, but all things being equal, any changes from that state are likely to be in the direction of “scattered across the lawn” and eventually that’s how they are likely to wind up.

Order is created when this symmetry is broken. If there is a wind in one particular direction, a pile *can* spontaneously create itself against the fence, even if the leaves started out scattered. The wind is the result of a pressure differential. Likewise, if there is a force that is always attractive and never repulsive, celestial dust can compose itself into planets. The planets form due to differentials in gravitational potential.

The sun, the moon and tides, the mountains and oceans, the cycle of day and night, the water cycles, these create lots of pressure and temperature and potential energy gradients, lots of broken symmetries that drive the creation of ordered states on earth.

But still the equilibrium processes are working to erase those gradients. Mountains erode. Winds blow and equalize pressure. Tidal friction is slowing the Earth’s rotation. Eventually there will be no more gradients, because everything will be at equilibrium, and no more “order.”

In the meantime, though, ask this Sewell guy whether tornados can create leaf piles, and have him think about why that isn’t a violation of the second law of thermodynamics.

Okay, this comment got long. I guess this is the stuff I thought this post was going to go over when I saw the title. I think you could have probably explained these ideas better…

1. eric

Mary: The connection between entropy and disorder comes from the statistical mechanics definition.

No, it’s still a mistake. The Boltzmann form of entropy is talking about available energy microstates, not spatial position microstates. Creationists’ linking of the 2nd law directly to the notion of spatial order is a misinterpretation of both the classical and Boltzmann definitions of it.

6. Brandon Wilson

Discussions about the second law do seem prone to ending up in death spirals around fuzzy definitions and words. 🙁

For a cool and rigorous approach to understanding the second law, those interested might look around for discussions on Liouville’s Theorem. The second law is a corollary. This particular approach arises from (Hamiltonian) mechanics and I found it to aid in a very concrete understanding of the second law.

Maybe we are teaching the second law incorrectly?

7. John Fringe

There is a very funny and great scene in “A Serious Man” by the Coen brothers which I think presents the problem very well:

Larry Gopnik: So, uh, what can I do for you?
Clive Park: Uh, Dr. Gopnik, I believe the results of physics mid-term were unjust.
Larry Gopnik: Uh-huh, how so?
Larry Gopnik: Uh, yes. You failed the mid-term. That’s accurate.
Clive Park: Yes, but this is not just. I was unaware to be examined on the mathematics.
Larry Gopnik: Well, you can’t do physics without mathematics, really, can you?
Clive Park: If I receive failing grade I lose my scholarship, and feel shame. I understand the physics. I understand the dead cat.
Larry Gopnik: You understand the dead cat? But… you… you can’t really understand the physics without understanding the math. The math tells how it really works. That’s the real thing; the stories I give you in class are just illustrative; they’re like, fables, say, to help give you a picture. An imperfect model. I mean – even I don’t understand the dead cat. The math is how it really works.

(I was very surprised to see this in a movie. The Coen brothers are great 🙂 )

I think the Second Law is just taught poorly. A huge number of people think it’s simply “order falls to disorder over time.”

9. j a higginbotham

Some questions:

1) “Q is the amount of heat transferred in an interaction” – does it matter which direction the heat flow is in (if there is a direction)?

2) “system”, that is, can heat be transferred out of the system, or would that make whatever gets the heat part of the system?

3) why does all the air not move to one half of the room; is that entropy or some other rule?

4) Is the Boltzmann form of entropy really just a statistical concept? That is, things are more likely to be in the more common energy states, but have an extremely low possibility of being in some less statistically likely state?

thanks
jah

1. MarkCC Post author

(1) In any thermodynamic interaction, the net effect is that heat (energy) flows from warmer objects to colder ones.

(2) A system is just any collection of objects that you want to observe. In general, for thermodynamic analysis, you stick to closed systems – that is, systems where no energy flows in or out.

(3) It’s related to entropy. Roughly speaking, the entropy of a system is a combinatorial measure of the number of energetic states of all of the entities in the system. In a room with a lot of particles, a state where all of the particles are on one side of a room has a massively smaller amount of entropy than a state where the particles are scattered randomly. (One way of thinking about it is just to look at a smaller combinatorial system. Consider the number of permutations of a set of 10 numbers (3,628,800 possible permutations); compare that to the number of permutations if you restricted it so that the numbers 1 through 5 must come before the numbers 6 through 10. (14,400 permutations). Only instead of 10 values, in a room, we’re talking about a couple of dozen orders of magnitude more particles; and instead of just a 1-dimensional permutations, we’re looking at a three dimensional space, and each particle has both a position and a velocity.)

(4) That’s a very hard question; entire books have been written about the answer. So I’m not even going to try to answer it in a blog comment.
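The combinatorial comparison in (3) is easy to check directly; this little sketch just reproduces the factorial arithmetic from that answer (nothing here beyond the numbers already given):

```python
from math import factorial

# Total orderings of 10 distinct numbers.
total = factorial(10)  # 3,628,800

# Orderings where 1..5 must all come before 6..10:
# independently order the first five and the last five.
restricted = factorial(5) * factorial(5)  # 14,400

print(total, restricted)  # 3628800 14400
```

The restriction cuts the count by a factor of 252, and that gap explodes combinatorially as the number of items grows, which is the point of the analogy.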

10. j a higginbotham

Thanks for the explanations. I have never understood this (and still don’t) but every bit helps.

1) It seems to me there are two types of natural law. One is an equation such as F=ma. [But is this an equality or only a first-order approximation?] The second is statistical/probabilistic, where there is a wider range of acceptable possibilities but only the most probable are observed (the air molecules stay mostly evenly distributed; CO crystals retain residual entropy even at 0 K).

2) A common creationist claim is that rust is an example of entropy at work: http://creationoutreach.com/id3.html “But, the iron or steel (unless its stainless steel) will automatically rust back into iron ore without any human action at all. Entropy, that is the 2nd Law, causes everything to rust, rot, run down, get old, get disarranged, automatically. ”

I was wondering what the entropy change for this reaction is, which depends on the definition of rust, presumably a mixture of hydrated iron oxides. Recently I found this, which is not bad although they give no actual numbers despite using a simplified system: http://physicsandphysicists.blogspot.com/2010/11/rust-and-entropy.html

“As Styer pointed out, rust occurs when two types of elements interact, i.e. iron and oxygen, via the reaction 4Fe + 3O2 –> 2Fe2O3. Now, one can assume that Fe is in a solid form, while oxygen is in a gaseous form. In general, it is clear that a gas will have a higher entropy than a solid. So what is going on here is that we have an initial condition of a solid iron and a gaseous oxygen, and ending up with a solid rust (iron oxide). This final condition should actually have a lower entropy than solid iron plus oxygen gas. So the entropy of this system should have decreased, not increased as claimed in the cited references.”
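To put rough numbers on that: using typical textbook standard molar entropies at 298 K (the specific values below are approximate and are not taken from the linked posts), the entropy change of the reacting material itself does come out negative:

```python
# Approximate standard molar entropies at 298 K, in J/(mol·K).
# Typical textbook values; exact figures vary slightly by source.
S_Fe = 27.3     # solid iron
S_O2 = 205.2    # oxygen gas
S_Fe2O3 = 87.4  # solid hematite

# 4 Fe + 3 O2 -> 2 Fe2O3
dS_system = 2 * S_Fe2O3 - (4 * S_Fe + 3 * S_O2)
print(dS_system)  # roughly -550 J/K per 2 mol of Fe2O3 formed
```

Of course the reaction is strongly exothermic, so the entropy dumped into the surroundings as heat (roughly ΔH/T, with ΔH on the order of −1650 kJ for the reaction as written) more than makes up for it; the total entropy still increases, exactly as the second law requires.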

1. keithb

“Entropy, that is the 2nd Law, causes everything to rust, rot, run down, get old, get disarranged, automatically. ”

I think this is false on its face. Leave a chunk of gold or diamond around and it will not “decay” or combine with anything else.

1. j a higginbotham

Actually, diamonds are thermodynamically unstable at the earth’s surface due to the lower pressure. But the conversion to the more stable graphite has a high activation barrier.

11. Peter

Nitpick: When people talk about the second law applying only to closed systems, they mean that you have to add up all the entropy changes due to all the energy transfers. If you can draw a box around the system and say no energy is transferred through the box (even if the “box” has infinite extent), then it’s a closed system, and the second law applies.

That is, the open/closed-systems caveat isn’t a separate point you haven’t already dealt with; it’s exactly the same as your second point: the second law applies only to total entropy changes.

1. Peter

Oh, so for example, if you treat the Earth as a closed system, then life would be quite magical and violate the 2nd law of thermo, because you’d be neglecting the sun as the ultimate source of energy of most of the food chain.

12. Tim Gaede

Crank: “I’ve built an engine that is more than one hundred percent efficient!”

Creationist: “Don’t be ridiculous. An engine cannot be more than zero percent efficient.”

13. Argon

I don’t think Sewell understands that a stack of playing cards arranged in order of suit and rank has the same thermodynamic entropy as a stack with a random order of cards. He can test this by burning the two stacks and seeing if there is any difference in the heat released.

14. SWT

MarkCC, I think Sewell’s article has a lot more bad math than you give it credit for … “Entropy = – Order” only scratches the surface. The PT discussion you mention has a number of posts by Mike Elzinga and others (including me) that identify mathematical errors based on unstated (and apparently unrecognized) assumptions, inappropriate generalization, and what appears to be a complete lack of understanding of the (dare I say it?) genesis of the few equations he does use … all this in addition to some quite creative errors about the nature of entropy and the nature of second law constraints.

1. MarkCC Post author

Oh, yes, there’s no doubt that it definitely does. I wasn’t trying to be exhaustive, but rather to focus on the one specific issue that I thought had been neglected.

15. SWT

To expand on a couple of points:

The distinction between open and closed systems has to do with whether or not matter can cross the system boundary. A system in which neither energy nor matter can cross the boundary is isolated.

Thus, if you treat the Earth as a closed system (which is not a bad approximation in our era), you still get to consider the enormous energy flow through the biosphere, so life isn’t at all magical. If we were somehow cut off from that energy flow, the Earth would be an isolated system and things would get pretty bad pretty quickly.

The second law applies to open systems, closed systems, and isolated systems — the proper way to apply the second law depends on the boundary conditions. Creationists never seem to demonstrate any understanding of this when discussing evolution or abiogenesis.

1. Peter

Eww, semantic argument:

http://en.wikipedia.org/wiki/Thermodynamic_system#Isolated_system
I can’t find my old thermo text, but wikipedia says that the second law is only true for “isolated” systems. Besides, you can’t generally pick some things that are “matter” and some that are “energy” and treat them to different rules, since matter and energy are often thermodynamically coupled, like when doing solid state physics (coupling between the electrons and the phonon gas) or explaining the CMB (which was coupled to proton/electron gas until the universe cooled enough that the ions condensed to hydrogen atoms). And sometimes, you get coupling through E = mc^2. And so on. So for a lot of people, “closed” is just colloquial for “isolated.”

Anyway, I haven’t done thermo in a long time, but I think the overall point is just that the second law leaves a whole lot of wiggle room, and if you really wanted to use it to disprove evolution, you’d have to be very thorough with your accounting.

1. SWT

I can’t find my old thermo text, but wikipedia says that the second law is only true for “isolated” systems.

I think Wikipedia is wrong on this point. There is a well-developed theory for applying the second law to both closed, non-isolated systems (using the Helmholtz free energy) and open systems (using the Gibbs free energy) — Gibbs published the framework for this in 1876. Creationists seem, by and large, to be unaware of these principles.

There is also a well-developed theory for applying the second law to non-equilibrium systems — Prigogine won a Nobel prize for his work in this area. Creationists seem pretty much uniformly unaware of this body of work as well. When one of them does stumble across it, they seem unable to understand and apply it correctly.

1. Peter

I don’t think wikipedia is /wrong/ on this point:

First, a generic, open-system form of the second law won’t be an inequality in the total entropy change. For example, you won’t observe an entropy increase in a refrigerator if you don’t include the entropy increase of the exhaust.

Second, there are special* cases where you can neglect the entropy change of the energy carriers, so the concept of a closed (but not isolated) system is useful. But then, there are cases where there’s no way around it: I think in laser cooling the only entropy increase comes from the coherent laser light scattering into random light, while the matter sample decreases in entropy. And as for non-equilibrium systems, isn’t that based on assuming the system can be treated as quasi-static at an infinitesimal scale, where each element is approximately isolated?

The distinction between “matter” and “energy” isn’t general, and energy carriers have entropy, and exchange it with matter. So you can’t have a general, quasi-static 2nd law based on the concept of a “closed [to matter but not energy]” system.

Although, I still haven’t done thermo in a long time so I’m probably confused on some of this.

*granted that these special cases are extremely common, and

1. SWT

I don’t think wikipedia is /wrong/ on this point:

If so, engineering thermodynamics classes and textbooks have been in incredibly serious error for quite a while.

First, a generic, open-system form of the second law won’t be an inequality in the total entropy change.

The generic, open-system form of the second law says that the total entropy change of the system plus its surroundings must be non-negative. This is exemplified by your next statement:

For example, you won’t observe an entropy increase in a refrigerator if you don’t include the entropy increase of the exhaust.

As I noted previously, one typically uses the Helmholtz or Gibbs free energy to deal with closed or open systems, respectively.
___

Non-equilibrium thermodynamics is based on the observation that the local rate of change of entropy is the net effect of local production of entropy and the entropic effects of mass and energy transfer. My go-to book on this is a text by deGroot and Mazur; it’s what I used to learn this, though there are probably better, more recent references. deGroot and Mazur do go into the statistical foundations of the non-equilibrium thermodynamic formulation.

16. Chris

I came across this fascinating post from Tim Harford on Twitter, and I don’t regret it one bit. I’m a business student and I love science. (Which means I’m not religious, as any rational person should be: if the Big Bang, or an equivalent scientific phenomenon, created the universe, then God did not; and if evolution produced people, that again undermines the classical, heavily rationalized and fabricated account of God’s existence.) Anyone who claims that their holy scripture (which isn’t even a credible source to base any beliefs on, let alone beliefs about the universe and the origin of mankind) advocates intelligent design (God-guided evolution), when it says nothing at all about evolution, is rationalizing and inventing false reasoning to support a faulty thinking process. This problem is very deep, and I will attempt to explain to whoever reads this post that it’s not even Sewell’s fault that he’s unable to sort facts from fallacies.

I don’t mean to change the subject and get into a religious debate, but I feel that an underlying factor in how the people above argue, and in whether they accept the claim that the 2nd law of thermodynamics conflicts with evolution, is their deep beliefs about God, often acting through bias that is most powerful when it is invisible, like sociological ideologies.

So if I were the moderator I would strongly suggest a discussion about God and why anyone would believe in it. But more importantly, since actions are only as real as their consequences (the sociological Thomas Theorem) the accuracy of one’s (religious) beliefs don’t necessarily matter, their outcomes do. These include how religion has impacted the world, and if you were to watch the debate I will mention, it would show a great display of expertise on the subject.

ANYWAYS, points that should be mentioned regarding the driving forces behind the above scientific discussion (shown in the Intelligence Squared U.S. debate “The World Would Be Better Off Without Religion”): 95% of religious people had religious parents, and 90% of religious beliefs follow a particular religion depending on what part of the world you’re born in. These two points, combined with sociology’s notion of habitus, suggest that culture is never “real”: it’s a set of acquired tastes taught through socialization (constant interaction with society since birth). Religion is one of these tastes; if it’s imprinted on the mind of an innocent, impressionable, naive child, then he or she will grow up around it and accept it as fact, even in the face of sheer contradictory evidence. This is why religion exists today.

The theory of evolution is one of the most supported scientific theories, and if anyone dares to prove it wrong, they’d have immense difficulties, but of course rationalizing and coming up with fake arguments/experiments and fake results bypasses this rule because you’re nit-picking on one technicality of one incorrectly understood concept (the 2nd law of thermodynamics). Sewell is no doubt one of these poor, hopeless individuals who will cling to old irrational ways in the face of sure defeat because statistically (probably) he was born into it.

We should all be conscious and analytical about stupidity (in this case subconscious factors that drive irrational arguments) and I think it’s extremely important that people’s irrational religious beliefs are looked at BEFORE looking at anything they say. What they say is most-likely molded to suit their cause, even if it means rationalizing holy scripture or being ignorant of the most supported concept in science.

Lastly, I would like to point out that ultimately Sewell believes in intelligent design, so while it may be somewhat useful to point out the flaws in his arguments about the 2nd law of thermodynamics, what we should really be pointing out is that he himself believes in evolution. He believes God is able to break the laws of thermodynamics (because he believes all evolution, even evolution guided by God, breaks the laws of thermodynamics), WHICH IS INCREDIBLY STUPID, and it’s a perfect example of how ridiculous and false religious people’s arguments can be. But of course this only happens on issues sensitive to their faith; I’m not saying everything religious people say is false, only what they say about issues regarding their religion.

Here’s something, again, that’s more applicable than just talking about symptoms of the underlying cause, “From the mid-1990s, intelligent design proponents were supported by the Discovery Institute, which, together with its Center for Science and Culture, planned and funded the “intelligent design movement”. They advocated inclusion of intelligent design in public school curricula, leading to the 2005 Kitzmiller v. Dover Area School District trial, where U.S. District Judge John E. Jones III ruled that intelligent design is not science, that it “cannot uncouple itself from its creationist, and thus religious, antecedents”, and that the school district’s promotion of it therefore violated the Establishment Clause of the First Amendment to the U.S. Constitution.”

The law basically supports fact and science, so perhaps with time people will choose to respect the law more (which is ultimately more important than one’s subjective beliefs) and become less delusional and religious. Until then, I suggest avoiding these illogical fools as much as possible, because there is seriously no chance of teaching Sewell (for example) that he’s wrong.

In conclusion, don’t worry about or even acknowledge religious people’s absurd arguments involving anything to do with God (which is usually anything about intelligent design/the story of Jesus Christ/Noah’s Ark/Virgin Mary’s pregnancy-which are all impossible), it’s literally complete delusional bullshit (like religion) and does not deserve your attention. Religious people are usually hopelessly closed-minded so they do not merit your concern, but of course it can be advantageous to sort out the fact from the fiction for average joes, and I guess that’s what you did here… Anyways I still hope this post was at least somewhat useful/interesting for you guys.

-Chris

1. Michael

Chris: Science cannot, by definition, prove or disprove matters of faith. You seem to be falling into the same trap as Sewell, although from an opposing side of the trap. I have always been a highly rational man, relying on scientific understanding throughout my life. Separate from this, I also happen to be deeply religious and see God’s work in our daily lives everywhere.

Mark: Excellent article. I’ve never been a fan of the branch of study called apologetics, which the article you reference dabbles in. These arguments do nothing to touch hearts or expand knowledge, while purporting to do both. Given your faith *and* science background, I absolutely love reading your posts on subjects like this.

17. Mike Elzinga

This is from Rudolf Clausius in Annalen der Physik und Chemie, Vol. 125, p. 353, 1865, under the title “Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie.” (“On Several Convenient Forms of the Fundamental Equations of the Mechanical Theory of Heat.”)

It is also available in A Source Book in Physics, Edited by William Francis Magie, Harvard University Press, 1963, page 234.

(Note: Q represents the quantity of heat, T the absolute temperature, and S will be what Clausius names as entropy)

“We obtain the equation

∫dQ/T = S − S0

which, while somewhat differently arranged, is the same as that which was formerly used to determine S.

If we wish to designate S by a proper name we can say of it that it is the transformation content of the body, in the same way that we say of the quantity U that it is the heat and work content of the body.

However, since I think it is better to take the names of such quantities as these, which are important for science, from the ancient languages, so that they can be introduced without change into all the modern languages, I propose to name the magnitude S the entropy of the body, from the Greek word η τροπη, a transformation.

I have intentionally formed the word entropy so as to be as similar as possible to the word energy, since both these quantities, which are to be known by these names, are so nearly related to each other in their physical significance that a certain similarity in their names seemed to me advantageous.”

Clausius apparently translates η τροπη from the Greek as Umgestaltung and not Umdrehung. However, this doesn’t matter because he modified the word to entropy for the reasons he indicated.

There is nothing about order or disorder in Clausius’s coining of the word to apply to a mathematical expression. This sort of naming of frequently occurring mathematical expressions is common in mathematics and physics.

Clausius is inventing a word that doesn’t carry any “conceptual baggage” that may lead to its misuse. Unfortunately he didn’t anticipate ID/creationists.

18. Mike Elzinga

Here are the basic methods for computing entropy.

Classical thermodynamics:

ΔS = ΔQ/T

If an amount of heat ΔQ leaves a system at temperature T_higher and enters the environment at a temperature T_lower, then the change in entropy is

ΔS = −ΔQ/T_higher + ΔQ/T_lower.

Dividing the same amount of heat by a smaller temperature gives a larger number. Therefore the entropy lost by the system is smaller than the entropy gained by the environment. In other words, the overall entropy has increased.
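With made-up numbers the bookkeeping is explicit (the specific values below are arbitrary; only the signs matter):

```python
Q = 100.0         # heat transferred, in joules (arbitrary)
T_higher = 400.0  # temperature of the hot system, in kelvin (arbitrary)
T_lower = 300.0   # temperature of the cold environment, in kelvin (arbitrary)

dS_system = -Q / T_higher     # entropy lost by the hot system: -0.25 J/K
dS_environment = Q / T_lower  # entropy gained by the cold environment: +0.333 J/K
dS_total = dS_system + dS_environment

print(dS_total)  # positive: about +0.083 J/K
```

Nothing about order or disorder anywhere in the calculation; it is just heat divided by temperature.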

Where are the order/disorder and “information” in that? Creationists need to explain how Clausius’s coining of the word entropy implies that everything tends to disorder and decay.

Heat flows from higher temperatures to lower temperatures.

Creationists need to explain this fact. Why does that happen? What is temperature?

Entropy in statistical mechanics:

S = k ln Ω

where Ω is the number of accessible energy microstates consistent with the macroscopic state of the system, and k is Boltzmann’s constant.

It comes down basically to a process of enumeration; counting those microscopic mechanisms that actually participate in “soaking up” the total energy flowing into or out of the macroscopic system. For relatively homogeneous systems, this is not difficult; but for heterogeneous systems it can become very difficult. One ultimately wants this calculation of the entropy to agree with the classical calculation. So one doesn’t count the nuclei of atoms if only the vibrations or translational motions of the atoms are carrying the energy, or if the electrons are simply changing levels on the order of a few electron volts but the nuclei are not participating in these energy exchanges.

More generally,

S = −k Σ_j p_j ln p_j

where p_j is the probability that the system is in the j-th microstate.

If all microstates are equally probable, p_j = 1/Ω, and this formula reduces to the one above.

If all the constituents of the system are allowed to interact and exchange energy with each other (matter interacts with matter), then in an isolated system those probabilities become equal and the expression is maximized.
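That last claim, that equal probabilities maximize S, is easy to check numerically; the sketch below sets k = 1 and compares two arbitrary four-state distributions:

```python
from math import log

def entropy(probs, k=1.0):
    """Gibbs entropy, -k * sum(p * ln p), skipping zero-probability terms."""
    return -k * sum(p * log(p) for p in probs if p > 0)

uniform = [0.25] * 4           # all four microstates equally likely
skewed = [0.7, 0.1, 0.1, 0.1]  # same four states, unequal weights

print(entropy(uniform))  # ln(4) ≈ 1.386, the maximum for four states
print(entropy(skewed))   # ≈ 0.940, strictly smaller
```

Any deviation from the uniform distribution lowers the sum, which is why an isolated system that equalizes its microstate probabilities ends up at maximum entropy.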

19. Joe

“It doesn’t say that you can’t create order. It says that the cumulative end result of any interaction must increase entropy. Want to build a house? Of course you can do it without violating the second law. But to build that house, you need to cut down trees, dig holes, lay foundations, cut wood, pour concrete, put things together. All of those things use a lot of energy. And in each minute interaction, you’re expending energy in ways that increase entropy. If the creationist interpretation of the second law were true, you couldn’t build a house, because building a house involves creating something structured – creating order.”

So, let’s take your example, remove the human/intelligence from the equation, and let’s see what happens.

1. MarkCC Post author

And that has exactly *what* to do with the second law of thermodynamics?

All you’re doing is pulling out a red herring.

This post isn’t intended to be a response to every possible creationist argument. It’s just pointing out that the creationists’ claim about the second law of thermodynamics is not true. The second law doesn’t say what creationists think it says – it never says anything like “you can’t create order”, or “disorder must increase”. And even if it did, the creationist argument would still be wrong, because it’s perfectly possible to reduce entropy locally – as long as the decrease is offset by a larger increase in a larger environment.

2. Mike Elzinga

Ok, Joe, let’s do just that. It is fairly easy to find systems in nature – as well as systems that humans make – that illustrate the basic concepts of entropy.

One such system is a two-state system. It can be magnetic domains immersed in a magnetic field; it can be fluorescent atoms embedded in a matrix of other atoms making up a rock. Man-made examples can also be illustrated, but that is not necessary since such systems can be found in nature.

So here is an easy one for you: a two-state system composed of 16 identical atoms (embedded in a matrix of other atoms), where each of these 16 atoms has a non-degenerate ground state and a single excited state accessible to the radiation impinging on them.

1. What is the entropy when all atoms are in the ground state?

2. Add just enough energy to put 4 atoms in the excited state. What is the entropy now?

3. Add more energy so that 8 atoms are in the excited state. What is the entropy now?

4. Add still more energy so that 12 atoms are in the excited state. What is the entropy now?

5. Add more energy so that all atoms are in the excited state. What is the entropy now?

6. Rank-order the temperatures in each of the above cases.

If you are able to do this little exercise, you should encounter some results that run exactly counter to your ideas about entropy and the second law of thermodynamics. In fact, you should discover that everything you have been told by ID/creationists about entropy and the second law of thermodynamics is wrong.

Let us know what you find out.
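For readers who want to check their answers: the counting is just binomial coefficients, since S = k ln Ω and Ω is the number of ways to choose which n of the 16 atoms are excited. A quick sketch (with k set to 1, so entropies are in units of k):

```python
from math import comb, log

N = 16
counts = {n: comb(N, n) for n in (0, 4, 8, 12, 16)}     # Ω for each case
entropies = {n: log(w) for n, w in counts.items()}      # S/k = ln Ω

for n in (0, 4, 8, 12, 16):
    print(f"n={n:2d}  Ω={counts[n]:5d}  S/k={entropies[n]:.3f}")
```

The entropy is zero at n = 0 and n = 16, and peaks at n = 8 (Ω = 12870). Past n = 8, adding more energy *decreases* the entropy, the hallmark of negative absolute temperature in a two-state system, and exactly counter to the naive “more energy means more disorder” picture.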

20. Pingback: 2LoT trouble | The Skeptical Zone

21. Arthur Dent

“one way of viewing evolution is that it’s a process of creating ever more effective entropy-generators.”

Damn, I always suspected we’re part of the problem.
When, 100 trillion years from now, the universe will no longer be able to form new stars at all, we can proudly say ‘we did our part’.

22. Craig

This post is wrong. There is more than one definition of entropy. The definition that the poster is using is the limited definition that one learns in a first year course in college physics. But entropy is more than just heat and work and temperature. This only scratches the surface.

Look up the definition of entropy with respect to information theory. There you will see that there is a precise definition of entropy that clearly quantifies the notion of disorder. And this definition can be shown to be a generalisation of the definition that the poster gave. Hence, the poster is wrong to claim that the argument that evolution contradicts the 2nd law of thermodynamics is invalid.

1. markcc Post author

Actually, no.

I’ve written a bit about information theory here, before. And the thing is, the second-law argument fails on multiple levels.

Informational entropy doesn’t mean the same thing as physical entropy. They’re related concepts, but they’re not exactly the same thing. What we talk about with the second law of thermodynamics is physical entropy.

But if we extend the notion of informational entropy in a form where we encode the state of the universe into a string of information, and measure the entropy of that string, we wind up right back at exactly the same argument.

That is, evolution only looks like it violates the 2nd law if you treat an open system as if it were isolated. If you look solely at the fragment of information that represents a particular DNA sequence, under some circumstances, the entropy of that string will be reduced. But it will only be reduced because it’s part of a larger system, in which the reduction of entropy in the gene sequence is produced as part of a larger process that increases overall entropy by more than the gene sequence decreases it.
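As a toy illustration of what “the entropy of a string” means in the Shannon sense (this is informational entropy, not thermodynamic entropy, as noted above; the example strings are arbitrary):

```python
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Shannon entropy, in bits per symbol, of a string's character distribution."""
    counts = Counter(s)
    n = len(s)
    # Equivalent to -sum(p * log2(p)); written with log2(n/c) to keep zero positive.
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 — a maximally "ordered" string
print(shannon_entropy("abababab"))  # 1.0 bit/symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits/symbol, maximal for 8 distinct characters
```

The point of the argument above is that even under this informational reading, the low-entropy string only arises inside a larger process whose total entropy goes up.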

Let’s try a quick metaphor. I’m sitting here typing this comment on this post. In the process of doing that, I’m creating a highly ordered string of information.

I’m not violating the second law of thermodynamics by doing that: sure, I’m decreasing entropy in a local domain, but in the process of doing so, my body is consuming energy and producing a large quantity of entropy. On balance, the production of a lower entropy state in the memory of this computer is more than outweighed by the production of a higher state of entropy in the larger system that includes both the computer, and my body, and the sources of energy that ultimately power my body.

Similarly, when this comment gets transmitted to you, it’s creating a lower state of entropy in your computer than the random contents of memory that would be there otherwise. But getting this comment onto your screen creates a whole lot of entropy – among other things, it’s producing a load of heat in your computer’s CPU.

As usual, the stupid second-law argument falls on its face. I don’t think that anyone who knows any science and math actually believes this rubbish. I think it’s just a game that they play to trick the rubes.