Uncommon Descent Serving The Intelligent Design Community

2nd Law of Thermodynamics — an argument Creationists and ID Proponents should NOT use


ID proponents and creationists should not use the 2nd Law of Thermodynamics to support ID. Appropriate for Independence Day in the USA is my declaration of independence from, and disavowal of, 2nd Law arguments in support of ID and creation theory. Any student of statistical mechanics and thermodynamics will likely find Granville Sewell’s argument and similar arguments inconsistent with the textbook understanding of these subjects, and wrong on many levels. With regrets for my dissent from colleagues (like Granville Sewell) and friends in the ID and creationist communities, I offer this essay. I do so because saying nothing would be a disservice to the ID and creationist community of which I am a part.

[Granville Sewell responds to Sal Cordova here.]

I’ve said it before, and I’ll say it again: I don’t think Granville Sewell’s 2nd law arguments are correct. An author of the founding book of ID, Mystery of Life’s Origin, agrees with me:

“Strictly speaking, the earth is an open system, and thus the Second Law of Thermodynamics cannot be used to preclude a naturalistic origin of life.”

Walter Bradley, Thermodynamics and the Origin of Life

To begin, it must be noted that there are several versions of the 2nd Law. The versions are a consequence of the evolution and usage of thermodynamic theory, from classical thermodynamics to modern statistical mechanics. Here are textbook definitions of the 2nd Law of Thermodynamics, starting with the more straightforward version, the “Clausius Postulate”:

No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

and the more modern but equivalent “Kelvin-Planck Postulate”:

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work

How then can such statements be distorted into defending Intelligent Design? I argue ID does not follow from these postulates and ID proponents and creationists do not serve their cause well by making appeals to the 2nd law.

I will give illustrations first from classical thermodynamics and then from the more modern versions of statistical thermodynamics.

The notion of “entropy” was inspired by the 2nd law. In classical thermodynamics, the notion of order wasn’t even mentioned as part of the definition of entropy. I also note that some physicists dislike using the term “order” to describe entropy:

Let us dispense with at least one popular myth: “Entropy is disorder” is a common enough assertion, but commonality does not make it right. Entropy is not “disorder”, although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see Insight into entropy by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased “order”, quite impossible in the entropy is disorder worldview. And also keep in mind that “order” is a subjective term, and as such it is subject to the whims of interpretation. This too mitigates against the idea that entropy and “disorder” are always the same, a fact well illustrated by Canadian physicist Doug Craigen, in his online essay “Entropy, God and Evolution”.

What is Entropy? by Tim Thompson

From classical thermodynamics, consider the heating and cooling of a brick. If you heat the brick it gains entropy, and if you let it cool it loses entropy. Thus entropy can spontaneously be reduced in local objects even if entropy in the universe is increasing.

Consider the hot brick with a heat capacity of C. The change in entropy ΔS is defined in terms of the initial hot temperature TH and the final cold temperature TM:

ΔS = C ln(TM / TH)

Supposing the hot temperature TH is higher than the final cold temperature TM, then ΔS will be NEGATIVE, thus a spontaneous reduction of entropy in the hot brick results!

The following weblink shows the rather simple calculation of how a cold brick, when put in contact with a hot brick, spontaneously reduces the entropy of the hot brick even though the joint entropy of the two bricks increases. See: Massachusetts Institute of Technology: Calculation of Entropy Change in Some Basic Processes

So it is true that even if universal entropy is increasing on average, local reductions of entropy spontaneously happen all the time.
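The brick example can be made concrete with a minimal numerical sketch (Python). The heat capacity and temperatures below are illustrative values I have chosen, not figures from the MIT page: two identical bricks at 600 K and 300 K equilibrate at 450 K, and the hot brick's entropy falls even as the joint entropy rises.

```python
import math

def brick_entropy_change(C, T_initial, T_final):
    """Entropy change of a solid with constant heat capacity C (J/K),
    taken from T_initial to T_final (kelvins): dS = C * ln(T_final / T_initial)."""
    return C * math.log(T_final / T_initial)

C = 1000.0  # J/K, an illustrative heat capacity
dS_hot = brick_entropy_change(C, 600.0, 450.0)   # hot brick cools: negative
dS_cold = brick_entropy_change(C, 300.0, 450.0)  # cold brick warms: positive

print(dS_hot)            # about -287.7 J/K: a spontaneous local entropy decrease
print(dS_hot + dS_cold)  # about +117.8 J/K: the joint entropy still increases
```

The same sign pattern holds for any pair of temperatures: the body losing heat always loses less entropy than the body gaining heat acquires, so the total never decreases.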

Now one may argue that I have used only notions of thermal entropy, not the larger notion of entropy as defined by later advances in statistical mechanics and information theory. But even granting that, I’ve provided a counterexample to claims that entropy cannot spontaneously be reduced. Any 1st-semester student of thermodynamics will make the calculation I just made, and thus it ought to be obvious to him that nature is rich with examples of entropy spontaneously being reduced!

But to humor those who want a more statistical flavor of entropy than the classical notion, I will provide examples. First, a little history. The discipline of classical thermodynamics was driven in part by the desire to understand the conversion of heat into mechanical work. Steam engines were quite the topic of interest….

Later, there was a desire to describe thermodynamics in terms of classical (Newtonian-Lagrangian-Hamiltonian) mechanics, whereby heat and entropy are merely statistical properties of large numbers of moving particles. Thus the goal was to demonstrate that thermodynamics was merely an extension of Newtonian mechanics applied to large sets of particles. This sort of worked when Josiah Gibbs published his landmark treatise Elementary Principles of Statistical Mechanics in 1902, but it then had to be amended in light of quantum mechanics.

The development of statistical mechanics led to the extension of entropy to include statistical properties of particles. This has possibly led to confusion over what entropy really means. Boltzmann tied the classical notions of entropy (in terms of heat and temperature) to the statistical properties of particles. This was formally stated by Planck for the first time, but the equation is named “Boltzmann’s entropy formula”:

S = k ln W

where S is the entropy, k is Boltzmann’s constant, and W (sometimes written Ω) is the number of microstates (a microstate is roughly the position and momentum of a particle in classical mechanics; its meaning is more nuanced in quantum mechanics). So one can see that the notion of “entropy” has evolved in physics literature over time….
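A quick numerical sketch of Boltzmann's formula (Python; the microstate counts are made-up illustrative numbers, while the constant is the exact SI value): because the formula is logarithmic, doubling the number of accessible microstates adds only k ln 2 of entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def boltzmann_entropy(W):
    """S = k ln W for a macrostate realizable by W microstates."""
    return k_B * math.log(W)

# Doubling the microstate count raises S by exactly k_B * ln 2,
# illustrating why entropy is additive while microstate counts multiply.
S1 = boltzmann_entropy(1e20)
S2 = boltzmann_entropy(2e20)
print(S2 - S1)  # ~9.57e-24 J/K, i.e. k_B * ln 2
```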

To give a flavor for why this extension of entropy is important, I’ll give an illustration of colored marbles that illustrates increase in the statistical notion of entropy even when no heat is involved (as in classical thermodynamics). Consider a box with a partition in the middle. On the left side are all blue marbles, on the right side are all red marbles. Now, in a sense one can clearly see the arrangement is highly ordered since marbles of the same color are segregated. Now suppose we remove the partition and shake the box up such that the red and blue marbles mix. The process has caused the “entropy” of the system to increase, and only with some difficulty can the original ordering be restored. Notice, we can do this little exercise with no reference to temperature and heat such as done in classical thermodynamics. It was for situations like this that the notion of entropy had to be extended to go beyond notions of heat and temperature. And in such cases, the term “thermodynamics” seems a little forced even though entropy is involved. No such problem exists if we simply generalize this to the larger notion of statistical mechanics which encompasses parts of classical thermodynamics.
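The marble illustration above can be turned into a simple counting exercise (Python; the marble counts are illustrative choices of mine). With 10 blue and 10 red marbles in 20 positions, the segregated state corresponds to a single arrangement, while the "mixed" macrostate is realizable in C(20, 10) ways, so its statistical entropy (proportional to ln W) is far higher — which is why shaking the box moves the system toward mixing and not back.

```python
from math import comb, log

# 10 blue + 10 red marbles occupying 20 positions in a box.
# Segregated macrostate: all blues confined to the left half -> 1 arrangement.
W_separated = 1
# Mixed macrostate: the 10 blues may occupy any 10 of the 20 positions.
W_mixed = comb(20, 10)

print(W_mixed)  # 184756 arrangements
# Statistical entropy goes as ln W, so mixing raises it substantially.
print(log(W_mixed) - log(W_separated))  # ~12.13 "natural units"
```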

The marble illustration is analogous to the mixing of different kinds of distinguishable gases (like carbon dioxide and nitrogen). The notion is similar to the marble illustration: it doesn’t involve heat, but it involves an increase in entropy. Though it is not necessary to go into the exact meaning of the equation, for the sake of completeness I post it here. Notice there is no heat term “Q” for this sort of entropy increase:

ΔSmix = −nR Σ xi ln xi

where R is the gas constant, n the total number of moles, xi the mole fraction of component i, and ΔSmix is the change in entropy due to mixing.
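The entropy of mixing just described can be sketched numerically (Python; mixing one mole of each gas is an assumed illustrative case, not a figure from the text). For an equimolar two-gas mixture the sum collapses to 2R ln 2:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Ideal-gas entropy of mixing: dS_mix = -n R sum(x_i ln x_i),
    given the number of moles of each species."""
    n = sum(moles)
    return -n * R * sum((m / n) * math.log(m / n) for m in moles)

# One mole each of CO2 and N2: x_i = 0.5, so dS_mix = 2 R ln 2.
dS = entropy_of_mixing([1.0, 1.0])
print(dS)  # ~11.53 J/K, with no heat term anywhere in the calculation
```

Note that every xi is less than 1, so each ln xi term is negative and ΔSmix is always positive: mixing distinguishable gases can only increase this entropy.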

But here is an important question: can mixed gases, unlike mixed marbles, spontaneously separate into localized compartments? That is, if mixed red and blue marbles won’t spontaneously order themselves back into compartments of all blue and all red (and thus reduce entropy), why should we expect gases to do so? This would seem impossible for marbles (short of a computer or intelligent agent doing the sorting), but it is a piece of cake for nature even though there are zillions of gas particles mixed together. The solution is simple. If the mixture is brought to a temperature below −78.5 Celsius (the sublimation point of carbon dioxide at atmospheric pressure) but above −195.8 Celsius (the boiling point of nitrogen), the carbon dioxide will freeze out as dry ice while the nitrogen remains a gas, and thus the two species will spontaneously separate: order spontaneously re-emerges and the entropy of the local system spontaneously decreases!

Conclusion: ID proponents and creationists should not use the 2nd Law to defend their claims. If an ID-friendly dabbler in basic thermodynamics can raise the objections I’ve raised, how much more can professionals in physics, chemistry, and information theory? If ID proponents and creationists want to argue that the 2nd Law supports their claims but have no background in these topics, I would strongly recommend further study of statistical mechanics and thermodynamics before they take sides on the issue. I think more scientific education will cast doubt on evolutionism, but I don’t think more education will make one think that 2nd Law arguments are good arguments in favor of ID and the creation hypothesis.

UPDATE:
Dr. Sewell has been kind enough to respond. He pointed out my oversight of not linking to his papers. My public apologies to him. His response contains links to his papers.
Response to scordova

UPDATE:
At the request of Patrick at The Skeptical Zone, I am providing a link to his post, which provides links to other discussions of Dr. Sewell’s work. I have not read these other critiques, and, to be sure, they may not attempt much civility. But because The Skeptical Zone has been kind enough to publish my writings, I have some obligation to reciprocate. Again, I have not read those critiques. What I have written was purely in response to the Evolution News and Views postings regarding the topic of thermodynamics.

Patrick’s Links to other critiques

Comments
After a year, Groovamos is still pathologically incapable of admitting his mistakes. He still posts drivel at UD: https://uncommondescent.com/philosophy/when-designed-errors-are-the-perfect-design/#comment-461286 Hey Groovamos, have you figured out that you messed up?
Hey Groovamos, have you been able to figure out where that “dT” is in the first equation? Acknowledge, please.
Hahaha!
scordova
July 4, 2013, 10:12 AM PDT
Thanks, Sal, for the thoughts. I still need to digest your essay and all the detailed comments.

First, I've never quite understood Sewell's position in the past. That is almost certainly a lack of effort and time on my part, and I'm willing to say at this point that he might have some cogent thoughts on the topic. I need to dig deeper when I get some time.

Second, however, something in your essay caught my eye: "Strictly speaking, the earth is an open system, and thus the Second Law of Thermodynamics cannot be used to preclude a naturalistic origin of life." With apologies to Bradley, this is nonsense. The whole issue of open/closed systems is a complete and utter red herring to the question of naturalistic origins. Whether a system is "closed" or "open" is purely one of convenient semantics, without any substance whatsoever.

Specifically, we could say that the Earth is an "open" system, because it gets energy from the Sun. So what? I can also assert that the Earth/Sun system is a "closed" system. You want to include comets, asteroids, other planets, cosmic rays from deep space, etc.? Fine, I'll just define my system more broadly and declare that it is a "closed" system, with all the available material resources of the universe of course being the ultimate closed system. The entire discussion is pointless, and anyone asserting that "evolution can happen because Earth is an open system" is spouting nonsense. I realize Bradley did not make that specific statement, but I've heard it many times and it rests on the same misunderstanding.

There are two possibilities: 1. If Bradley is making a very technical point that the second law of thermodynamics is simply irrelevant to the formation of information-rich biological structures from abiotic precursors, then fine, we can have a discussion about what the second law really is and what it really says and whether it applies to information and the like. (I suspect that this might be an area where Sewell's arguments are weakest.) 2. In contrast, if Bradley is asserting (as he seems to be) that even if life can't arise in a "closed" system it could arise in an "open" system, then he is playing non-substantive semantic games and perpetuating nonsense.
Eric Anderson
August 9, 2012, 12:26 PM PDT
Hey Groovamos, have you been able to figure out where that “dT” is in the first equation? Acknowledge, please. :-)
scordova
August 9, 2012, 07:25 AM PDT
Hey Groovamos, have you been able to figure out where that "dT" is in the first equation? You know, the one you claim is missing but is still right there under your nose. This ain't like the hunt for the God Particle, you know. You just need the ability to see. If you don't have that, then you shouldn't be driving. Acknowledge, please. :-)
scordova
July 9, 2012, 07:57 AM PDT
F/N 2: Oh yes, I forgot. The ice trays also allowed a model of diffusion to be explored, which is central to understanding the reason entropy and information are so closely linked.

Start with a painted line down the middle, and have some beads on one side, A. B has none to begin with. Shake up -- just a tad -- and inspect, repeatedly. You can trace the strong tendency of beads to even out on both sides, with the same basic distribution. In short, we see a strong trend of migration from clusters of microstates that have low statistical weights in the config space, to those that have high weights. And not the reverse. We see time's arrow, and we see why entropy is associated with disorder. Of course, it soon becomes evident, too, that the reason is that we have a config space, with vastly more dispersed than clumped together possibilities.

Then, imagine that we have to get a given pattern of distributed specific parts in proper alignment for a function to emerge. Say, a triangle of cells, with the right colour (or its neighbouring colour: position 1 must be red or orange or pink, two must be yellow or green, three must be blue, light blue or purple, etc., where once the triangle pattern is there in any config of trays, upside down, right hand or left hand is acceptable) in the proper place, etc. This is strictly possible but so isolated in config space relative to scattered unaligned states that we have no right to expect such to emerge. The concept, islands of function in a space of mostly non-functional possibilities, naturally emerges.

Then, we can step up to the parts moving about by Brownian motion in a vat thought exercise. We soon see why it is maximally improbable and utterly implausible for first life to emerge as a metabolic entity with a code based algorithmic self-replication facility by thermodynamic forces and chemistry, in a warm little pond etc. The FSCO/I involved is best explained as a result of design.

Design is in the door and sitting at the table from the very root of the tree of life. So, why not call on it as a powerful means of explanation at later levels, where we see signs for which the only empirically observed and analytically warranted explanation is design? BTW, if one appeals to sidelined chunks of junk dna varying at random until voila, function emerges and is picked up, this means unconstrained variation. A random walk search for islands of function. And, once we have parts that must be right and in the right relative alignment, we have islands of function to face. KF
kairosfocus
July 7, 2012, 03:27 AM PDT
F/N: I should explain: beads = lumps of energy, and cells = atoms. Ground state being 0. Put beads, shake up, count trays with 0, 1, 2, 3 etc. The at-random distributions emerge quite naturally. With few beads, 0 dominates and a reverse J occurs. As the number of beads goes up, we get to a point where the modal number of beads is more than 0. I found this little exercise was more helpful than a raft of abstruse, blindly memorised mathematics.

It then helped students see what was going on behind the usual hieroglyphics that students struggle with across two alphabets and odd forms. For instance, the integral sign is an elongated S for sum. The delta used for a small increment is a Greek letter; so is the Sigma used in summation. And students often struggle to follow all those beloved subscripted indices etc. Not to mention the lore of algebra, much less calculus and differential equation theorems. Then, mix in the approximations and frank virtuoso trickery -- this is NOT a compliment -- in too many derivations. Then, we have the tendency to skip and not even hint at key steps in derivations. Top off with the silly convention to now lay out the naked strings of formulae without explanation. (I grew up in the old fashioned Geometric system, where steps of reasoning were explained: Given, xyz = ABS, then ADB --> BC, by so and so's principle, etc.) We have not touched on the even weirder symbols of logic!
kairosfocus
July 7, 2012, 02:59 AM PDT
Somebody is trying to popularise entropy! More power to him. (BTW, to help students see what was going on behind introductory stat mech stuff, I once tried an exercise with students using beads and small-cube ice trays, to communicate how the same sort of randomness would emerge. As the number of beads grows relative to the number of cells, we can see a reverse J turn gradually into a bell.)
kairosfocus
July 7, 2012, 12:50 AM PDT
kf, Have you read anything at all by Arieh Ben-Naim? http://www.ariehbennaim.com/
Mung
July 6, 2012, 09:49 PM PDT
Sal, what is your background in: 1. Mathematics 2. Thermodynamics
Mung
July 6, 2012, 09:45 PM PDT
Sal et al: Kindly, cf my remarks here. KF
kairosfocus
July 6, 2012, 12:55 AM PDT
"...rather whether life’s emergence is consistent with the supposed mechanisms of origin of life and evolution."
I suppose I largely agree. My point was rhetorical, in that evolution, and the origin of life for that matter, lack a verifiable natural mechanism which doesn't itself rely upon the prior configuration of living systems - unless one includes RM+NS, which shows no sign of getting the job done. The supposed mechanisms are not forthcoming, which makes for difficult comparisons. Add to that the current academic environment, which has defined non-material causes for material effects as outside of scientific investigation, hence outside of verifiable knowledge.
Chance Ratcliff
July 5, 2012, 11:11 PM PDT
Yes I’m left wondering which mechanism of evolution, exactly, is compatible with the 2nd law. * is or isn’t compatible ^^
The proper way to frame the argument is not whether evolution is compatible or incompatible with laws, but rather whether life's emergence is consistent with the supposed mechanisms of origin of life and evolution. Why is this distinction important? Because living systems transcend physical laws, just like software transcends physical law and hardware by definition. If we suppose evolutionary mechanisms are based on physical law, but then demonstrate that life's essential features are independent of physical law, then your question becomes moot. How, you may ask, can this be done? The outline was provided in a little-known work that appeared in Cell Biology International. https://uncommondescent.com/intelligent-design/perfect-architectures-which-scream-design/
Natural mechanisms are all highly self-ordering. Reams of data can be reduced to very simple compression algorithms called the laws of physics and chemistry. No natural mechanism of nature reducible to law can explain the high information content of genomes. This is a mathematical truism, not a matter subject to overturning by future empirical data. The cause-and-effect necessity described by natural law manifests a probability approaching 1.0. Shannon uncertainty is a probability function (−log2 p). When the probability of natural-law events approaches 1.0, the Shannon uncertainty content becomes minuscule (−log2 1.0 = 0 uncertainty). There is simply not enough Shannon uncertainty in cause-and-effect determinism and its reductionistic laws to retain instructions for life. Prescriptive information (instruction) can only be explained by algorithmic programming. Such DNA programming requires extraordinary bit measurements, often extending into megabytes and even gigabytes. That kind of uncertainty reflects freedom from law-like constraints.
scordova
July 5, 2012, 10:10 PM PDT
Allen, As a follow up, I want to go into why thermal entropy should not be associated with our notions of disorder. It is an abuse of language, imho. I will use an analogy, and it will make far more sense if you dissociate the word "entropy" from "disorder".

Consider that we have 5 people and I gave you 10 $1 bills to give to the 5 people. There are a limited number of ways you could distribute it: 1. Give all $10 to one person and $0 to the other 4 2. Give $2 to all 5 people 3. Give $6 to 1 person and $4 to the other 4 etc. The amount of money that each person has is his microstate, and the way the money is distributed among the population is the microstate of the population. There will be only a limited number of ways that money can be distributed, and this number corresponds to the "money entropy" of the system. Notice "disorder" is not a very good way to describe the simple counting method of the number of creative ways we can distribute the wealth!

Now if we had more money to give, like say $100, there will be even more ways to spread the wealth. We can give: 1. $100 to one person $0 to the other 4 2. $96 to one person $1 to the other 4 3. $92 to one person $2 to the other 4 etc. Clearly there are more ways to spread the wealth when there is more money to give. This increase in the number of ways to spread the wealth can be said to correspond to higher "money entropy". The word "disorder" is not really appropriate to describe the increase of "money entropy".

How does this relate to thermal entropy? Let the money correspond to the total energy of the system. Thus instead of dollars we are talking Joules or KiloWatt Hours... The molecules are the people. The amount of kinetic energy that each molecule has is its energy microstate. Instead of dollars the microstate of each molecule is characterized by Joules or KiloWatt hours (well, electron volts are a more appropriate unit)...

Entropy is simply a measure of the ways that the total energy (like wealth) can be distributed. In quantum mechanics, where energy comes in discrete packets (like $1 bills are discrete), the analogy fits well and is actually easier to see. It's not so easy using classical mechanics, which makes the counting analogy much more difficult, but Gibbs figured ways around this (since there was no mature quantum theory in his day). The kinetic energy of a molecule in classical mechanics is 1/2 m v^2; suffice to say there is a corresponding quantum description of energy, but the nice thing is that in quantum mechanics the energy is in nice discrete amounts, which makes the counting of microstates nicer. Hopefully this shows why "disorder" is not descriptive of the notion of thermal entropy.

Mike Elzinga provided a means of testing understanding of these concepts. I fumbled some of my calculations to answer his questions about entropy in 16 molecules. His example illustrates the procedures for calculating entropy from the energy microstates of the atoms. See here for the discussion which illustrates the calculation (and some of my mistakes and subsequent corrections through others): The Skeptical Zone: 2nd Law, comment 14657

That said, I've mentioned I think "disorder" may be appropriate to describe non-thermal entropies. This obviously will be a source of confusion!
scordova
July 5, 2012, 08:10 PM PDT
Greetings, Sal: I’m curious if you might agree that the Boltzmann-Gibbs version of the second law would perhaps be easier to apply to the “reverse video” argument. I use it in my biology classes at Cornell when discussing the second law and found that students very quickly understand the concept of microstates and how they relate to entropy.
Allen! How are you? I knew you got into some hot water for teaching an ID class a few years back. I can now appreciate what it feels like to be a dissenting voice among my own colleagues. I feel your pain... :-) With respect to your question, I don't think I personally would teach it in that way. Here are my reasons. There are various kinds of entropy, and thermal entropy is only one form of entropy. Further, a few physicists get indigestion at the thought that thermal entropy means more disorder. There are some forms of entropy that might be described as disorder, other forms not. Oh and that's the other thing, there is more than one form of entropy! Thermal entropy is in principle easier to measure if one has: 1. thermometers 2. means of estimating mass or other physical properties like volume and pressure, etc. Thermal entropy is measured in Joules/Kelvin. A Joule is 1/3600000 kilowatt hours. But how are other forms of entropy measured? That is a real problem. Rob Sheldon points out:
My (Nobel nominated) college professor used to ask a rhetorical question in his thermo class, “what is the entropy change of a cow after the butcher put a 22 calibre bullet through the brain?” Yet this miniscule entropy change is supposed to tell us the difference between a “living” and “dead” cow. From a physics view point, there is almost no change in disorder, yet from a biological viewpoint it is all the difference in the world. Physics just doesn’t know how to measure this, and doesn’t even know if they ever can measure this quantity.
A student of thermodynamics will be perplexed. We know disorder has been introduced, and yet we can't make statements about how much the total entropy has increased. We can only measure the thermal entropy change (with things like thermometers and weight scales, etc.), but there is no objective measure of the real mechanical damage that has been done. It cannot be stated in terms of Joules/Kelvin, and that is a problem if one wishes to invoke physics.... Suppose you take a computer from inside a warm house out into the cold on a winter day and then crush it with a sledge hammer. While you are doing that, the cold winter air is cooling the computer and thus lowering its thermal entropy. The result is that invocation of thermodynamics doesn't really help our understanding of the deterioration of mechanical order, especially when, formally speaking, thermal entropy can be going down while mechanical disorganization is going up. A physicist can estimate the change in thermal entropy of the smashed computer using a thermometer to a couple of significant figures, but he can't give you the increase in entropy of the mechanical disorganization introduced by the sledge hammer. The problem is that ORDER and ORGANIZATION of such macroscopic objects are in the eye of the beholder. This subjective quality of what we perceive as organized vs. disorganized has relevance to design arguments, but I digress..... This inability to measure and quantify mechanical disorder makes me cringe at the thought of using the 2nd law to describe the ravages of tornadoes. One can use the 2nd law to describe the change of thermal entropy; beyond that, we get into some serious controversy, imho.
Greetings, Sal: I’m curious if you might agree that the Boltzmann-Gibbs version of the second law would perhaps be easier to apply to the “reverse video” argument. I use it in my biology classes at Cornell when discussing the second law and found that students very quickly understand the concept of microstates and how they relate to entropy.
Some will disagree with me (like Mike Elzinga), but in physics and engineering and biology there are some notions of entropy other than thermal entropy that are formally measured. I will also make the bold statement that the problem of subjectivity is painfully creeping in. Non-thermal entropy can be measured to various significant figures in Joules/Kelvin, but it is in the eye of the beholder. Let me list the notions that I'm aware of: 1. Thermal Entropy 2. Mixing Entropy 3. Configurational Entropy (like the configuration of a protein). Configurational entropy plays a role in ID, but it depends on what one wishes to label as ordered. Why? It is in the eye of the beholder when a protein (much less an organ) becomes functional or partially functional (recall the eyes of blind cave fish!). What one man says is ordered, another man can say is disordered (just like the arguments over non-coding DNA!)..... I gave an example of mixing entropy. The marbles were an illustration only, but the actual entropy of mixing with gases can be measured or inferred with great accuracy to significant figures in terms of Joules/Kelvin. This is one case where a change in entropy can be measured without any recourse to things like thermometers. But even then, there is a subjective element creeping in, because what we deem as ORDERED is somewhat in the eye of the beholder. Here is a quote from a wiki article:
This insight suggests that the idea of thermodynamic state and entropy are somewhat subjective. http://en.wikipedia.org/wiki/Gibbs_paradox
What is the problem? Perhaps the marble illustration will help. The red and blue marble mixing illustrates an increase in entropy (not thermal entropy but mixing entropy). But what if I then soaked the box of marbles in dye so that all marbles become purple and thus indistinguishable? Has the mixing entropy suddenly disappeared? The same result could happen if we merely wore glasses that would make us color blind. Or to further illustrate, let's start out with the two compartments of red and blue marbles but also with unique numbers on them. We start shaking the box and mixing the marbles. We can say the entropy increased. But if we started out the experiment with color blind glasses we might be tempted to say that entropy has not increased, unless we observe the fact that each marble is pretty much in a different location from where it started. We can do this because we can identify the individual marbles, but if there were no identifying marks (or if we ignored the identifying marks) on the marbles, we'd be saying the entropy has not changed! Thus our statements about the change in entropy are driven in part by our ability or willingness to measure it. This feels not so wholesome.... On the one hand we could say in principle the entropy has increased, but since we can't distinguish the particles, we might say it hasn't. If there is no utility (or ability) in actually distinguishing the particles, then we arbitrarily say there is no change in entropy. And this is the Gibbs Paradox.... The point being, the non-thermal forms of entropy are subjective, and once we get to macroscopic objects like bacteria, cows, cars, etc., the subjectivity of what constitutes organized versus disorganized becomes painfully in evidence. Unlike thermal entropy, which can be determined by thermometers, the non-thermal entropies (especially when measuring macroscopic organization like cars) are in the eye of the beholder.
Measurement almost becomes meaningless except maybe in terms of utility to the observer! With that on the table, back to your question:
Greetings, Sal: I’m curious if you might agree that the Boltzmann-Gibbs version of the second law would perhaps be easier to apply to the “reverse video” argument. I use it in my biology classes at Cornell when discussing the second law and found that students very quickly understand the concept of microstates and how they relate to entropy.
With thermal entropy, the tornado analogy is not appropriate for describing the microstates. When heat is applied to a pile of dirt, its thermal entropy goes up, but it is dubious to say the microstates become more disordered in the way a town does when a tornado goes through it. Many physicists will not feel comfortable saying a thermal entropy increase corresponds to more disorder in the microstates. The entropy in this case corresponds merely to the number of microstates that can realize the macroscopic properties (like temperature) of the object. It's a bit esoteric, but suffice to say, "disorder" is a very forced way to describe the situation, and I wouldn't feel comfortable using the term for thermal entropy.

Contrast this with an increase in mixing entropy or configurational entropy (such as a protein disintegrating), where position matters. There it would be fair to say that the microstates become more disordered, and I feel more comfortable with the notion of "disorder" in that context.... The problem is that the 2nd Law in engineering practice deals mostly with thermal entropy. The more foundational discipline of statistical mechanics deals with thermal, mixing, and configurational entropy. Invocation of the 2nd Law for non-thermal entropies is very forced, imho, and that is the current point of contention in this thread. I may not have answered your question, but I hope it provides insights into the subject matter.
scordova
July 5, 2012 at 6:19 PM PDT
* is or isn't compatible ^^
Chance Ratcliff
July 5, 2012 at 11:29 AM PDT
Yes, I'm left wondering which mechanism of evolution, exactly, is compatible with the 2nd law.
Chance Ratcliff
July 5, 2012 at 11:28 AM PDT
KF- I understand and believe you. Accumulations of random events do not construct multi-part machinery. I'm just saying that we don't need it, because there isn't anything to refute yet.
Joe
July 5, 2012 at 11:20 AM PDT
Joe: Would it surprise you to learn that it was thermodynamics issues -- with TBO's TMLO playing a key part -- that brought me to look at design theory in the first place? KF
kairosfocus
July 5, 2012 at 10:58 AM PDT
Sal felt so proud of his "coming out" that he had to announce it on an evo blog. But Sal does have a point: we don't need this argument, because materialism, and by extension evolutionism, are totally void of substance. And that means we can invoke the late Christopher Hitchens:
“That which can be asserted without evidence, can be dismissed without evidence.”
Joe
July 5, 2012 at 9:21 AM PDT
hi PaV: I think the key thing is to appreciate that thermodynamic barriers like this are not hard "it is forbidden" barriers; instead, the relevant outcome becomes so probabilistically implausible on reasonably accessible resources that the observation is maximally unlikely. That is, it is a practical -- as opposed to in-principle -- impossibility. It is strictly possible for a tornado to assemble a house out of components, but it is so implausible, per the relevant clusters of accessible possibilities, that it is unobservable on the gamut of our observed cosmos. KF
kairosfocus
July 5, 2012 at 7:50 AM PDT
Can you please tell me why I am still in moderation after one day?
hallowach
July 5, 2012 at 7:49 AM PDT
I think there is a fundamental quality of the 2nd Law that is overlooked: namely, that discussions of meaningful entropy change (of which "life" is one) must involve "barriers." For example, Sal's first model involved two bricks, one hot and one cold. But how could these two bricks become hot or cold unless they're separated from one another? In fact, Sal's example has the two bricks coming together in the end, wherein the two exchange heat and reach an equilibrium. This is the fundamental point: disequilibria, of which many go to make up "life," can only come about by a barrier/partition/separation of some sort. In Sal's second example, consisting of marbles, what does he do to "increase" entropy? He removes a partition! And so, to get a "decrease" of entropy, a barrier will be needed.

This is where the 2nd Law shows that "life" cannot come about via random forces. Why? Because too many barriers are needed. A cell has a membrane; it has its organelles; it has its own genetic system in place that must work, in some measure, apart from the rest of the cell (think of transcription). How did these barriers arise? How are they maintained? Where did the energy come from that maintains these barriers? This is a fundamental way of seeing how the 2nd Law militates against OOL sans ID. I'm in Italy at the moment; so, ciao!
PaV
July 5, 2012 at 6:17 AM PDT
F/N 2: The following from my always linked, app 1, uses a model of diffusion to give a bridge between microscopic and classical views, and is relevant: _________ >> 4] Yavorski and Pinski, in the textbook Physics, Vol I [MIR, USSR, 1974, pp. 279 ff.], summarise the key implication of the macro-state and micro-state view well: as we consider a simple model of diffusion, let us think of ten white and ten black balls in two rows in a container. There is of course but one way in which there are ten whites in the top row; the balls of any one colour being for our purposes identical. But on shuffling, there are 63,504 ways to arrange five each of black and white balls in the two rows, and 6-4 distributions may occur in two ways, each with 44,100 alternatives. So, if we for the moment see the set of balls as circulating among the various different possible arrangements at random, and spending about the same time in each possible state on average, the time the system spends in any given state will be proportionate to the relative number of ways that state may be achieved. Immediately, we see that the system will gravitate towards the cluster of more evenly distributed states. In short, we have just seen that there is a natural trend of change at random, towards the more thermodynamically probable macrostates, i.e. the ones with higher statistical weights. So "[b]y comparing the [thermodynamic] probabilities of two states of a thermodynamic system, we can establish at once the direction of the process that is [spontaneously] feasible in the given system. It will correspond to a transition from a less probable to a more probable state." [p. 284.] This is in effect the statistical form of the 2nd law of thermodynamics.

Thus, too, the behaviour of the Clausius isolated system above is readily understood: importing d'Q of random molecular energy so far increases the number of ways energy can be distributed at micro-scale in B, that the resulting rise in B's entropy swamps the fall in A's entropy. Moreover, given that FSCI-rich micro-arrangements are relatively rare in the set of possible arrangements, we can also see why it is hard to account for the origin of such states by spontaneous processes in the scope of the observable universe. (Of course, since it is as a rule very inconvenient to work in terms of statistical weights of macrostates [i.e. W], we instead move to entropy, through s = k ln W. Part of how this is done can be seen by imagining a system in which there are W ways accessible, and imagining a partition into parts 1 and 2. W = W1*W2, as for each arrangement in 1 all accessible arrangements in 2 are possible and vice versa; but it is far more convenient to have an additive measure, i.e. we need to go to logs. The constant of proportionality, k, is the famous Boltzmann constant and is in effect the universal gas constant, R, on a per-molecule basis, i.e. we divide R by the Avogadro Number, NA, to get: k = R/NA. The two approaches to entropy, by Clausius and Boltzmann, of course, correspond. In real-world systems of any significant scale, the relative statistical weights are usually so disproportionate that the classical observation that entropy naturally tends to increase is readily apparent.) >> _________

The link to the pivotal issue on the table -- explaining the rise of functionally specific complex organisation to build first life, and thence novel body plans -- should be clear. Such things are so isolated in the field of possible configs that they are not credibly observable on the gamut of the observed cosmos or solar system; at least, by blind chance and mechanical necessity. But FSCO/I is readily and routinely observed as the product of intelligence.

Dr Sewell is right. A tornado spontaneously assembling a house from its components is not a credible observation, on the principles of the second law. The same holds for the vastly more complex living cell. And before you go off yelling Hoyle Fallacy, I suggest you look at the task of spontaneously forming a universal constructor with a von Neumann self-replicator controlled by a coded tape. For that is a minimal requirement for metabolising, self-replicating life. Where also, no self-replication, no evolution by differential performance and locking in of genetic improvements. Recall, first life looks like 100K plus bits of genetic info, and novel body plans 10 - 100 mn bits, where 500 - 1,000 bits swamps the atomic resources of our solar system and observed cosmos. The thought exercise on forming a microjet in a vat by diffusion in app 1 here should also be instructive.
kairosfocus
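The Yavorski and Pinski ball-counting in the quoted passage is easy to verify directly. A short Python check (the `weight` function name is mine): the statistical weight of a macrostate with w white balls in the top row of ten slots is C(10, w) placements up top times C(10, 10 - w) for the remaining whites below.

```python
from math import comb

def weight(whites_on_top, n=10):
    """Microstate count for the macrostate with `whites_on_top` white
    balls in the top row of n slots (ten white and ten black balls in
    two rows of ten): C(n, w) top-row placements times C(n, n - w)
    bottom-row placements."""
    return comb(n, whites_on_top) * comb(n, n - whites_on_top)

print(weight(10))   # 1 -- a single way to have all whites on top
print(weight(5))    # 63504 -- the even 5-5 split
print(weight(6))    # 44100 -- each of the two 6-4 splits
```

The counts reproduce the textbook's figures, and the lopsided ratio (63,504 to 1) is the statistical-weight disproportion that the quoted passage turns into the statistical form of the 2nd law.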
July 5, 2012 at 5:19 AM PDT
F/N: It is clear that the thread has unfortunately largely served as an occasion for tangential, irrelevant and distractive objections. I again draw attention to the key issues in 6 above. The spontaneous formation of a metabolising, von Neumann algorithmic self-replicator based entity has nothing to do with oil-water separation due to sharp differences in inter-molecular forces (water being a polar molecule). It has everything to do with why diffusion does not spontaneously undo itself, why random text does not spontaneously form long passages of coherent meaning and function of 73 or more ASCII characters, etc. All of these have been highlighted for a long time and have been ducked, dodged or obfuscated. I suggest that the key clips cited at 6 and the 1984 discussion in TMLO chs 7 - 9 be used as a point of departure for serious discussion, taking in what Sewell actually has argued. If that is not taken up, it should serve to show the astute onlooker that we are unfortunately back at the all too familiar design-objector rhetorical tactic of red herrings and strawmen. KF
kairosfocus
July 5, 2012 at 4:55 AM PDT
Hey Groovamos, are you still having problems understanding that the symbolic phrase CdT includes the exact differential dT? You're making it seem like you have little comprehension of basic math notation. Do you not understand that CdT means the heat capacity multiplied by the exact differential dT? :-) If I inserted an extra dT as you suggested, then that equation would be wrong. Are you going to confess your error, or will I have to keep repeating it until you recant and admit you're the one with analytical problems, since apparently you can't discern that "CdT" means "C" multiplied by "dT"?

Can you explain to the UD readers how you made such an error? How were you proposing to amend the equation? Were you going to suggest using CdTdT? :-) Oh, I know, you might be thinking of the Java naming convention where different words are delimited by capitalization... yeah, right. Unfortunately, I pointed out that C is heat capacity, so you have no excuse to say you read it as "Cd" multiplied by "T". Not to mention this equation will be found in thermodynamics texts, and not to mention the equation was sourced from the linked MIT website.

So are you going to fess up to your egregious error, or will I have to keep reminding readers about it? :-) I mean, it's unbelievable that you couldn't notice that the integrand CdT/T has the term "dT" right there to the right of "C". How could you miss it? Do you not understand basic 1st-semester calculus? How can you claim you didn't see it, since you supposedly scrutinized the equation closely enough to make the erroneous claim that "dT" wasn't in the equation when it clearly was? How can you say that the symbolic phrase "CdT" does not include "dT"? That "dT" is awfully hard to miss. Are you feeling OK? Maybe you shouldn't be driving if your vision has such problems of reading omission that you can't see the "dT" that is only millimeters to the right of "C" in the phrase "CdT"!
scordova
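For readers following the CdT dispute: the integrand C dT/T is the standard entropy differential, and when the heat capacity C is constant it integrates in closed form. A minimal sketch (the function name and the sample numbers are mine, for illustration only):

```python
import math

def entropy_change(C, T1, T2):
    """Entropy change of a body with constant heat capacity C taken
    reversibly from temperature T1 to T2: the integral of C dT / T
    from T1 to T2, which evaluates to C * ln(T2 / T1)."""
    return C * math.log(T2 / T1)

# Heating a body with C = 1 J/K from 300 K to 600 K:
print(entropy_change(1.0, 300.0, 600.0))   # C * ln 2, about 0.693 J/K
```

The "dT" lives inside the integral; once the integration is done, only C and the temperature ratio remain.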
July 5, 2012 at 12:42 AM PDT
Dr. Sewell and I have come to a rare point of agreement, and I will highlight the point with which I agree from the above link to his response to me:
Obviously the origin and evolution of life do not violate the second law as stated in the early formulations you quote, but there are many formulations of this law, some more general than others.
And what formulations have I quoted? Clausius and Kelvin-Planck. Which formulations are the most prominent? According to the Wikipedia article on the 2nd Law:
The second law of thermodynamics may be expressed in many specific ways,[4] the most prominent classical statements[3] being the statement by Rudolph Clausius (1854), the statement by Lord Kelvin (1851), and the statement in axiomatic thermodynamics by Constantin Carathéodory (1909). These statements cast the law in general physical terms citing the impossibility of certain processes. They have been shown to be equivalent.
So the formulations I provided constitute 2 of the 3 major ones in use. According to Wikipedia, the 2 I quoted are equivalent to the 3rd formulation, by Carathéodory. I have not been able to verify this for myself, but if so, then the origin and evolution of life do not violate any of the three major formulations of the second law of thermodynamics. This is consistent with what Walter Bradley said. Reference: http://en.wikipedia.org/wiki/Second_law_of_thermodynamics

If ID proponents wish to use 2nd law arguments, they'll have to work from formulations of the law that are not the major ones. People have freedom of choice in this matter, but that's not the route I would take or advise others to take. I'd like to thank Dr. Sewell for responding.
scordova
July 4, 2012 at 8:53 PM PDT
Thanks for the response, scordova. Perhaps it's easier to suggest that none of the five laws precludes ID, nor gives any support to naturalistic origins. ;-) It's clear that the tornado running backwards violates some axiomatic principle. I couldn't say why no major laws of physics suggest an explanation, given that the impropriety of such an occurrence is immediately obvious. The default conclusion would seem to be that tornadoes which assemble rubble into structures violate no known physical laws.
Chance Ratcliff
July 4, 2012 at 7:38 PM PDT
Assuming there is no 2nd law violation, what law does the spontaneous generation of functional complexity violate, exactly?
Walter Bradley lists 5 of the major laws of physics:

1. classical mechanics
2. electrodynamics
3. statistical mechanics (and thermodynamics)
4. quantum mechanics
5. relativity

It would be hard to say any one such law in isolation supports ID or precludes naturalistic origins of life. Bradley points out that ID is not so much trying to say that physical laws make designs impossible except through intelligence, but rather that only a narrow set of conditions (which only intelligence can provide) acting through those laws allows designs to be achieved.
It takes energy to purposefully order and organize arrangements of matter into information.
Energy is a necessary but not a sufficient condition. An atomic bomb has enough energy in principle to build a city, but well....
scordova
July 4, 2012 at 6:29 PM PDT
Kudos to you, Sal, for being willing to be upfront in correcting a common error that, unfortunately, runs deep in the minds of a number of ID proponents. We should never accept a view simply because it is mainstream -- be it in the mainstream non-teleological community or in the mainstream ID community.
Genomicus
July 4, 2012 at 5:55 PM PDT
If one reads Sewell, one inevitably comes to the conclusion that a perfectly-mixed suspension of oil and water can never, ever spontaneously separate into two perfectly-separated phases. The numbers are simply too prohibitive. (Readers here should be able to figure this out -- just calculate the probability that 10^23 water molecules and 10^23 oil molecules can spontaneously find their ways to opposite and separate layers of, say, a cruet. The numbers make Dembski's UPB look positively inevitable.)

Sewell's real problem is that he believes that chemistry is not a factor. Because of this, he is ignorant of the fact that life, and evolution, exist because of the second law, not in spite of it. The oil/water example nicely illustrates the fundamental underlying chemistry.

A disclaimer -- if you believe Sewell, don't try the oil-water experiment. We wouldn't want to violate the second law and have to deal with the catastrophic consequences.
Arthur Hunt
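The parenthetical calculation can be sketched in a few lines. This is a rough order-of-magnitude illustration under a simplifying assumption that is mine, not Hunt's: treat each molecule as independently having a 1/2 chance of sitting in "its own" layer, so a perfect spontaneous separation of N water plus N oil molecules has probability 2^(-2N).

```python
import math

# Hypothetical simplifying model: each of the 2N molecules
# independently lands on the "correct" side with probability 1/2,
# so a perfect separation has probability 2 ** (-2 * N).
N = 1e23                               # molecules of each liquid
log10_p = -2 * N * math.log10(2)       # log10 of that probability

# log10_p is about -6.0e22, i.e. p ~ 10**(-6e22) -- next to which
# Dembski's universal probability bound of 10**(-150) is, as Hunt
# says, "positively inevitable."
print(log10_p)
```

Of course, as the surrounding comment argues, real oil and water do separate, because inter-molecular forces make the sorted arrangement energetically favored; the naive counting above is exactly what ignoring the chemistry looks like.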
July 4, 2012 at 4:25 PM PDT