
Gordon Davisson’s Talk Origins Post of the Month (October 2000)


I’m not aware of any policy restriction on providing links to TalkOrigins at UD. I’ve linked to TalkOrigins a lot, since I’ve criticized it so much in the past.

In this case, I actually concur with one of Gordon Davisson’s essays there, and I think it rightly earned Post of the Month 14 years ago at TalkOrigins.

When I teach ID, I feel the ideas must be defensible to science undergraduate and graduate students and science faculty (physics, chemistry, mathematics, engineering). The internet has been an opportunity to vet some of my ideas. Gordon has been of great assistance with his balanced and insightful critiques of my essays at UD.

I link to Gordon’s essay because we independently arrived at the same conclusion regarding the relationship between thermodynamic entropy and Shannon entropy, namely:

1 Joule/Kelvin = 1 / (1.381 × 10⁻²³) / ln(2) Shannon bits

= 1.045 × 10²³ Shannon bits

provided the microstates under consideration for Shannon entropy are the energy microstates.
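
For the numerically inclined, here is a quick sanity check of that conversion (a minimal sketch in Python; the function name and the CODATA value of Boltzmann’s constant are mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (rounded to 1.381e-23 above)

def joules_per_kelvin_to_bits(s_thermo):
    """Convert a thermodynamic entropy in J/K to Shannon bits.

    One bit corresponds to k_B * ln(2) joules per kelvin,
    so we divide by that factor.
    """
    return s_thermo / (K_B * math.log(2))

print(joules_per_kelvin_to_bits(1.0))  # ~1.045e+23 bits, matching the figure above
```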

In the 500 fair coins example, I typically say the Shannon entropy is 500 bits because I consider the heads/tails configurations to be the symbols generally under consideration, but if we consider instead the energy microstates at the molecular level of 500 coins, one will get on the order of 8.636 × 10²⁵ bits — a number which dwarfs even the supposed Universal Probability Bound of 500 bits! 😯
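
For a back-of-the-envelope check of where a number of that order comes from (the 3.11 g coin mass and copper’s standard molar entropy below are assumed illustration values, so this reproduces the order of magnitude rather than the exact figure):

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
S_MOLAR_CU = 33.2       # standard molar entropy of copper, J/(mol*K), ~298 K
M_CU = 63.55            # molar mass of copper, g/mol
COIN_MASS = 3.11        # grams per coin -- an assumed value for illustration

s_thermo = 500 * (COIN_MASS / M_CU) * S_MOLAR_CU    # J/K for 500 copper coins
bits = s_thermo / (K_B * math.log(2))               # convert to Shannon bits

print(f"{s_thermo:.0f} J/K ~ {bits:.2e} bits")      # ~8e25 bits, vs. 500 bits for
                                                    # the heads/tails description
```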

I made the derivation in two places. Mainly at:
Shannon Information, Entropy, Uncertainty in Thermodynamics and ID

and indirectly in:

Creation Evolution University: Clausius, Boltzmann, Dembski

Here is Gordon’s essay:
Information and Thermo-Entropy. I present his essay in gratitude for his many constructive criticisms of my work at UD.

Technical comments and corrections are welcome.

Comments
Salvador:
What is the entropy contribution due to this burst of energy from the heater?
What does it even mean to say that this burst of energy contributes entropy?
Mung
August 2, 2014 at 9:34 PM PDT
Gordon Davisson:
Ok, I'm feeling foolish, so I'll take the plunge and try to defend the idea that thermodynamics is connected to information theory (though the connection is probably not what you'd expect).
I don't have a clear idea yet to what extent thermodynamics is connected to information theory, but certainly the entropy concept in thermodynamics is connected to the entropy concept in information theory. Shannon's Measure of Information (SMI) is defined for any probability distribution. The entropy concept in thermodynamics is defined only for some probability distributions. As such, entropy in thermodynamics is a special case of SMI (Ben-Naim 2013). Manfred Eigen (2013) takes it a step further:
An entropy can be assigned to any probability distribution: Then, entropy is nothing but the average logarithm of all the probabilities in a (possibly very steep) distribution. In fact, we shall encounter entropy in many other situations, ones which are hardly related to thermodynamics.
Mung
August 2, 2014 at 9:20 PM PDT
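
Mung’s point that SMI is defined for any probability distribution is easy to make concrete (a minimal sketch; the function name is mine):

```python
import math

def smi_bits(p):
    """Shannon's Measure of Information, -sum(p_i * log2(p_i)), in bits.

    Defined for any probability distribution, thermodynamic or not.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(smi_bits([0.5, 0.5]))    # 1.0 bit  -- a fair coin
print(smi_bits([0.25] * 4))    # 2.0 bits -- a uniform four-letter alphabet
```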
GD: A replication process for a DNA of relevant length is a manufacturing process of enormous implied functionally specific complex organisation and associated information. That is why it is wise to start considerations in a Darwin's warm pond or equivalent with "plausible" initial molecules, then move up from there.

I am sure you will agree that the config space for such a pond or ensemble will be huge, and there is a serious search-space challenge to blindly -- per chance and mechanical necessity -- get to a gated, encapsulated, metabolising automaton with a coded von Neumann self-replicating facility. Thereafter, getting to novel body plans poses an even bigger FSCO/I challenge, given that functional specificity and complexity imply isolated clusters of configs in the space.

The result is that, so far, we have a theory of micro-evo within islands of smoothly varying function [well-behaved fitness], rather than what is advertised: an empirically credible and analytically plausible scientific account of the origin of main body plans [macro-evo], starting with the origin of the first. And I include the first, as it is normal to include such in presentations to secondary or tertiary students and the general public, with an air of supreme, lab-coat-clad confidence. KF
kairosfocus
July 31, 2014 at 3:09 AM PDT
Yep; entropy's status as a state function is pretty central to how it's used in thermodynamics, so the questions the information connection raises are ... interesting.

To illustrate the problem, suppose we constructed an RNA strand with a random sequence of bases. From what I said in the article, the randomness of the sequence will add to its entropy. Now, suppose we replicated the strand. The second strand is also random, but the process that made it could only produce one sequence. So they're identical, but the second has less entropy? And it has less entropy because of how it was made? That's a clear violation of the principle that entropy depends only on a system's current state, not how it got into that state.

So what sense can we make of this? At least in this case, I think it's clear that they both have the same (higher) entropy, but that the entropy of the two together is less than the sum of their individual entropies. In information theory, we'd say their joint entropy is less than the sum of their separate entropies, and the difference, called their mutual information, is a measure of how much their messages (microstates) are correlated.

This solution works neatly in this case, but what about an RNA strand constructed by a process that produces a deterministic sequence? In that case, there might not be a clear other thing that you can say it's correlated to. I think the approach still works, it's just messy in practice. But I wouldn't take that opinion too seriously. I haven't thought this through all that thoroughly, and I'm far out of touch with the current research and literature on the subject. I also wouldn't worry about it too much -- the "problem" is too small to matter in practice. In fact, I wouldn't be too unhappy saying that entropy is only approximately a state function.
Gordon Davisson
July 30, 2014 at 7:38 PM PDT
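
Gordon's joint-entropy resolution can be put in numbers (a minimal sketch; the 100-base strand and uniform four-letter alphabet are illustrative assumptions):

```python
import math

n = 100                          # bases in the strand (illustrative)
h_per_base = math.log2(4)        # 2 bits per base for a uniform RNA alphabet

h_original = n * h_per_base      # 200 bits for the random strand
h_copy = n * h_per_base          # the copy, viewed alone, is just as "random"
h_joint = n * h_per_base         # jointly, the copy adds no new uncertainty

# Joint entropy (200 bits) is less than the sum of the separate
# entropies (400 bits); the difference is the mutual information.
mutual_info = h_original + h_copy - h_joint   # 200 bits of correlation
print(h_original, h_copy, h_joint, mutual_info)
```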
Gordon Davisson:
I've been working on this essay for a while now, but I don't really consider it complete yet (I haven't finished the reading I should do, and there are several more things that I need to cover -- especially the question of entropy's status as a state function).
But that's a pretty important question to answer, isn't it? Arieh Ben-Naim writes:
Answering the questions, "What is entropy?" and "Why does it change in one direction?" leaves open the question "How does the system move from one state to another?" Fortunately, neither thermodynamics nor statistical mechanics deals with the question of how the system evolves from an initial to the final state. We always assume that both the initial and the final states are equilibrium states. Thus, both thermodynamics and statistical mechanics deal with the difference in the entropy between two equilibrium states. This is the essence of the meaning of a "state function." It is a function, the value of which is determined by the state of the system, not how the system reached that state. - Entropy and the Second Law: Interpretation and Misss-Interpretationsss, p. 203
Mung
July 29, 2014 at 4:55 PM PDT
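
The "state function" property Ben-Naim describes can be checked directly for a simple system (a sketch assuming one mole of monatomic ideal gas; the states and paths are arbitrary choices):

```python
import math

R = 8.314                 # gas constant, J/(mol*K)
CV = 1.5 * R              # molar heat capacity of a monatomic ideal gas
T1, V1 = 300.0, 1.0       # initial equilibrium state (K, arbitrary volume units)
T2, V2 = 600.0, 2.0       # final equilibrium state

def delta_s(ta, va, tb, vb):
    """Entropy change of 1 mol ideal gas between two equilibrium states."""
    return CV * math.log(tb / ta) + R * math.log(vb / va)

# Path A: heat at constant volume, then expand isothermally.
path_a = delta_s(T1, V1, T2, V1) + delta_s(T2, V1, T2, V2)
# Path B: expand isothermally first, then heat at constant volume.
path_b = delta_s(T1, V1, T1, V2) + delta_s(T1, V2, T2, V2)

print(path_a, path_b)     # identical: Delta-S depends on the states, not the path
```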
There's also no policy restriction against letting people discuss Gordon's paper here. If you don't wish to allow comments in a thread, you can turn them off.
Mung
July 27, 2014 at 4:47 PM PDT
Thanks for the link; I'm remembering a policy from maybe a decade ago, and might well be thinking of something from a completely different site. Since it's no longer in force (if it ever was), I won't worry about it. BTW, I'd also like to thank you for contributing some clarity to the debate about thermodynamics -- my experience has been that most people (on both sides of the pro/anti-evolution debate) don't really understand thermodynamics very well, but a disturbing number (again on both sides) nonetheless feel the need to argue about it. The result is lots of misunderstandings being hurled back and forth... So it's distinctly gratifying when people on opposite sides can actually agree about something like this (and the agreement seems to stem from understanding, not just exhaustion).
Gordon Davisson
July 24, 2014 at 10:20 PM PDT
Sal, we have no internal policy restrictions on providing links to anything. We avoid sites that are *apparently* in violation of copyright (but we are not Internet experts on that). Warnings about crude language or other unsuitable content are highly advisable as needed. One problem is that, as the Internet is new to the judiciary, some court judgments have upheld the idea that linking to something illegal or objectionable constitutes an offense. That is hardly likely to apply to anything at the site you mention, or others similar.
News
July 24, 2014 at 8:54 AM PDT
