
ID theorist Bill Dembski responds to Adami’s claims about spontaneous emergence of life


[Book cover: Being as Communion]

We covered Adami's claims here: Applying information theory to the origin of life?, where Adami said,

“The information-theoretic musings I have presented here should convince even the skeptics that, within an environment that produces monomers at relative ratios not too far from those found in a self-replicator, the probabilities can move very much in favour of spontaneous emergence of life,” concludes Adami.

Dembski replies,

The probabilities can move very much in favour of spontaneous emergence of life provided you introduce a search that makes the probabilities high (as by Adami’s “simplifying assumptions”).

Yeah.

Sort of like how Martha Stewart does everything better than I do, except I never get to see her army of frazzled assistants. For all I know, maybe she doesn’t have one.
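To put the quoted exchange in concrete terms, here is a minimal, purely illustrative Python sketch (the four-letter alphabet, target sequence, and supply ratios are invented for the example; this is not Adami's model) showing how much an assumed monomer supply matched to a target's composition shifts the odds of drawing that target by chance:

```python
import math
from collections import Counter

# Illustrative only: the alphabet, target sequence, and supply ratios below are
# invented for this sketch; nothing here reproduces Adami's actual model.

target = "AABBACDDAACBBADACBDA"   # hypothetical "self-replicator" sequence
alphabet = "ABCD"                  # toy 4-letter monomer alphabet

def prob_of_sequence(seq, supply):
    """Probability of drawing seq monomer-by-monomer, i.i.d., from `supply`."""
    return math.exp(sum(math.log(supply[ch]) for ch in seq))

# Case 1: uniform environmental supply of monomers.
uniform = {ch: 1.0 / len(alphabet) for ch in alphabet}

# Case 2: supply biased toward the target's own composition, i.e. monomer
# "relative ratios not too far from those found in a self-replicator".
counts = Counter(target)
biased = {ch: counts[ch] / len(target) for ch in alphabet}

p_uniform = prob_of_sequence(target, uniform)
p_biased = prob_of_sequence(target, biased)

print(f"P(target | uniform supply) = {p_uniform:.3e}")
print(f"P(target | biased supply)  = {p_biased:.3e}")
print(f"Improvement factor         = {p_biased / p_uniform:.1f}x")
# The factor grows exponentially with sequence length: the assumed supply
# ratios, not the analysis, are doing the probabilistic work.
```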

By the way, here’s more on Dembski’s book, Being as Communion. I’ve read it; it’s a game-changer. Order now if you can stand the hassle of shipping from Britain. – O’Leary for News

See The Science Fictions series at your fingertips (origin of life) for a rundown on why no naturalistic origin-of-life theory works.

Follow UD News at Twitter!

Comments
EDTA: Statistical Thermodynamics With Applications to the Life Sciences introduces statistical thermodynamics (statistical mechanics) and partition functions (cf. kf @ 8), and then shows how entropy can be given meaning as a special case of Shannon's Measure of Information (SMI).

"How do we find that number of bits?"

I think the answer to this is to find the distribution that maximizes the entropy. From the above book:
Shannon did not seek a measure of the general concept of information, only a measure of information contained in, or associated with a probability distribution. p. 65
Within this large subset, denoted SMI, we identify an even smaller subset denoted "Entropy." This subset contains all the distributions relevant to a thermodynamic system at equilibrium. Clearly, these are only a tiny subset of distributions on which the SMI is definable. There is a subtle point in formulating the principles of maximum entropy in thermodynamics, and the maximum SMI in all other fields. In thermodynamics, the maximum of entropy is over the manifold of constrained equilibrium states. In all other cases, and especially the cases discussed in this chapter, we use the principle of maximum SMI over all possible distributions. But it is only the maximal SMI which we identify as entropy. For all distributions, except those that maximize SMI, the system is not at equilibrium. Therefore, for such systems the entropy is not even defined. p. 118
Mung
September 15, 2014 at 6:46 PM PDT
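As a purely illustrative aside (this toy three-outcome grid search is ours, not the book's), here is a brute-force Python sketch of the maximum-SMI idea: with no constraints the SMI-maximizing distribution is the uniform one; in thermodynamics the maximization is instead carried out under constraints such as fixed mean energy, and it is only that constrained maximum which gets identified as entropy.

```python
import math
import itertools

def smi(p):
    """Shannon's Measure of Information (in bits) of a probability distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Brute-force search over three-outcome distributions on a coarse grid:
# with no constraints, which distribution maximizes SMI?
best_p, best_h = None, -1.0
steps = 50
for i, j in itertools.product(range(steps + 1), repeat=2):
    a, b = i / steps, j / steps
    if a + b <= 1.0:
        p = (a, b, 1.0 - a - b)
        h = smi(p)
        if h > best_h:
            best_p, best_h = p, h

print("Approximate SMI-maximizing distribution:", [round(x, 2) for x in best_p])
print("Maximum SMI found (bits):", round(best_h, 4))   # ~log2(3) = 1.585, at the uniform distribution
```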
PS: Clip from sect A my note: >> Summarising Harry Robertson's Statistical Thermophysics (Prentice-Hall International, 1993) -- excerpting desperately and adding emphases and explanatory comments, we can see, perhaps, that this should not be so surprising after all. (In effect, since we do not possess detailed knowledge of the states of the very large number of microscopic particles of thermal systems [typically ~ 10^20 to 10^26; a mole of substance containing ~ 6.023*10^23 particles, i.e. the Avogadro Number], we can only view them in terms of those gross averages we term thermodynamic variables [pressure, temperature, etc], and so we cannot take advantage of knowledge of such individual particle states that would give us a richer harvest of work, etc.)

For, as he astutely observes on pp. vii - viii:

. . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . .

And, in more detail (pp. 3 - 6, 7, 36; cf Appendix 1 below for a more detailed development of thermodynamics issues and their tie-in with the inference to design; also see recent ArXiv papers by Duncan and Semura here and here):

. . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . .

A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy, cf. discussions here, here, here, here and here; also Sarfati's discussion of debates and the issue of open systems here . . . ]

H({pi}) = - C [SUM over i] pi*ln pi, [. . . "my" Eqn 6]

[where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . .

[H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . .

A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . .

Jaynes's [summary rebuttal to a typical objection] is: ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.] >>

kairosfocus
September 14, 2014 at 10:13 PM PDT
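For readers who want to see the excerpt's formulas run, here is a short sketch under assumed, toy energy levels (not Robertson's): it builds the partition function Z, the probabilities p_i = exp(-beta*y_i)/Z, the informational entropy H = -SUM p_i ln p_i, and checks S = k_B*H against the standard identity S = k_B(ln Z + beta*<E>).

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # temperature (K); arbitrary choice for the sketch
beta = 1.0 / (k_B * T)

# Hypothetical energy levels y_i in joules; any small spectrum works here.
levels = [0.0, 1.0e-21, 2.0e-21, 5.0e-21]

# Partition function: Z = SUM_i exp(-beta * y_i)
Z = sum(math.exp(-beta * y) for y in levels)

# Equilibrium probabilities: p_i = exp(-beta * y_i) / Z
p = [math.exp(-beta * y) / Z for y in levels]

# Informational entropy H = -SUM_i p_i ln p_i (in nats), and S = k_B * H with C = k_B
H = -sum(pi * math.log(pi) for pi in p)
S = k_B * H

# Cross-check against the standard identity S = k_B * (ln Z + beta * <E>)
E_mean = sum(pi * y for pi, y in zip(p, levels))
S_check = k_B * (math.log(Z) + beta * E_mean)

print(f"H       = {H:.6f} nats")
print(f"S       = {S:.6e} J/K")
print(f"S check = {S_check:.6e} J/K")
```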
EDTA: If you want it full-bore, try Ch. 1 of Robertson's Statistical Thermophysics, which I excerpt and summarise in my note; but beware, you need to recognise what a partition function is, etc. This is not a simple subject.

The bottom line is that there is a difference between what entropy is measuring (and degrees of molecular freedom lead to the same import) and how one transfers over into energy units linked to conventional scales.

As a first look, notice that Boltzmann's expression, S = k_B log W, is looking at W, the number of ways mass and energy can be arranged at micro levels compatible with macro conditions, i.e. already a metric of degree of uncertainty. Take a log and, up to a multiplication by -1, you have an info metric [the underlying inference is equiprobability]; k_B converts to energy-linked terms. The Gibbs expression is already in terms of adjusted probabilities and is directly comparable.

Yes, many still dispute this, but it is not hard to see that there is a serious point. Indeed, probabilities other than 0 and 1 are measures of degree of ignorance, or, on the dual, of information. KF

kairosfocus
September 14, 2014 at 10:06 PM PDT
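A tiny sketch of the point that the Gibbs expression reduces to Boltzmann's S = k_B ln W when the W microstates are equiprobable (W below is an arbitrary toy value):

```python
import math

k_B = 1.380649e-23   # J/K
W = 10**6            # number of equally probable microstates (arbitrary toy value)

# Boltzmann: S = k_B ln W
S_boltzmann = k_B * math.log(W)

# Gibbs: S = -k_B * SUM_i p_i ln p_i, with p_i = 1/W for every i (W identical terms)
p = 1.0 / W
S_gibbs = -k_B * W * p * math.log(p)

print(f"S (Boltzmann) = {S_boltzmann:.6e} J/K")
print(f"S (Gibbs)     = {S_gibbs:.6e} J/K")   # the Gibbs form reduces to k_B ln W
print(f"In bits: log2(W) = {math.log2(W):.2f}")
```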
KF, thanks for the link to your page. I'll try to get through that, but there's a lot there! I have, however, read published papers that claim to point out allegedly irreconcilable differences between Shannon and thermo entropy. See an example at this link.

And feel free to set me straight here, since I'm not a physicist, but one question has always bugged me: if indeed thermo entropy is a measure of ignorance about the molecular state of some system, then there should be some absolute number of bits which would suffice to describe the system completely, i.e., some amount of information that could pinpoint the particles to within, say, 100 nm regions. How do we find that number of bits? Things like Boltzmann's constant don't contain terms that would cancel with volume, moles, etc.

I glanced at your page, KF, and see that you approach this matter. I will try to get through it sometime. Thanks.

EDTA
September 14, 2014 at 8:50 PM PDT
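On the informational reading kairosfocus describes, one conventional answer to EDTA's "how many bits" question is to divide the thermodynamic entropy of the macrostate by k_B ln 2. A rough sketch, using an assumed, merely illustrative molar entropy of 100 J/(mol K):

```python
import math

k_B = 1.380649e-23   # J/K
ln2 = math.log(2)

# Illustrative figure only: a molar entropy on the order of 100 J/(mol K) is
# typical of simple substances near room temperature; real values vary.
S_molar = 100.0      # J/(mol K), assumed for the sketch

# On the informational reading: missing information (bits) = S / (k_B ln 2)
bits_per_mole = S_molar / (k_B * ln2)

print(f"Missing information: about {bits_per_mole:.2e} bits per mole")
# ~1e25 bits. Boltzmann's constant alone doesn't yield a bit count because the
# count comes from S itself, which carries the size/volume/mole dependence.
```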
EDTA: I suggest you acquaint yourself with the informational school of thermodynamics, e.g. this admission against ideological interest at Wikipedia:
At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing.

But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >> in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate. >>])

Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
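To put the excerpt's scale comparison in numbers, here is a rough, hedged sketch (the 334 J/g latent heat of fusion is the usual textbook figure for ice; the terabyte comparison is arbitrary) converting the entropy change of melting one gram of ice into bits, with Landauer's bound run in the opposite direction:

```python
import math

k_B = 1.380649e-23   # J/K
ln2 = math.log(2)

# Illustrative numbers: melting 1 g of ice at its melting point.
# The latent heat of fusion of ice is roughly 334 J/g.
Q = 334.0            # J absorbed by the ice
T = 273.15           # K
dS = Q / T           # entropy increase of the ice, ~1.2 J/K

bits_equivalent = dS / (k_B * ln2)   # ~1e23 bits
one_terabyte = 8e12                  # bits, for comparison

print(f"Entropy change: {dS:.3f} J/K")
print(f"In bits:        {bits_equivalent:.2e}")
print(f"Terabytes:      {bits_equivalent / one_terabyte:.2e}")

# Landauer's bound runs the conversion the other way: erasing one bit at
# temperature T dissipates at least k_B * T * ln 2 joules of heat.
print(f"Landauer bound at 300 K: {k_B * 300 * ln2:.2e} J per bit erased")
```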
As Appendix A of my always-linked briefing note (click on my handle on this or any other comment I make here at UD) underscores, mere opening up of a system is not enough to overcome the linked search-space challenge. Indeed, it is easy to show that Darwin's warm little pond was energy-importing, and that an energy-importing system naturally INCREASES entropy.

What generally creates shaft work and its product, organisation or at least order, is a mechanism that couples inflowing energy, converts some of it to that work [= ordered, forced motion], and exhausts waste heat. Sometimes such heat engines or energy-conversion devices appear spontaneously, such as a hurricane. But in every case where the device in question produces FSCO/I-rich organisation etc., where we observe the source directly, the cause is an intelligent one. This has to do with the implied beyond-astronomical atomic-resources search challenge discussed here, for instance.

After all the handwaving and dismissive talk (often dressed up in a lab coat) is done, there is in fact no empirically anchored, reasonable explanation of how a gated, encapsulated, metabolising automaton with a code-using, algorithmic, self-maintaining and replicating facility could and did arise. Instead we too often see an attempted redefinition of science that is ideologically loaded with evolutionary materialism as a censoring a priori. If you doubt me, here is the US NSTA (National Science Teachers Association) Board in 2000, claiming to speak in the name of science and science education:
The principal product of science is knowledge in the form of naturalistic concepts and the laws and theories related to those concepts . . . . Science, by definition, is limited to naturalistic methods and explanations and, as such, is precluded from using supernatural elements in the production of scientific knowledge. [NSTA Board of Directors, July 2000]
(And there is a lot more where that came from.) Philip Johnson's retort to Lewontin is still apt in reply to all such ideological impositions on scientific discussion of origins:
For scientific materialists the materialism comes first; the science comes thereafter. [Emphasis original] We might more accurately term them "materialists employing science." And if materialism is true, then some materialistic theory of evolution has to be true simply as a matter of logical deduction, regardless of the evidence. That theory will necessarily be at least roughly like neo-Darwinism, in that it will have to involve some combination of random changes and law-like processes capable of producing complicated organisms that (in Dawkins’ words) "give the appearance of having been designed for a purpose." . . . . The debate about creation and evolution is not deadlocked . . . Biblical literalism is not the issue. The issue is whether materialism and rationality are the same thing. Darwinism is based on an a priori commitment to materialism, not on a philosophically neutral assessment of the evidence. Separate the philosophy from the science, and the proud tower collapses. [Emphasis added.] [The Unraveling of Scientific Materialism, First Things, 77 (Nov. 1997), pp. 22 – 25.]
KF

kairosfocus
September 14, 2014 at 1:27 AM PDT
What kind of idiotic babble is that? If the environment produces monomers at similar ratios to a self-replicator, it then moves the odds in favor of the origin of life? So if the environment produces amino acids (monomers) in the right ratios, you get a living cell? No. No you don't. You get a bunch of amino acids is what you get.

Jehu
September 13, 2014 at 9:43 PM PDT
Reducing the origin of life to the origin of information does not actually reduce the problem. It does however allow materialists to do a lot of hand waving and smuggling in of information which is not obvious to those of us who are mathematically naive. Just as Avida "proved" evolution could happen in silico, so these guys will soon "prove" that origin of life can happen in silico. I am thankful that we have Professor Dembski and other maths whizzes who can call them out.

idnet.com.au
September 13, 2014 at 9:28 PM PDT
Interesting article by Adami. I was hooked shortly into the article, where the author says, "...there is no remnant of the original set of molecules that began their fight against the second law of thermodynamics." So here is yet another admission by a non-ID, non-YEC author that life is a fight against the Second Law.

Haven't we had this conversation before? Wasn't Rob Sheldon or someone involved in explaining that with ideal gases we understand entropy, but with non-gas systems we barely have the ability to describe the associated entropy mathematically?

I get that the formulas for thermodynamic entropy and information-theoretic entropy have the same form. But it makes no sense to me to equate thermo entropy (measured in something like joules per kelvin per mole) with bits of information except under extremely limited circumstances. So is this researcher off in the weeds here in speaking about thermodynamic entropy in the same breath as search probabilities involving linear molecule chains?

EDTA
September 13, 2014 at 9:11 PM PDT
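On EDTA's remark that ideal gases are the case where entropy is well understood: the textbook Sackur-Tetrode formula gives the entropy of a monatomic ideal gas from first principles, and dividing by k_B ln 2 expresses the same quantity in bits. A sketch assuming neon near standard conditions (illustrative values only):

```python
import math

k_B = 1.380649e-23        # J/K
h   = 6.62607015e-34      # J s
N_A = 6.02214076e23       # 1/mol
R   = k_B * N_A           # ~8.314 J/(mol K)

# Monatomic ideal gas, neon, near standard conditions (assumed: 1 bar, 298.15 K).
T = 298.15                # K
P = 1.0e5                 # Pa
m = 20.18 * 1.66054e-27   # kg per atom (neon)

# Thermal de Broglie wavelength and volume per particle
lam = h / math.sqrt(2.0 * math.pi * m * k_B * T)
v_per_particle = k_B * T / P

# Sackur-Tetrode entropy per mole: S = N k_B [ ln( V / (N lambda^3) ) + 5/2 ]
S_molar = R * (math.log(v_per_particle / lam**3) + 2.5)

# Convert to an information measure: bits = S / (k_B ln 2)
bits_per_mole = S_molar / (k_B * math.log(2))

print(f"Sackur-Tetrode molar entropy: {S_molar:.1f} J/(mol K)")   # roughly 146, close to tabulated neon
print(f"Equivalent missing information: {bits_per_mole:.2e} bits per mole")
```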
OT: God and Science - Part 1 | Dr. Hugh Ross (Sept. 7, 2014) https://www.youtube.com/watch?v=LcGX_6AiHKI&list=UUauB3xW5bf0STGotAxLj4vA
"the God of the universe is actually the God of the scientific world as well."

bornagain77
September 13, 2014 at 5:13 PM PDT
I wonder if he has adolescent or grown-up children who kid him that they still believe in Father Christmas - because they know he's happier believing it.

Axel
September 13, 2014 at 3:41 PM PDT
