
A note: On entropy, the Macro-micro information gap [MmIG] and the OOL challenge of getting from Darwin’s pond-state to living cell state (a gated encapsulated metabolising automaton with informationally controlled self-replication) without intelligently directed organising work (IDOW)


Sal C has begun a series of UD posts on entropy, thermodynamics and info challenges.

I have thought it important to highlight the macro-micro info-gap issues underscored by Jaynes et al., and to raise the issue of spontaneously moving from Darwin’s pond-state to cell-state (whether in increments or not does not materially affect the point) without intelligently directed organising work. This sets the context for the design inference on OOL, in light of our broad base of experience on the source of FSCO/I:

Materials + Energy sources + IDOW --> FSCO/I

Where FSCO/I is evident from functional specificity and complexity of organised entities. Such may also directly store information in physical data structures such as control tapes. Of course, D/RNA is just such a data-storing string.

So, let me clip comment 2 in the thread, and augment it with a picture:

_____________

>> Let’s start with

S = k*log W, per Boltzmann

where W is the number of ways that mass and/or energy at ultra-microscopic level may be arranged, consistent with a given Macroscopic [lab-level observable] state.
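To make W concrete, here is a minimal Python sketch (my illustration, using the textbook Einstein-solid toy model rather than any real substance) of how enormous the count of micro-arrangements is even for a tiny system, and how S = k log W converts that count into entropy:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def multiplicity(N, q):
    """Ways W to distribute q energy quanta among N oscillators
    (Einstein-solid toy model): W = C(q + N - 1, q)."""
    return math.comb(q + N - 1, q)

def boltzmann_entropy(W):
    """S = k * log W, with the natural logarithm."""
    return k_B * math.log(W)

# Even 100 oscillators sharing 100 quanta (one macro-observable
# energy) admit a vast number of micro-arrangements:
W = multiplicity(100, 100)
print(f"W = {W:.3e}")                            # ~ 4.5e58 ways
print(f"S = {boltzmann_entropy(W):.3e} J/K")     # ~ 1.9e-21 J/K
print(f"missing info: {math.log2(W):.0f} bits")  # ~ 195 bits
```

Even so small a system leaves us nearly 200 bits short of knowing its exact microstate from the macro-observation alone.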

That constraint is crucial, and it brings out a key subtlety in the challenge of creating functionally specific organisation in complex [multi-part] systems through forces of blind chance and mechanical necessity.

FSCO/I is generally deeply isolated in the space of raw configurational possibilities, and is not normally created by nature working freely. Nature, working freely on the gamut of our solar system or of the observed cosmos, will blindly sample the space from some plausible, typically arbitrary, initial condition; thereafter it will undergo a partly blind random walk, though mechanical dynamics may also impress a certain orderly motion or the like.

(Think about molecules in a large parcel of air participating in wind and weather systems. The temperature is a metric of the average random energy per degree of freedom of the relevant particles, usually translational, rotational and vibrational. At the same time, the body of air as a whole drifts along in a wind that may reflect planetary-scale convection.)

Passing on to Shannon’s entropy in the information context (and noting Jaynes et al. on the informational view of thermodynamics, which I do not see adequately reflected in your remarks above — there are schools of thought here, cf. my note here on): what Shannon was capturing is the average info per symbol transmitted in the case of non-equiprobable symbols, the normal state of codes. This turns out to link to the Gibbs formulation of entropy you cite. And, I strongly suggest you look at Harry S. Robertson’s Statistical Thermophysics, Ch. 1 (Prentice-Hall), to see what, from appearances, your interlocutors have not been telling you. That is, there is a vigorous second school of thought within physics on stat thermo-d, one that bridges to Shannon’s info theory.
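As a quick illustration of H for a non-equiprobable alphabet (a sketch I add here; the probabilities are made up for demonstration):

```python
import math

def shannon_H(probs):
    """Average information per symbol, H = -SUM p_i log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equiprobable 4-symbol alphabet (think of the four D/RNA bases,
# were they equally frequent): 2 bits per symbol.
print(shannon_H([0.25, 0.25, 0.25, 0.25]))   # 2.0

# Non-equiprobable symbols (the normal state of codes) average
# less information per symbol:
print(shannon_H([0.5, 0.25, 0.125, 0.125]))  # 1.75

# Gibbs' S = -k SUM p_i ln(p_i) has exactly this form, with k and
# the natural log in place of the base-2 convention.
```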

Wikipedia bears witness to the impact of this school of thought:

At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann’s constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing.

But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon’s information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics. [Also, another article remarks: >>in the words of G. N. Lewis writing about chemical entropy in 1930, “Gain in entropy always means loss of information, and nothing more” . . . in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell’s demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox).
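To put a number on Landauer’s bound as just cited (my arithmetic, using the standard k·T·ln 2 minimum cost per bit erased at temperature T):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
T   = 300.0         # room temperature, K

# Landauer's minimum thermodynamic cost of erasing one bit:
print(f"{k_B * T * math.log(2):.2e} J per bit")  # ~ 2.87e-21 J

# Equivalent entropy increase per bit the demon acquires and
# must eventually reset:
print(f"{k_B * math.log(2):.2e} J/K per bit")    # ~ 9.57e-24 J/K
```

Tiny per bit, but multiplied across the ~10^23 molecules of even a small parcel of matter, it swamps any entropy the demon could shave off.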

So, when we see the value of H in terms of uncommunicated micro-level information given the lab-observable state, we see that entropy, traditionally understood per stat mech [degrees of micro-level freedom], is measuring the macro-micro info-gap [MmIG], NOT the info we have in hand per macro-observation.

The subtlety this leads to is that when we see a living unicellular organism of species x, and we know its genome through lab-level observation, we know a lot about its specific molecular states. The MmIG is much smaller, as there is a sharp constraint on possible molecular-level configs once we have a living organism in hand. When it dies, the active, informationally directed maintenance of that state ceases and spontaneous changes take over. The highly empirically reliable result is well known: decay and breakdown to simpler component molecules.
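A toy calculation (my numbers, purely illustrative) shows how sharply such a constraint shrinks the MmIG:

```python
import math

def mmig_bits(num_microstates):
    """Macro-micro info gap: bits still needed to pin down the exact
    microstate, given only what the macro-level observation tells us."""
    return math.log2(num_microstates)

# Toy model: a 100-monomer polymer drawn from a 20-letter alphabet.
# Pond state: any sequence is consistent with the macro description.
print(f"pond-state MmIG: {mmig_bits(20 ** 100):.0f} bits")  # ~ 432 bits

# Cell state: suppose observing a living organism (genome in hand)
# constrains the sequence to a family of ~1e6 functional variants.
print(f"cell-state MmIG: {mmig_bits(10 ** 6):.0f} bits")    # ~ 20 bits
```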

{Mignea gives a good idea of what this implies:

Fig. A: Mignea’s schematic of the requisites of kinematic self-replication, showing duplication and arrangement, then separation into daughter automata. This requires stored algorithmic procedures, descriptions sufficient to construct components, means to execute instructions, materials handling, controlled energy flows, wastes disposal and more. (Source: Mignea, 2012, slide show; fair use. Presentation speech is here.) }

We also know that, in the period of historic observation and record going back to the days of early microscopy 350 years ago, this state is passed on from generation to generation by algorithmic processes. Such a system is in a programmed, highly constrained state: gated encapsulation, metabolic automata that manage an organised flow-through of energy and materials [much of this in the form of assembled smart polymers such as proteins], all backed up by a von Neumann self-replicator [vNSR].
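The vNSR logic itself can be sketched abstractly (a toy Python sketch of von Neumann’s constructor/copier/controller architecture; it is of course no model of cell chemistry):

```python
def construct(tape):
    """A, universal constructor: build a machine from its description."""
    return {"machine": tape["name"], "tape": None}

def copy_tape(tape):
    """B, copier: blindly duplicate the stored description."""
    return dict(tape)

def replicate(tape):
    """C, controller: construct offspring, then install a tape copy.
    The description is used twice: as instructions and as data."""
    child = construct(tape)
    child["tape"] = copy_tape(tape)
    return child

tape = {"name": "replicator", "parts": ["A", "B", "C"]}
child = replicate(tape)
grandchild = replicate(child["tape"])  # and so on, generation to generation
print(grandchild)
```

The key point von Neumann saw: the description must be both executed (to build) and merely copied (to inherit), which parallels the dual use of D/RNA in the cell.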

We can also infer that this pattern extends right back to the origins of cell-based life, on the relevant macro-traces of such life.

So, how do we transition from Darwin’s warm pond with salts [or the equivalent] state, to the living cell state?

The dominant OOL school, under the imposition of methodological naturalism, posits a chemical-evolution process of spontaneous cumulative change. This runs right into the problem of spontaneously accessing deeply isolated configs.

For, sampling theory and common sense alike tell us that pond state will dominate over spontaneous emergence of organised cell states (or any reasonable intermediates). Pond state commands the overwhelming bulk of configs, and living systems face some very adverse chemical reaction equilibria, overcome in the living cell by gating, encapsulation and internal functional organisation that uses coded data and a steady flow of ATP energy-battery molecules to drive algorithmic processes.
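The scale of the problem can be put in rough numbers (the stock figures used in these discussions, stated here as assumptions, not measurements):

```python
# Generous upper bound on configurations our solar system could sample:
atoms       = 1e57   # assumed atom count of the solar system
lifetime_s  = 1e17   # ~ age of the cosmos, in seconds
rate_per_s  = 1e14   # assumed fast chemical-event rate per atom per second
max_samples = atoms * lifetime_s * rate_per_s           # ~ 1e88 events

# Configuration space of just 500 bits of functionally specific info:
space = 2.0 ** 500                                      # ~ 3.27e150 configs

print(f"max samples      : {max_samples:.1e}")
print(f"config space     : {space:.2e}")
print(f"fraction sampled : {max_samples / space:.1e}")  # ~ 3e-63
```

A blind search that can touch only ~1 in 10^63 of the space has no credible chance of finding deeply isolated functional configs.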

There is but one empirically confirmed means of getting to FSCO/I, namely design.

In short, on evidence, the info-gap between pond state and cell state, per the value of FSCO/I as sign, is best explained as bridged by design that feeds in the missing info and, through intelligently directed organising work [IDOW], creates in this case a self-replicating micro-level molecular nanotech factory. That self-replication uses an information- and organisation-rich vNSR, and allows a new order of entity, the living cell, to dominate the situation.

So, it is vital for us to understand at the outset of discussion that the entropy in a thermodynamic system is a metric of missing information on the microstate, given the number of microstate possibilities consistent with the macro-observable state. That is, entropy measures the MmIG.

Where also, the living cell is in a macro-observable state that, initially and from generation to generation [via the vNSR in algorithmically controlled action on coded information], drastically locks down the number of possible states relative to pond state. The debate on OOL, then, is about whether it is credible, on evidence observed in the here and now, for pond state to go to cell-state via nature operating freely and without IDOW. (We know that IDOW routinely creates FSCO/I, a dominant characteristic of living cells.)

A common argument is that raw injection of energy suffices to bridge the info-gap without IDOW, as energy and materials flows allow escape from “entropy increases in isolated systems.” What advocates of this do not usually disclose is that raw injection of energy tends to go to heat, i.e. to a dramatic rise in the number of possible configs, given the combinatorial possibilities of so many lumps of energy dispersed across so many mass-particles. That is, the MmIG will strongly tend to RISE on heating. Where also, for instance, spontaneously ordered systems like hurricanes are not based on FSCO/I, but on the mechanical necessities of Coriolis forces acting on large masses of air moving under convection on a rotating spherical body.
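To see how strongly, compare the heat-driven growth of the MmIG with anything information-like (my arithmetic, from dS = Q/T):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def extra_missing_bits(Q_joules, T_kelvin):
    """Raw heat Q injected at temperature T raises entropy by dS = Q/T,
    i.e. multiplies the accessible microstate count by exp(dS/k_B).
    In informational terms the MmIG grows by dS / (k_B ln 2) bits."""
    return (Q_joules / T_kelvin) / (k_B * math.log(2))

# One joule of undirected heat into a room-temperature system:
print(f"{extra_missing_bits(1.0, 300.0):.2e} bits")  # ~ 3.5e20 bits
```

That is, a single joule of raw heat opens up some 10^20 bits of additional missing micro-level information, the opposite of the specific, functional constraint needed.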

(Cf. my discussion here on; remember, I came to design theory by way of examining thermodynamics-linked issues. We need to understand and visualise, step by step, what is going on behind the curtain of serried ranks of algebraic, symbolic expressions and forays into calculus, partial differential equations, etc. Otherwise, we are liable to miss the forest for the trees. Or, the old Wizard of Oz can lead us astray.)

A good picture of the challenge was posed by Shapiro in Scientific American, challenging the dominant genes-first school of thought in words that also apply to his own metabolism-first thinking:

RNA’s building blocks, nucleotides, are complex substances as organic molecules go. They each contain a sugar, a phosphate and one of four nitrogen-containing bases as sub-subunits. Thus, each RNA nucleotide contains 9 or 10 carbon atoms, numerous nitrogen and oxygen atoms and the phosphate group, all connected in a precise three-dimensional pattern. Many alternative ways exist for making those connections, yielding thousands of plausible nucleotides that could readily join in place of the standard ones but that are not represented in RNA. That number is itself dwarfed by the hundreds of thousands to millions of stable organic molecules of similar size that are not nucleotides [–> and he goes on, with the issue of assembling component monomers into functional polymers and organising them into working structures lurking in the background] . . . .

[–> Then, he flourishes, on the notion of getting organisation without IDOW, merely on opening up the system:] The analogy that comes to mind is that of a golfer, who having played a golf ball through an 18-hole course, then assumed that the ball could also play itself around the course in his absence. He had demonstrated the possibility of the event; it was only necessary to presume that some combination of natural forces (earthquakes, winds, tornadoes and floods, for example) could produce the same result, given enough time. No physical law need be broken for spontaneous RNA formation to happen, but the chances against it are so immense, that the suggestion implies that the non-living world had an innate desire to generate RNA. The majority of origin-of-life scientists who still support the RNA-first theory either accept this concept (implicitly, if not explicitly) or feel that the immensely unfavorable odds were simply overcome by good luck.

Orgel’s reply, in a posthumous paper, is equally revealing on the escape-from-IDOW problem:

If complex cycles analogous to metabolic cycles could have operated on the primitive Earth, before the appearance of enzymes or other informational polymers, many of the obstacles to the construction of a plausible scenario for the origin of life would disappear . . . Could a nonenzymatic “metabolic cycle” have made such compounds available in sufficient purity to facilitate the appearance of a replicating informational polymer?

It must be recognized that assessment of the feasibility of any particular proposed prebiotic cycle must depend on arguments about chemical plausibility, rather than on a decision about logical possibility . . . few would believe that any assembly of minerals on the primitive Earth is likely to have promoted these syntheses in significant yield. Each proposed metabolic cycle, therefore, must be evaluated in terms of the efficiencies and specificities that would be required of its hypothetical catalysts in order for the cycle to persist. Then arguments based on experimental evidence or chemical plausibility can be used to assess the likelihood that a family of catalysts that is adequate for maintaining the cycle could have existed on the primitive Earth . . . .

Why should one believe that an ensemble of minerals that are capable of catalyzing each of the many steps of [for instance] the reverse citric acid cycle was present anywhere on the primitive Earth [8], or that the cycle mysteriously organized itself topographically on a metal sulfide surface [6]? The lack of a supporting background in chemistry is even more evident in proposals that metabolic cycles can evolve to “life-like” complexity. The most serious challenge to proponents of metabolic cycle theories—the problems presented by the lack of specificity of most nonenzymatic catalysts—has, in general, not been appreciated. If it has, it has been ignored. Theories of the origin of life based on metabolic cycles cannot be justified by the inadequacy of competing theories: they must stand on their own . . . .

The prebiotic syntheses that have been investigated experimentally almost always lead to the formation of complex mixtures. Proposed polymer replication schemes are unlikely to succeed except with reasonably pure input monomers. No solution of the origin-of-life problem will be possible until the gap between the two kinds of chemistry is closed. Simplification of product mixtures through the self-organization of organic reaction sequences, whether cyclic or not, would help enormously, as would the discovery of very simple replicating polymers. However, solutions offered by supporters of geneticist or metabolist scenarios that are dependent on “if pigs could fly” hypothetical chemistry are unlikely to help.

So, we have to pull back the curtain and make sure we first understand that the sense in which entropy is linked to information in a thermodynamics context is that we are measuring missing info on the micro-state given the macro-state. So, we should not allow the similarity of mathematics to lead us to think that IDOW is irrelevant to OOL, once a system is opened up to energy and mass flows.

In fact, given the delicacy and unfavourable kinetics and equilibria involved — notice all those catalysing enzymes and ATP energy battery molecules in life? — the challenge of IDOW is the elephant standing in the middle of the room that ever so many are desperate not to speak about.>>

_____________

Okay, a note for the record.

So, no comments here; you are invited to comment in the Sal C thread, here. END

PS: This earlier ID Foundations post may also be helpful.