
Just Too Simple


For me, the real argument for intelligent design has always been extremely simple, and doesn’t require any advanced mathematics or microbiology to grasp. The video below makes this argument in the simplest, clearest way I can make it. My uncle Harry and aunt Martha like the video, and can’t understand why so many intelligent scientists aren’t impressed by this very simple argument.

Of course, the problem is that the argument is just too simple. Most scientists aren't interested in arguments that their uncle Harry and aunt Martha can understand; they are looking for arguments that require some advanced technology, that show some understanding of evolutionary theory or microbiology that sets them apart from uncle Harry and aunt Martha. And indeed, most of the important scientific advances in our understanding of our world have required advanced technology and advanced degrees to achieve, but it is the curse of intelligent design that the strongest and clearest arguments are just too simple to get much traction in the scientific world. Of course there are many good arguments for ID being made now which do require advanced technology and advanced degrees to understand, and I'm very grateful for the scientists who are making them: it's clear to me that if ID ever becomes widely accepted in the scientific world, it will be because of their writings, and not because of the simple arguments I am making. If I could figure out a way to use some more advanced mathematics in my arguments, if I could figure out a way to restate the basic point in such a way that uncle Harry and aunt Martha couldn't understand it, I might make some progress. (I don't really have an uncle Harry or an aunt Martha, by the way, but many people do.) Perhaps it would help if I linked to my resume, or to my finite element program, to show that I am capable of doing more advanced mathematics, even if I haven't used any of it in this video.

The arguments for ID which require advanced science to understand are powerful, but never completely definitive: they look at small portions of the picture through a microscope. To make the completely definitive argument you have to step back and look at the big picture, but, alas, then the picture becomes too clear, and too simple.

Added later:

As I expected, a couple of commenters are trying to make the issue more complicated than it is. Rather than try to answer each objection one at a time, I would refer readers to this ENV post, where I point out that every attempt to argue that the spontaneous rearrangement of atoms on a barren planet into computers, books and airplanes does not violate the second law can equally be applied to argue that a tornado running backward, turning rubble into houses and cars, would not violate it either. So unless you are willing to argue that tornados running backward would not violate the second law, don't bother. And even if you are, it is obvious that a tornado running backward would violate some basic law of Nature: if not the second law as formulated by humans, then at least the basic natural principle behind the second law, and what has happened on Earth would clearly violate that same law, whatever it is.

[youtube 259r-iDckjQ]

Comments
Mung: I agree that entropy isn't a physical thing; it's a property of physical things. But I don't agree that thinking of entropy as a thing is a mistake. I think it's a useful intuitive shortcut; a metaphor, if you like. While entropy isn't a thing, it acts a lot like a thing. This means that by thinking of it as a thing, you immediately get a bunch of mostly-correct intuitions about how it behaves.

For example, when heat flows from one place to another, there's an entropy decrease where it came from and an increase where it went to. The technically correct way to describe this is that the entropy decrease is compensated by the (equal or larger) increase, but it's far more intuitive to think of the heat carrying entropy with it from one place to the other.

There are some places where the metaphor falls down, like the deviations from additivity I described in my last comment. The closest I can come to making sense of this in terms of the metaphor is that some of the same entropy is in multiple places, so if you just add the entropies from the various systems you're counting some of the entropy twice (or more). Basically, you need to be ready to throw out your intuition whenever it disagrees with more detailed analysis; but that's true anyway, so it's not really a change.

So, you give up some technical accuracy and a few subtle bad intuitions in exchange for quite a lot of good intuitions. If you understand the physics and math well, it may not be worth the tradeoff. But in most of these discussions most of the participants have no real background in either, so I think it's worth it.Gordon Davisson
January 27, 2013, 08:12 PM PDT
The mistake here is in thinking that entropy is something physical.Mung
January 27, 2013, 04:38 PM PDT
This is going to be rather long, so I'm going to try to break it down by topic as much as I can. Sorry if it's still a bit scattered...

Defending my argument from extensiveness:

Rob Sheldon@45 (note that I'm replying to bits of what Rob said out of order):
There was some nonsense in GD@32,33 about “compensation” and “extensible” entropy. Even an ideal gas has cases when the entropy is not extensible (additive), but certainly coherent systems, systems with long-range forces are demonstrably non-extensible.
"Nonsense"? I beg to differ. While strict additivity only applies to systems with statistically independent microstates (I'm not sure what ideal gasses have to do with this), the deviations from additivity do not weaken the argument I made. In the first place the deviations are too small to matter, in the second place they're in a direction that actually strengthens my argument, and in the third place they don't even apply to Sewell's X-entropies (eq. 3 of his AML paper is strictly extensive). Let me concentrate on the second point. The entropies of statistical mechanics (whether we're talking about Boltzmann's formula, Gibbs' more general formula, or Von Neumann's quantum formula) are what's known as subadditive; that is, the entropy of two systems taken together is always less than or equal to the sum of their individual entropies. That means that the entropy of 200 individuals is at most twice the entropy of 100 individuals. This in turn means that, as far as the second law is concerned, going from 0 individuals to 100 individuals is, if anything, easier than going from 100 individuals to 200 individuals. (Just to clarify what should be obvious: in reality, going from 0 individuals to 100 is much harder than 100 -> 200, especially if the individuals happen to be rabbits. From this, I conclude that the second law is not the relevant limiting factor.) (Also, my parallel between evolution vs. population shift doesn't necessarily work properly when deviations from additivity are significant. No problem, just change it to a parallel between evolution vs. extinction of old species + population growth of new species.) Let me give an example of this deviation from additivity: the entropy of genetic information. To keep the math simple, I'm going to use a highly oversimplified model; I'm trying to illustrate the principle here, not calculate realistic numbers. Let's say there are 1,000 (1e3) possible (genetically distinct) species (I said I wasn't going for realism, right?), and within each species there are 10,000 (1e4) possible genomes an individual might have. Suppose some individual randomly poofs into existance. It could have any of 1e7 possible gemomes (1e3 species * 1e4 genomes within each species), so the genetic contribution to its entropy will be S_g(organism 1) = k * ln(1e7) ~= 16*k. Now, suppose that individual reproduces (asexually, to keep things simple). The new organism will be of the same species as the parent, but have a different (assumed random) genome within the same species. If you look at the offspring by itself, it could also be any of 1e3 species * 1e4 genomes, so its entropy will be the same as its parent: S_g(offspring) = k * ln(1e7) ~= 16*k. But look at the genetic entropy of the two together, by counting the number of possible genomes they could have. Since they'll both be the same species, there's only 1e3 species * 1e4 genomes for the parent * 1e4 organisms for the offspring = 1e11 total possibilitles, so S_g(organism 1 + offspring) = k * ln(1e11) ~= 25*k. This is k * ln(1e3) ~= 7*k less than the sum of their individual entropies, which is essentially a measure of how correlated their states are. Compare that with what would've happened if the second organism had appeared independently (rather than deriving from organism 1): then the two organisms would be of independent species, so their total entropy would be the sum of their separate entropies, k * ln(1e14) ~= 32*k. So independent appearance of organisms is thermodynamically preferred to reproduction! 
(Again, I'm not saying that organisms poofing into existence is possible, just that under conditions that allow reproduction, the second law doesn't forbid it. As Stephen Lower put it: when thermodynamics says "no", it means exactly that. When it says "yes", it means "maybe".)

(Also, note that adding organisms -- by whatever process -- adds entropy. So, the second law actually favors both evolution and reproduction, right? No, because the absolute entropy of the organisms isn't really what's important; it's the entropy of the organisms relative to the entropy of the raw materials they formed from. That's why, when I was drawing parallels, I chose comparisons where the raw-materials piece cancels out. For example, the amount of additional raw materials needed to go from 0 individuals to 100 is the same as to go from 100 to 200. If you're not careful about this, you can wind up talking complete nonsense. In fact, anyone who blithely talks about "the entropy decrease from evolution" without worrying about this is almost certainly talking nonsense. IMO it's actually much better to think in terms of free energy or negentropy, but if you do that anyone who doesn't know some thermo will have no idea what you're talking about.)
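The toy arithmetic above is easy to reproduce. A minimal Python sketch, using the commenter's own illustrative species and genome counts (none of these are realistic biological numbers; the point is only the subadditivity deficit):

```python
import math

N_SPECIES = 1e3   # possible genetically distinct species (toy number from the comment)
N_GENOMES = 1e4   # possible genomes within each species (toy number)

def S_over_k(n_states):
    """Entropy in units of Boltzmann's k for n_states equally probable states."""
    return math.log(n_states)

# One organism appearing at random: any of 1e3 * 1e4 = 1e7 possible genomes.
S_one = S_over_k(N_SPECIES * N_GENOMES)                       # ~16.1 k

# Parent + offspring (same species forced): 1e3 * 1e4 * 1e4 = 1e11 joint possibilities.
S_parent_plus_offspring = S_over_k(N_SPECIES * N_GENOMES**2)  # ~25.3 k

# Two independently appearing organisms: (1e7)^2 = 1e14 joint possibilities.
S_two_independent = S_over_k((N_SPECIES * N_GENOMES) ** 2)    # ~32.2 k

print(f"S(one organism)       ~ {S_one:.1f} k")
print(f"S(parent + offspring) ~ {S_parent_plus_offspring:.1f} k")
print(f"S(two independent)    ~ {S_two_independent:.1f} k")
print(f"subadditivity deficit ~ {2*S_one - S_parent_plus_offspring:.1f} k  (= ln(1e3))")
```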
There’s even a large coterie of physicists proposing “Tsallis non-extensible entropy” as the solution to life, evolution, and the philosopher’s stone.
I'm not significantly familiar with Tsallis entropy, and I've never heard of anyone relating it to evolution. Can you give me a pointer to this "large coterie"? In any case, I'm pretty sure it's also subadditive, so what I said above goes for it as well.

Is thermodynamics even relevant?
The mere fact that people are not collections of ideal gas atoms, but coherent and "functional" should be strong evidence that applying entropy addition to humans is as wrong as applying Boyle's Law to my pot of caramel bubbling on the stove. In fact, the coherence of all the objects alluded to by Granville Sewell in his paper on computers and jet planes, is precisely the sort of order that cannot be measured by Boltzmann's ideal gas approximations.
You seem to be under the impression that Boltzmann's formula for entropy, S=k*ln(ω), is limited to ideal gasses. If so, you are wrong; it applies to any classical (non-quantum) system with equally probable microstates, whether or not they happen to be ideal gasses. For a classical system with non-equally-probable microstates, use Gibbs' formula, S=-k*sum(p_i*ln(p_i)), instead. Note that Gibbs' formula is a generalization of Boltzmann's: if all of the probabilities (the p_i's) happen to be equal, both formulae give the same result. I'm not very familiar with quantum stat mech, but AIUI the relevant formula there is Von Neumann's, S=-Tr(ρ*ln(ρ)), which is (other than a factor of k) a generalization of Gibbs' formula.

If you deny that these formulae are relevant to what Sewell is talking about, you're essentially denying that Sewell's argument is based on thermodynamics (well, stat mech anyway). You can't have it both ways: either Sewell's argument is based on well-established thermodynamics (in which case it's wrong), or it's not based on well-established thermodynamics (in which case he's being dishonest to claim the backing of thermodynamics for his argument).
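A small numerical sketch of the relationship just described, with an arbitrary made-up probability distribution: Gibbs' formula reduces to Boltzmann's when the probabilities are equal, and gives a smaller value otherwise.

```python
import math

def gibbs_entropy_over_k(probs):
    """Gibbs entropy S/k = -sum(p_i * ln p_i) for a classical probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy_over_k(n_microstates):
    """Boltzmann entropy S/k = ln(omega) for equally probable microstates."""
    return math.log(n_microstates)

omega = 8
equal = [1.0 / omega] * omega
skewed = [0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01]  # arbitrary unequal probabilities

print(boltzmann_entropy_over_k(omega))   # ln 8 ~ 2.079
print(gibbs_entropy_over_k(equal))       # same value: Gibbs reduces to Boltzmann
print(gibbs_entropy_over_k(skewed))      # smaller: unequal probabilities lower the entropy
```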
Doing the math: Earth's entropy flux vs. entropy decrease needed for evolution

(from earlier in Rob's message:)
All this to say, that Granville is completely correct when he says that the increase in entropy of the Sun is never shown to come even close to explaining the decrease of entropy on the earth. The conversion constants are just not known. In principle, they could be known, but in practice we are far, far from even a rough estimate.
I think you've lost track of the burden of proof here. If anyone wants to use thermodynamics to argue against evolution, the burden of proof is on them to show that there's a conflict. If the conversion constants aren't known, that makes your (and Sewell's) argument difficult, not mine. However, the relevant constants are known (it's the state counting that's hard), and the claim that there's a conflict between evolution and thermo has been refuted in several different ways. In addition to my argument from extensiveness (or subadditivity, if you want to be picky), it's also been done by directly estimating the relevant entropies.

Have you read Emory Bunn's article "Evolution and the second law of thermodynamics" (Am. J. Phys. 77 (2009), 922–925)? It's mostly a re-do of the argument Daniel Styer made in an earlier paper, except that Styer made some serious mistakes; Bunn corrects these (although he also makes at least one minor mistake himself, see my earlier comment). Sewell has criticized these arguments, but as far as I can see his criticisms completely miss the mark (again, see my linked comment).

Note that neither Styer nor Bunn nor I claim that the Sun is increasing in entropy (I'm pretty sure it's decreasing), let alone that it's compensating for entropy decreases on Earth. The Earth actually receives entropy from the Sun (in the form of thermal radiation), and dumps its waste entropy to deep space (again, in the form of thermal radiation). The flux is tricky to calculate, and (I claim) both Styer and Bunn get it wrong. I didn't exactly do it right either, but I'm pretty confident I got a safe lower bound of 3.3e14 J/K per second for the net entropy flux leaving Earth. Can you find any error in my analysis in the linked comment?

But how much entropy decrease is needed for evolution? Bunn gives an upper limit of 1e44*k = 1.4e21 J/K, which is less than two months' flux (based on my calculation). I'm no biochemist, but his calculation here looks roughly reasonable. I do have one semi-objection to it, though: it looks at the total entropy difference between all the organisms on Earth vs the same matter in its simplest molecular form. In other words, he's counting up the entropy decrease needed for evolution and reproduction and growth etc., and, as my extensiveness argument implies, most of that entropy decrease is due to reproduction and growth, not evolution. On the other hand, the entropy decreases implied by reproduction and growth are coming out of the same available entropy budget (and the breakdown is hard to even define, let alone calculate), so he's not actually wrong... just less specific than I'd like.

Do you see any problems with Bunn's analysis? As you said earlier, it's hard to calculate this accurately. But Bunn is only trying for an upper bound, and his calculation would have to be off by a factor of over a billion for there to be a problem with evolution, so unless you see something seriously wrong, I'd say it's good enough to prove his point. Even more to the point, do you have any analysis that shows there isn't enough entropy flux? Because if you don't, I don't see how you can make a thermodynamic case against evolution.
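The "less than two months' flux" figure above is a one-line calculation. In this sketch, the 3.3e14 J/K per second flux is the commenter's own lower-bound estimate and 1e44*k is Bunn's published upper bound; the code only checks the division.

```python
k_B = 1.380649e-23           # Boltzmann constant, J/K

entropy_flux = 3.3e14        # J/K per second leaving Earth (commenter's lower-bound estimate)
bunn_bound   = 1e44 * k_B    # Bunn's upper bound on the entropy decrease needed, ~1.4e21 J/K

seconds_needed = bunn_bound / entropy_flux
print(f"Bunn's bound: {bunn_bound:.2e} J/K")
print(f"Flux time needed: {seconds_needed:.2e} s ~ {seconds_needed / 86400:.0f} days")
# ~4.2e6 s, i.e. roughly 48 days of flux -- under two months, so the bound would have
# to be off by a very large factor before the available flux became a constraint.
```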
Boltzmann & Shannon both used “order” or permutations in their definition of entropy. Clausius, Maxwell etc, used heat and temperature. Boltzmann’s “ansatz” was to connect the two definitions with the eponymous constant. Landauer repeats the Boltzmann “ansatz” using computer bits instead of permutations, and several published authors have claimed to validate or invalidate Landauer’s ansatz. Personally I think Landauer was using the “ideal gas” estimate of Boltzmann when he reused “k” for his “energy per entropy bit”, since the revolution in electronics today is storing information in the “spin” of an electron, or what is now called “spintronics”. Thus I don’t believe Landauer is even remotely close to the amount of energy per entropy bit of memory. That is, I don’t think the principle is false, but the “energy/bit” conversion factors of both Boltzmann and Landauer are undoubtedly wrong for modern information storage.
"Energy/bit" conversion factors don't come into it unless you're converting entropy to/from thermal form (in which case the temperature determines the conversion factor). If you think in terms of entropy, and apply the relevant formula (Boltzmann et al), the conversion factors for different kinds of entropy become pretty obvious. (Although as I said, the state-counting can be quite difficult.) (That, and the fact that most systems' phase spaces don't factor cleanly, which means their entropy can't be cleanly split into different components -- I'll get back to this point.)
Until we get a better theory than Boltzmann’s permutation to ideal gas law, we are probably wise to use Granville’s suggestion of conserving entropy separately for each inconvertible quantity.
How on earth can it be wise to use something we know is seriously wrong? If entropy were conserved separately for each type of entropy, no gas could ever be compressed, or condensed into a liquid, or frozen to a solid (since all of these convert configurational entropy to thermal) (with a caveat I'll get to in a bit). Sediment settling to the bottom of a lake violates this separate conservation idea, as does a huge amount of biochemistry. My extensiveness argument means that it "rules out" biological reproduction and growth along with evolution. This is NOT a wise thing to assume. It's certainly not a wise thing to base an argument against evolution on. All anyone has to do to refute you is to point out that your argument is based on a false premise, and you're done.

Now, about that caveat: splitting entropy into different types (e.g. thermal vs. configurational) is only really possible in certain situations, most notably in a classical ideal gas (finally, something that's actually restricted to them!). In real gasses and solids and especially liquids, the thermal and configurational degrees of freedom aren't independent (or even clearly defined), so you can't cleanly split the total entropy into different types. So when I say that condensation converts configurational entropy into thermal, I'm really just making vague gestures about quantities that aren't actually well-defined.

Does that caveat help Sewell's case? No, for two reasons. First, because if the various types of entropy aren't well-defined, his claim is something even worse than wrong: it's meaningless. And second, because while his X-entropies bear a superficial similarity to the configurational entropies for specific elements, they aren't actually the same thing (see olegt's comments starting here). This means that when e.g. gaseous nitrogen condenses into a liquid, its nitrogen-entropy is decreasing, but it's not being converted into anything else, it's just decreasing; and this does not conflict with the second law because the second law doesn't apply to nitrogen-entropy at all.Gordon Davisson
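For the one case where the split into "types" is clean (the classical ideal gas mentioned above), the conversion can be made concrete: compress the gas isothermally and reversibly, and its configurational entropy decrease shows up as an equal thermal entropy increase in the surroundings, which is already incompatible with conserving each "type" separately. A minimal sketch with illustrative numbers:

```python
import math

R = 8.314462618   # gas constant, J/(mol*K)

n = 1.0           # moles of ideal gas (illustrative)
T = 300.0         # K, held constant
V1, V2 = 2.0, 1.0 # compress to half the volume

dS_gas = n * R * math.log(V2 / V1)           # configurational entropy change of the gas (< 0)
Q_released = -n * R * T * math.log(V2 / V1)  # heat dumped into the surroundings (isothermal, reversible)
dS_surroundings = Q_released / T             # thermal entropy gained by the surroundings (> 0)

print(f"dS(gas)          = {dS_gas:+.2f} J/K")                     # about -5.76 J/K
print(f"dS(surroundings) = {dS_surroundings:+.2f} J/K")            # about +5.76 J/K
print(f"dS(total)        = {dS_gas + dS_surroundings:+.2f} J/K")   # 0 in the reversible limit
```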
January 27, 2013, 03:48 PM PDT
Is it just me, or is the concept of a "backwards running process" incoherent?Mung
January 26, 2013, 12:27 PM PDT
"So what is it, precisely, that there is more of? Is it a physical substance? What’s it made of? How does it come to be that whenever anything happens in the universe, there is an increase in the number of microstates in the universe?"[sic] I've been lead to believe that what is increased when ever we read or hear that "entropy is increased", can be heat, non-heat energy, or some combination. The "amount" of either would be commensurate with the the amount of energy expended in performing work of some sort. I don't think that this is a very complete understanding, as my days of studying "mechanics" at university are definitely in the past. Could you use Joules as the scale to measure the physical quantity? I think the answer is yes. I don't know that it is the only applicable scale however. I think it would be interesting to determine that the "physical substance" of which there is "more of" ,as you put it, whenever "anything happens in the universe"[sic] corresponded with an amount of increase in the quantity of either "dark matter" or "dark energy" in the universe. Though, I've recently read that a number of physicists consider that "dark matter" and "dark energy" are likely the same. Such that whenever "anything happens"[sic] the expansion rate of the universe grows by an amount commensurate with the increase in entropy. I don't suspect that will be the case, but it is an interesting notion none-the-less. Don't you think?ciphertext
January 16, 2013, 12:33 PM PDT
Thanks Collin.
When anything ever happens in the universe, the net effect is that there is more entropy in the universe itself. (1:25)
So what is it, precisely, that there is more of? Is it a physical substance? What's it made of? How does it come to be that whenever anything happens in the universe, there is an increase in the number of microstates in the universe?Mung
January 11, 2013, 03:55 PM PDT
Lecture on entropy and the 2nd law. http://www.khanacademy.org/science/physics/thermodynamics/v/entropy-intuition Collin
January 11, 2013, 12:54 PM PDT
A couple more simple questions: Would you want your teenager going off to university thinking that entropy and disorder were the same, or that entropy and disorder were inversely proportional? Would you want your teenager going off to university thinking that Shannon entropy was measured in joules? If a parent goes into the ordered room of the teenager and tosses it into a mess, does it take more entropy or less entropy than it took for the teen to order it?Mung
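On the units question raised here: Shannon entropy comes out in bits (dimensionless), and it only acquires joules per kelvin if one deliberately multiplies by Boltzmann's constant times ln 2, as in Landauer-style arguments. A small sketch with a made-up symbol distribution:

```python
import math

k_B = 1.380649e-23  # J/K

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

message_probs = [0.5, 0.25, 0.125, 0.125]     # illustrative symbol distribution
H_bits = shannon_entropy_bits(message_probs)  # 1.75 bits

# Converting to thermodynamic units is a deliberate choice, not automatic:
S_joules_per_kelvin = H_bits * k_B * math.log(2)

print(f"H = {H_bits} bits")
print(f"k_B * ln2 * H = {S_joules_per_kelvin:.2e} J/K  (meaningful only if the bits label physical microstates)")
```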
January 10, 2013, 08:10 PM PDT
KF@42 (Rob Sheldon here.) Thanks for remembering my comment. It is exactly what is going on in this comment thread. M@12 suggests some sort of incommensurate entropies, which he elaborates in #14, claiming that GS has mixed up his definitions, and that "order" is the wrong definition.

Boltzmann & Shannon both used "order" or permutations in their definition of entropy. Clausius, Maxwell etc, used heat and temperature. Boltzmann's "ansatz" was to connect the two definitions with the eponymous constant. Landauer repeats the Boltzmann "ansatz" using computer bits instead of permutations, and several published authors have claimed to validate or invalidate Landauer's ansatz. Personally I think Landauer was using the "ideal gas" estimate of Boltzmann when he reused "k" for his "energy per entropy bit", since the revolution in electronics today is storing information in the "spin" of an electron, or what is now called "spintronics". Thus I don't believe Landauer is even remotely close to the amount of energy per entropy bit of memory. That is, I don't think the principle is false, but the "energy/bit" conversion factors of both Boltzmann and Landauer are undoubtedly wrong for modern information storage.

All this to say, that Granville is completely correct when he says that the increase in entropy of the Sun is never shown to come even close to explaining the decrease of entropy on the earth. The conversion constants are just not known. In principle, they could be known, but in practice we are far, far from even a rough estimate. Until we get a better theory than Boltzmann's permutation to ideal gas law, we are probably wise to use Granville's suggestion of conserving entropy separately for each inconvertible quantity.

There was some nonsense in GD@32,33 about "compensation" and "extensible" entropy. Even an ideal gas has cases when the entropy is not extensible (additive), but certainly coherent systems, systems with long-range forces are demonstrably non-extensible. There's even a large coterie of physicists proposing "Tsallis non-extensible entropy" as the solution to life, evolution, and the philosopher's stone. The mere fact that people are not collections of ideal gas atoms, but coherent and "functional" should be strong evidence that applying entropy addition to humans is as wrong as applying Boyle's Law to my pot of caramel bubbling on the stove. In fact, the coherence of all the objects alluded to by Granville Sewell in his paper on computers and jet planes, is precisely the sort of order that cannot be measured by Boltzmann's ideal gas approximations. You are free to propose your own favorite conversion between ideal gas entropy and designed artifacts, but I would venture a guess that it won't hold up to experiment very long.

Granville's common sense is a whole lot more profound than Wikipedia and an introductory physics text, and "compensation" remains an almost completely metaphysical belief.Robert Sheldon
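For reference, the Landauer figure for "energy per entropy bit" discussed above is just kT ln 2 at the temperature where the erasure happens (about 3e-21 J at room temperature), while the corresponding entropy per bit, k ln 2, carries no technology dependence at all. Whether that is the right number for spintronic memory is the point in dispute here; the calculation itself is standard.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T_room = 300.0       # K, assumed room temperature

landauer_energy_per_bit = k_B * T_room * math.log(2)
entropy_per_bit = k_B * math.log(2)

print(f"Landauer limit at {T_room:.0f} K: {landauer_energy_per_bit:.2e} J per erased bit")  # ~2.87e-21 J
print(f"Entropy per bit: {entropy_per_bit:.2e} J/K")                                        # ~9.57e-24 J/K
```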
January 10, 2013, 11:25 AM PDT
But sir, that's our Primordial Soup. We see the spontaneous generation of flies from our Primordial Soup all the time.Mung
January 8, 2013, 09:55 AM PDT
Per your #19, BA: '“In all,” argue Tompa and Rose, “an average protein would have approximately 3540 distinguishable interfaces,” and if one uses this number for the interactome space calculation, the result is 10 followed by the exponent 7.9 x 10^10.,,, the numbers preclude formation of a functional interactome (of ‘simple’ life) by trial and error,, within any meaningful span of time. This numerical exercise…is tantamount to a proof that the cell does not organize by random collisions of its interacting constituents. (i.e. that life did not arise, nor operate, by chance!)' Don't be nasty, BA... And you a Christian. Shame on you! Next you'll be saying there was already a fly in the soup.Axel
January 8, 2013, 05:20 AM PDT
Gordon, you state: "just because something is thermodynamically allowed, does not mean that it's actually possible; it just means that it's not thermo that forbids it." Okie Dokie, my bad for not catching that caveat.,, But Dr. Sewell hasn't ever said that thermo forbids replication or the origin of life, has he? He has, to the best of my knowledge, said that thermo makes the origin of life and 'vertical' evolution extremely unlikely.

notes:

Physicist Rob Sheldon offers some thoughts on Sal Cordova vs. Granville Sewell on 2nd Law Thermo - July 2012
Excerpt: The Equivalence: Boltzmann's famous equation (and engraved on his tombstone) S = k ln W, merely is an exchange rate conversion. If W is lira, and S is dollars, then k ln() is the conversion of the one to the other, which is empirically determined. Boltzmann's constant "k" is a semi-empirical conversion number that made Gibbs "stat mech" definition work with the earlier "thermo" definition of Lord Kelvin and co. Despite this being something as simple as a conversion factor, you must realize how important it was to connect these two. When Einstein connected mass to energy with E = (c²)m, we can now talk about mass-energy conservation, atom bombs and baby universes, whereas before Einstein they were totally different quantities. Likewise, by connecting the two things, thermodynamics and statistical mechanics, then the hard rules derived from thermo can now be applied to statistics of counting permutations. This is where Granville derives the potency of his argument, since a living organism certainly shows unusual permutations of the atoms, and thus has stat mech entropy that via Boltzmann, must obey the 2nd law. If life violates this, then it must not be lawfully possible for evolution to happen (without an input of work or information.) The one remaining problem, is how to calculate it precisely.
https://uncommondescent.com/intelligent-design/physicist-rob-sheldon-offers-some-thoughts-on-sal-cordova-vs-granville-sewell-on-2nd-law-thermo/

"Klimontovich's S-theorem, an analogue of Boltzmann's entropy for open systems, explains why the further an open system gets from the equilibrium, the less entropy becomes. So entropy-wise, in open systems there is nothing wrong about the Second Law. S-theorem demonstrates that spontaneous emergence of regular structures in a continuum is possible.,,, The hard bit though is emergence of cybernetic control (which is assumed by self-organisation theories and which has not been observed anywhere yet). In contrast to the assumptions, observations suggest that between Regularity and Cybernetic Systems there is a vast Cut which cannot be crossed spontaneously. In practice, it can be crossed by intelligent integration and guidance of systems through a sequence of states towards better utility. No observations exist that would warrant a guess that apart from intelligence it can be done by anything else."
Eugene S – UD Blogger
https://uncommondescent.com/genetics/id-foundations-15c-a-faq-on-front-loading-thanks-to-genomicus/comment-page-1/#comment-418185
bornagain77
January 8, 2013, 04:10 AM PDT
'The argument from Darwinists that pouring raw energy into an open system makes evolution inevitable is simply ‘not even wrong’ as an argument. Raw energy destroys rather than builds functional complexity:' Bornagain, re your #21, the Darwinists probably inadvertently omitted, ''n' stuff' ... 'pouring raw energy 'n' stuff'. That would presumably cover the required control and direction agency. And they're really IDers, who've kind of lost their way. On the other hand, they could just be incorrigible dolts. I wonder which?Axel
January 7, 2013, 11:53 PM PDT
Gordon Davisson @32:
That means that the total entropy change involved in going from 200 individuals of species A + 100 individuals of species B to 100 A’s and 200 B’s, is the same as the entropy change in going from 100 A’s to 100 B’s. So if the boundary conditions of Earth allow the entropy change required for a population shift, they also allow for the entropy change required for one species to evolve into another.
Hmmm. Very interesting thought. It's late for me so I think I'll sleep on that one tonight.Eric Anderson
January 7, 2013, 11:14 PM PDT
Granville Sewell @31:
. . . the alternative is that the four unintelligent forces of physics alone must have rearranged the fundamental particles of physics into books, computers, cars, trucks and airplanes. You don’t even need to discuss the second law at all.
Agreed.Eric Anderson
January 7, 2013, 11:09 PM PDT
ba77:
If 1.No replication of biological life possible then Gordon graciously grants 2.No Origination of biological life possible
No, my argument only addresses what is allowed and forbidden by the second law. As I said at the end of #32: just because something is thermodynamically allowed, does not mean that it’s actually possible; it just means that it’s not thermo that forbids it.
Thus 3.Only falsification Gordon will accept as correct is if biological Gordon did not exist.
Again, no. In the first place, this is the opposite of what you said earlier ("if your argument was correct you would not be here to make the argument"). In the second place, I would accept someone pointing out an error in my thermodynamics or reasoning (provided it actually was an error). In the third place, while my existence does pretty much confirm that reproduction is possible (and hence thermodynamically allowed), it neither confirms nor refutes the parallel I drew between reproduction and the origin of life. (BTW, I should probably note that I did skip a few details when I drew the parallel. For one thing, I didn't take individual variation into account [e.g. larger individuals will tend to have more entropy]. For another, I didn't take the information-theoretic contribution to total entropy into account. But I don't see any way that either of these invalidates my argument, they just complicate it.)
i.e. mighty big of you!
It's not a question of graciousness or pettiness, it's a question of getting the physics and logic right.Gordon Davisson
January 7, 2013, 07:55 PM PDT
What is entropy made of? Numbers?Mung
January 7, 2013, 07:37 PM PDT
"I have no idea how that follows from my argument. Can you explain your reasoning?" If 1.No replication of biological life possible then Gordon graciously grants 2.No Origination of biological life possible Thus 3.Only falsification Gordon will accept as correct is if biological Gordon did not exist. i.e. mighty big of you!bornagain77
January 7, 2013, 06:05 PM PDT
,,, But did you happen to notice that if your argument was correct you would not be here to make the argument?
I have no idea how that follows from my argument. Can you explain your reasoning?Gordon Davisson
January 7, 2013, 05:48 PM PDT
as to: 'So if the boundary conditions of Earth allow population growth, they also allow population origination.' So you hold that if the boundary conditions didn't allow for biological life to replicate then you would then grant that the boundary conditions would prevent the origination of life? Mighty big of you! ,,, But did you happen to notice that if your argument was correct you would not be here to make the argument?bornagain77
January 7, 2013, 04:50 PM PDT
Dr. Sewell @28:
Ever since I showed how silly this compensation argument is (primarily here) [...]
You haven't shown that compensation is silly; in fact, your AML paper actually shows compensation happening. For example, anywhere ∇•J is positive, you'll get a decrease in the local entropy density (compensated by an increase elsewhere). Similarly, if the right-hand side of inequality #5 is negative, you have an entropy increase outside the system, which allows (i.e. can compensate for) an entropy decrease inside the system (the left-hand side of inequality #5). Compensation is entirely real. Compensation happens anytime you have a heat/matter/etc flow from one place to another. You can think of this as entropy flowing from one place to another (along with the heat/matter/whatever), but that's just a different way of describing the same thing. To the extent that your argument depends on rejecting compensation, your argument depends on rejecting reality.
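The textbook version of "compensation" described here is just heat flow between two reservoirs: the hot side loses entropy, the cold side gains more, and the total never decreases. A minimal numeric sketch with arbitrary illustrative temperatures and heat:

```python
Q = 1000.0      # J of heat flowing from hot to cold (illustrative)
T_hot = 500.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot     # entropy decrease of the hot reservoir: -2.00 J/K
dS_cold = +Q / T_cold   # entropy increase of the cold reservoir: +3.33 J/K
dS_total = dS_hot + dS_cold

print(f"dS_hot = {dS_hot:.2f} J/K, dS_cold = {dS_cold:.2f} J/K, dS_total = {dS_total:+.2f} J/K")
# The local decrease is allowed because it is more than compensated elsewhere;
# reverse the flow (cold to hot) and dS_total goes negative, which the second law forbids.
```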
[...] I can’t seem to find anyone who thinks the second law has anything to do with tornados or evolution or other unquantifiable applications, and people like Eric Anderson seem to imply I’m the only person who ever thought it did.
When you find that everyone else disagrees with you, you really should consider that maybe you're wrong and everyone else is right. (It doesn't necessarily mean that you are wrong, but you should at least consider the possibility.) Especially when the best argument you can muster for your view amounts to "well, I can't actually do the math, but it seems intuitively obvious that..."Gordon Davisson
January 7, 2013, 04:45 PM PDT
Eric Anderson @25:
Interesting thoughts. Let's assume for a moment that you are correct that the 2nd Law applies to informational entropy and that the entropy can be measured in the Shannon sense. What this suggests to me is that the 2nd Law is not really the place on which to focus our attention, because Shannon entropy is largely irrelevant to what we are interested in when we discuss design (to wit, a highly meaningful and functional sequence of 1's and 0's can have the same Shannon "information" as the same 1's and 0's mixed up in a meaningless jumble). (There have been myriad prior UD threads regarding Shannon information.)
I'd agree with this, but...
Thus, Dr. Sewell’s focus on the 2nd Law seems to be, at best, tangentially related to the kind of functional complex specified information we are interested in for purposes of design. And, unfortunately, focusing on this Shannon kind of “information” also leads to unfruitful discussion of words like “order” and “disorder” (as already seen on this thread).
At least as I understand it, Dr. Sewell's argument doesn't have anything to do with Shannon entropy. In fact, he seems to reject any connection between Shannon entropy and thermal entropy -- in his paper, "Poker Entropy and the Theory of Compensation" (mentioned here, although the link to the paper seems to be dead), he rejects as nonsense the idea that "poker entropy" (which is actually an instance of Shannon entropy) should have anything to do with thermal entropy.

Sewell's argument instead relates to X-entropy, where X is carbon or something like that (he never actually says which he thinks are relevant), and suffers from the fundamental problem that the second law doesn't apply to different types of entropy separately, but only to the total. Since there's a huge amount of thermal entropy leaving Earth (see my calculation of the entropy flux here), the second law allows that (for example) a huge amount of carbon-entropy could be being converted to thermal entropy, and then leaving Earth in that form. (Actually, I'm pretty sure the actual rate of entropy change of Earth is quite small, and that the huge amount of entropy leaving Earth is mostly cancelled by a similarly huge rate of entropy being produced on Earth. But the second law doesn't require this -- the second law doesn't say anything about the rate of entropy production, only that the rate of entropy destruction is zero.)

BTW, there's another way to approach the conclusion that Earth's boundary conditions are sufficient to allow evolution and/or the origin of life: entropy is what's known in the biz as an extensive quantity, meaning that it's proportional to the amount of stuff we're talking about. For instance, two gallons of water has (other things being equal) twice the entropy of a single gallon of water. Similarly, the entropy of 200 individuals of species A is twice the entropy of 100 individuals of species A. That means that the total entropy change involved in going from 200 individuals of species A + 100 individuals of species B to 100 A's and 200 B's, is the same as the entropy change in going from 100 A's to 100 B's. So if the boundary conditions of Earth allow the entropy change required for a population shift, they also allow for the entropy change required for one species to evolve into another.

The same argument applies to the origin of life as well. The entropy change for species A to expand from 100 individuals to 200 individuals is the same as the entropy change for a population of 100 individuals to emerge from ... 0 individuals. So if the boundary conditions of Earth allow population growth, they also allow population origination. (Now, I should clarify that just because something is thermodynamically allowed, does not mean that it's actually possible; it just means that it's not thermo that forbids it. Which means that thermo -- like Shannon entropy -- is irrelevant to ID.)Gordon Davisson
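The extensivity claim above ("two gallons of water has twice the entropy of one gallon") can be checked against the standard Sackur-Tetrode formula for a monatomic ideal gas: doubling the amount of gas at the same density and temperature exactly doubles the entropy. This is only a sanity check of the extensive-quantity point, with illustrative numbers, not a model of organisms.

```python
import math

k_B = 1.380649e-23    # J/K
h   = 6.62607015e-34  # J*s
m_He = 6.6464731e-27  # kg, mass of a helium-4 atom

def sackur_tetrode(N, V, T, m=m_He):
    """Entropy (J/K) of a monatomic ideal gas via the Sackur-Tetrode equation."""
    lambda_th = h / math.sqrt(2 * math.pi * m * k_B * T)   # thermal de Broglie wavelength
    return N * k_B * (math.log(V / (N * lambda_th**3)) + 2.5)

N = 1e23
V = 0.01      # m^3
T = 300.0     # K

S1 = sackur_tetrode(N, V, T)
S2 = sackur_tetrode(2 * N, 2 * V, T)   # twice as much gas at the same density and temperature

print(f"S(N, V)   = {S1:.3f} J/K")
print(f"S(2N, 2V) = {S2:.3f} J/K  -> ratio {S2 / S1:.6f} (extensive: exactly 2)")
```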
January 7, 2013, 04:40 PM PDT
Eric, The main point of the video, made in the last minute or so, is that if you DON'T believe in ID, the alternative is that the four unintelligent forces of physics alone must have rearranged the fundamental particles of physics into books, computers, cars, trucks and airplanes. You don't even need to discuss the second law at all. Once so stated, most people immediately recognize the absurdity of an explanation without ID. Except for scientists, who immediately start looking for other examples of entropy changes where it is more difficult to say what the second law predicts, or for reasons to argue that, technically, the second law was not violated, or... Sigh, it seems to be completely impossible to get scientists to understand a concept this simple.Granville Sewell
January 7, 2013, 03:17 PM PDT
Dr. Sewell: Thank you for your comments. Please don't misunderstand my comments to be an attempt to refute (what I think is) your underlying argument. I am not sure I have a clear enough picture of what is being proposed to make that assessment.

I have said above (and previously on UD) that the "Earth-is-an-open-system" argument is one of the stupidest arguments ever put forward. On an earlier thread I even expressed surprise that you were having to spend so much energy refuting the argument. A moment's reflection by a person of even average intelligence should be adequate to understand that it is an absurd position to take in support of the alleged evolutionary storyline. That said, if there are lots of people still making that argument then, by all means, I am glad that you are continuing your efforts to disabuse them of the notion.

My concern -- or maybe 'concern' is too strong; perhaps 'unease' -- with couching a design discussion in terms of thermodynamics is that understanding thermodynamics may be necessary, but is not sufficient, to understanding the kind of functional complex specified information we see in life. Kind of like demonstrating that gravity is relevant to living systems. Of course it is; but it doesn't tell us much beyond that. As a result, even if someone were to abandon their silly "Earth-is-an-open-system" talking point, it is still extremely easy for them to fall back on the time-worn formula of chance + time = the improbable.

Furthermore, the examples of thermodynamic processes you cite from physics textbooks, while showing that the concept of the 2nd Law can be applied broadly, do not really provide any insight regarding the origin of the underlying systems. Let me give an example of what I mean: Let's say that a building is being constructed and, when nearly completed, a tornado comes through and destroys it (we're not talking actual annihilation of matter here of course, rather the conventional sense of breaking it apart and scattering the components to the wind). This can be viewed as an increase in entropy (decrease in "order"). Fine, as far as it goes. But the tornado could also in the same manner destroy, say, a pile of construction materials near the building site, or even the pile of dirt left from the foundation excavation. Now we could spend a lot of time discussing with people whether the tornado caused an increase in entropy generally, whether that was compensated elsewhere by a decrease somewhere in the universe, whether the system is closed, whether the system is open, and so on. But none of it gets to the real heart of the issue, which is that the building was characterized by functional complex specified information. The fact that a thermodynamic process subsequently acted on a physical item (building, pile of materials, pile of dirt) tells us essentially nothing about whether the thing in question contained functional complex specified information in the first place, and consequently, whether the thing was designed.

Now, one may object and say that the building was originally more "ordered" than the pile of dirt and so, therefore, the tornado caused more disorder in the case of destroying the building. Fine. That is just a weakness of example, not substance. Instead of a pile of dirt, let's propose something highly ordered, like a bed of crystals. Then one might further say, "Yes, but the kind of order we are talking about with the building is different from the kind of order we are talking about in a crystal." To which I respond: "Exactly. Precisely my point."

So ultimately, when the dust clears (either from our tornado or from our discussion) and everyone comes to happy agreement on the relevance of thermodynamic processes and the 2nd Law to the system in question, we are still required to determine -- as an independent inquiry, without the need to invoke the 2nd Law -- whether the thing in question (building, pile of dirt, crystals) contained functional complex specified information or not. And it is these indicia of design that we are most interested in, not whether something is more or less "ordered" or whether something is subject to the grinding, relentless influence of the 2nd Law over time (we can stipulate that every physical system is).

In summary, to the extent that people need to be disabused of their idea that "Earth-is-an-open-system-and-therefore-anything-goes," I think your examples and efforts are valuable and worth pursuing. In terms of getting to an inference of design, I am less optimistic.Eric Anderson
January 7, 2013, 11:18 AM PDT
Hi Granville, Thanks very much for the link to the video and the ENV article in your response (#13 above) to my question. They were very helpful. Thanks again.vjtorley
January 7, 2013, 11:16 AM PDT
Most every general physics text that discusses the second law cites examples of its application that are difficult to quantify, such as a wine glass breaking, books burning, or tornados destroying a town. But they, and most everyone else who discussed the topic, all agreed that while evolution represents a decrease in "entropy", this decrease is "compensated" by increases outside the Earth, hence there is no problem with the second law. Ever since I showed how silly this compensation argument is (primarily here), I can't seem to find anyone who thinks the second law has anything to do with tornados or evolution or other unquantifiable applications, and people like Eric Anderson seem to imply I'm the only person who ever thought it did.Granville Sewell
January 7, 2013, 09:56 AM PDT
Quote of note from preceding: "In fact, our research suggests that these natural PPCs can achieve 'hot and fast' energy transfer – energy flows that prevent complete cooling to the temperature of their surroundings – which has been proposed as a way of improving solar cell efficiency beyond limits currently imposed by thermodynamics." ,,,bornagain77
January 7, 2013, 09:02 AM PDT
Semi OT: Unlocking nature's quantum engineering for efficient solar energy - January 7, 2013 Excerpt: Certain biological systems living in low light environments have unique protein structures for photosynthesis that use quantum dynamics to convert 100% of absorbed light into electrical charge,,, Research from Cambridge's Cavendish Laboratory studying light-harvesting proteins in Green Sulphur Bacteria – which can survive at depths of over 2,000 metres below the surface of the ocean – has found a mechanism in PPCs that helps protect energy from dissipating while travelling through the structure by actually reversing the flow of part of the escaped energy – by reenergising it back to exciton level through molecular vibrations.,,, "Some of the key issues in current solar cell technologies appear to have been elegantly and rigorously solved by the molecular architecture of these PPCs – namely the rapid, lossless transfer of excitons to reaction centres." As Chin points also out, stabilising 'quantum coherence', particularly at ambient temperatures – something the researchers have begun to explore – is an important goal for future quantum-based technologies, from advanced solar cells to quantum computers and nanotechnology. "These biological systems can direct a quantum process, in this case energy transport, in astoundingly subtle and controlled ways – showing remarkable resistance to the aggressive, random background noise of biology and extreme environments. "This new understanding of how to maintain coherence in excitons, and even regenerate it through molecular vibrations, provides a fascinating glimpse into the intricate design solutions – seemingly including quantum engineering – ,,, and which could provide the inspiration for new types of room temperature quantum devices." http://phys.org/news/2013-01-nature-quantum-efficient-solar-energy.html bornagain77
January 7, 2013, 08:59 AM PDT
Gordon Davisson @17: Interesting thoughts. Let's assume for a moment that you are correct that the 2nd Law applies to informational entropy and that the entropy can be measured in the Shannon sense. What this suggests to me is that the 2nd Law is not really the place on which to focus our attention, because Shannon entropy is largely irrelevant to what we are interested in when we discuss design (to wit, a highly meaningful and functional sequence of 1's and 0's can have the same Shannon "information" as the same 1's and 0's mixed up in a meaningless jumble). (There have been myriad prior UD threads regarding Shannon information.) Thus, Dr. Sewell's focus on the 2nd Law seems to be, at best, tangentially related to the kind of functional complex specified information we are interested in for purposes of design. And, unfortunately, focusing on this Shannon kind of "information" also leads to unfruitful discussion of words like "order" and "disorder" (as already seen on this thread). ----- I think Dr. Sewell had put forward some helpful examples of processes that don't come about by chance. I also think he may have some valuable insights into how his examples relate to evolution and design. I'm just not sure yet what those are or how best to articulate them. I'm also not sure that couching his argument in terms of the 2nd Law is the right approach, because -- to date at least -- it has resulted primarily in semantic disagreements, rather than discussion of the underlying substance.Eric Anderson
January 7, 2013, 08:52 AM PDT
Dr. Sewell @15 and Mung @16: As I said, this is largely a semantic exercise (and your comments highlight this fact). Assuming for sake of argument that Dr. Sewell has valid points relating to the systems he is describing, if (i) he insists on describing them in terms of the "Second Law of Thermodynamics" and (ii) his opponents disagree that what he is talking about even relates to the 2nd Law, then there is no common ground for discussion. This is why so many of these discussions result in talking past each other. Incidentally, a couple of months ago I had a profound epiphany that for me brought all this 2nd Law discussion into clear focus. I would like to share with you that epiphany. Unfortunately I've since forgotten what it was! :( EricEric Anderson
January 7, 2013, 08:28 AM PDT
