
Breaking, breaking: ID-friendly math prof Granville Sewell gets apology and damages from journal

Math journal retracted one of our UD authors’ accepted article only because Darwinist blogger complained

Granville Sewell

A brief, lay-friendly look at Sewell’s stifled paper is here. Comment on its significance here.

This just in: Granville Sewell on the controversy.

[This post will remain at the top of the page until 5:00 pm EST. For reader convenience, other coverage continues below. - UD News]

Here, John G. West reports (Evolution News & Views, June 7, 2011) that University of Texas, El Paso math professor Granville Sewell has received an apology and $10,000 because Applied Mathematics Letters withdrew his article on the Second Law of Thermodynamics, just before publication, based on the say-so of a Darwinist blogger:

Witness the brazen censorship earlier this year of an article by University of Texas, El Paso mathematics professor Granville Sewell, author of the book In the Beginning and Other Essays on Intelligent Design. Sewell’s article critical of Neo-Darwinism (“A Second Look at the Second Law”) was both peer-reviewed and accepted for publication by the journal Applied Mathematics Letters. That is, the article was accepted for publication until a Darwinist blogger who describes himself as an “opinionated computer science geek” wrote the journal editor to denounce the article, and the editor decided to pull Sewell’s article in violation of his journal’s own professional standards. 

Here, Discovery Institute lawyer Casey Luskin reflects on the public glee Darwin lobbyists indulged themselves in at that point.

The publisher of Applied Mathematics Letters (Elsevier, the international science publisher) has now agreed to issue a public statement apologizing to Dr. Sewell as well as to pay $10,000 in attorney’s fees.

Sewell’s lawyer Lepiscopo points out that in retracting Sewell’s article, Applied Mathematics Letters “effectively accepted the unsubstantiated word and unsupported opinion of an inconsequential blogger, with little or unknown academic background beyond a self-professed public acknowledgment that he was a ‘computer science grad’ and whose only known writings are self-posted blogs about movies, comics, and fantasy computer games.” This blogger’s unsupported opinion “trumped the views of an author who is a well respected mathematician with a Ph.D. in Mathematics from Purdue University; a fully-tenured Professor of Mathematics at the University of Texas–El Paso; an author of three books on numerical analysis and 40 articles published in respected journals; and a highly sought-after and frequent lecturer world-wide on mathematics and science.”

The journal’s editor even wrote a self-demeaning apology to the blogger for having followed accepted professional standards. And now his journal has issued a public apology to Prof. Sewell instead.

As West suggests, the editor may have feared for his career, considering what happened to Smithsonian journal editor and evolutionary biologist Rick Sternberg, when he was driven out for publishing ID theorist Steve Meyer’s peer-reviewed article on the Cambrian explosion. More.

Some now ask whether, given a string of recent defeats, the Darwin lobby’s tactics are backfiring.


42 Responses to Breaking, breaking: ID-friendly math prof Granville Sewell gets apology and damages from journal

  1. This is all very good but the bottom line is that this article has not been published.

  2. I would like to read the paper. I followed the link to it on the Applied Maths site but the paper was not there.

    Unfortunately, the video format is not for me if the only thing videoed is the paper just being read. I decided to stop watching as soon as I heard statements about the second law of thermodynamics being violated by evolution.

    Generally speaking, any argumentation about the second law being violated is just illiterate. This must never be mentioned as an argument in favor of ID because the reputation of ID will suffer from such argumentation.

    Entropy is a statistical characteristic and therefore may have fluctuations even in a closed system. Integrally though, the second law always holds.

  3. The article is here: http://www.math.utep.edu/Facul.....L_3497.pdf

    It would be a good idea to read the article BEFORE commenting, especially as your concerns are addressed.

  4. Thanks, Kyrilluk. I’ll have a look.

  5. ID “scientists” don’t do science. If they did, then they would be publishing in the scientific journals…

    Yeah…

    Gee, I wonder why there are so few ID scientists publishing in the journals…? Um…….

  6. Well, I have read it. It has some really good insights, notably about the rates of entropy import, for which I am grateful to the author. But I still think that all we can fairly assert is an extremely low probability. Low does not always mean 0. That’s it. Order fluctuations are possible without violating the second law e.g. in highly non-equilibrium systems whereby self-organisation is possible.

    In my opinion, what makes ID remarkable is reasoning about information, not about order. It is information (not order) that is evidence of intelligent agency.

  7. For those who have a hard time following the advanced math of Dr. Sewell, as I do,,,

    video;
    http://www.math.utep.edu/Facul.....ondlaw.htm

    ,,,for establishing ‘boundary conditions’ for open systems, this following video is a bit easier for the average person to understand as to why declaring a system ‘open’ does nothing to alleviate the insurmountable problems that neo-Darwinists have with the second law:

    Evolution Vs. Thermodynamics – Open System Refutation – Thomas Kindell – video
    http://www.youtube.com/watch?v=HoQ-iokM7p0

    ======================

    * The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
    o Sir Arthur Stanley Eddington, The Nature of the Physical World (1928), chapter 4

    * A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Therefore the deep impression that classical thermodynamics made upon me. It is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts.
    o Albert Einstein (author), Paul Arthur, Schilpp (editor). Autobiographical Notes. A Centennial Edition. Open Court Publishing Company. 1979. p. 31 [As quoted by Don Howard, John Stachel. Einstein: The Formative Years, 1879-1909 (Einstein Studies, vol. 8). Birkhäuser Boston. 2000. p. 1]

    * Nothing in life is certain except death, taxes and the second law of thermodynamics. All three are processes in which useful or accessible forms of some quantity, such as energy or money, are transformed into useless, inaccessible forms of the same quantity. That is not to say that these three processes don’t have fringe benefits: taxes pay for roads and schools; the second law of thermodynamics drives cars, computers and metabolism; and death, at the very least, opens up tenured faculty positions.
    o Seth Lloyd, writing in Nature 430, 971 (26 August 2004)
    http://en.wikiquote.org/wiki/Thermodynamics

    ======================

    “there are no known violations of the second law of thermodynamics. Ordinarily the second law is stated for isolated systems, but the second law applies equally well to open systems.”
    John Ross, Chemical and Engineering News, 7 July 1980

    “…the quantity of entropy generated locally cannot be negative irrespective of whether the system is isolated or not.”
    Arnold Sommerfeld, Thermodynamics and Statistical Mechanics, p. 155

  8. ES:

    Please see here on what happens when probabilities go small enough.

    (There is such a thing as a practical zero for probabilities, i.e. something sufficiently remote that it is not expected to be observed across the lifespan of the observed cosmos. 1 in 10^150 or so is a good enough threshold.)
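    The ~10^150 figure can be reproduced with a back-of-envelope calculation (my own sketch; the resource estimates are the usual round numbers for this bound, not taken from the comment):

```python
import math

# Back-of-envelope reproduction of the ~1 in 10^150 "practical zero":
# roughly 10^80 atoms in the observable cosmos, each changing state
# once per Planck time (~10^45 ticks per second), for ~10^25 seconds.
atoms = 10**80
ticks_per_second = 10**45   # ~ 1 / Planck time
seconds = 10**25            # a generous cosmic lifespan

total_events = atoms * ticks_per_second * seconds
print(f"upper bound on events: 10^{math.log10(total_events):.0f}")
```

    Any event with probability well below 1 in that number of trials is not expected to occur even once in the cosmos's history, which is the sense of "practical zero" used above.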

    And that is another ID-supportive peer reviewed paper.

    BTW, ID is about complex specified information manifested in organisation of elements in narrow target zones of interest in large configuration spaces of possibilities. When we see things like that — e.g. this blog comment, for good reason we infer to design, not lucky noise and/or the same sort of mechanical necessity that makes a dropped heavy object reliably fall.

    GEM of TKI

  9. F/N: Are they going to publish the article, now, as a further correction for the harm done?

  10. Kairosfocus,

    Many Thanks for the link. I am aware of the universal probability bound. However, I do not know of any work that shows how the second law relates to this bound. I would appreciate further pointers.

  11. ES:

    That is what Abel is doing. Look at his remarks on Borel and building on that.

    The Wiki article on the Infinite Monkey Theorem is also — surprise! — insightful. That’s at 101 level.

    GEM of TKI

  12. Elizabeth Liddle

    I agree that the article should have been published, and I can see why it passed peer-review. I think it’s a very clarifying article. In particular, I like this:

    Of course, one can still argue that the spectacular increase in order seen on Earth does not violate the second law because what has happened here is not really extremely improbable. Not many people are willing to make this argument, however; in fact, the claim that the second law does not apply to open systems was invented in an attempt to avoid having to make this argument. And perhaps it only seems extremely improbable, but really is not, that, under the right conditions, the influx of stellar energy into a planet could cause atoms to rearrange themselves into nuclear power plants and spaceships and digital computers. But one would think that at least this would be considered an open question, and those who argue that it really is extremely improbable, and thus contrary to the basic principle underlying the second law of thermodynamics, would be given a measure of respect, and taken seriously by their colleagues, but we are not.

    Yes, that is the argument that must be made. And indeed, it’s exactly the argument that those of us who are not persuaded by ID do make.

    I happen to think it’s a good and well-supported argument, but good to see the point of disagreement pinpointed so elegantly.

  13. Dr Liddle:

    Please cf here on the open system thermodynamics issues.

    GEM of TKI

  14. @Eugene S: Reasoning about “Order” or Entropy is the same as reasoning about Information.

  15. Spontaneous order can happen.
    For example, if you take some water in a glass at ambient temperature, there is a definite probability that all the fast molecules will gather at the top while the slower ones gather at the bottom of the glass, creating the peculiar situation of boiling water at the top and freezing water at the bottom of your glass.

    This is not impossible.

    Entropy just tells us that this is highly improbable, so improbable in fact that it is considered impossible from a practical point of view.

    So yes, there is some probability that somehow the right molecules “gathered” spontaneously at the right place and at the right time and that created the first proto-cell. It’s just so unlikely that for all practical purposes it’s impossible.
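    A toy model (my own illustration, not from the comment) makes the hot-top, cold-bottom glass quantitative: if each of the N fastest molecules is independently equally likely to sit in the top or bottom half, the chance that all N are at the top at once is (1/2)^N.

```python
import math

# Probability that all N "fast" molecules happen to occupy the top
# half of the glass at the same moment: (1/2)^N.
for n in (10, 100, 1000):
    log10_p = -n * math.log10(2)
    print(f"N = {n:4d}: probability ~ 10^{log10_p:.0f}")
```

    Even for a mere 100 molecules the probability is around 10^-30, and a real glass holds on the order of 10^25 molecules, which is why such fluctuations are never observed at macroscopic scale.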

  16. Kairosfocus,

    Solid stuff. Thanks.

  17. Kirilluk,

    Order =/= Information. So reasoning about one is not necessarily the same as reasoning about the other. Ordered systems are not always information-rich. Consider a crystal, which is much simpler than a living cell. Life is order and information. There are measures of information content, such as Kolmogorov complexity, which boils down to measuring the length of the shortest description of an object in some universal language, compared with the size of the object itself.

    BTW, apparently you live in the UK, in which case we are close neighbours :)

  18. Kirilluk,

    Further to my comment.

    Mathematically, everything can be represented as a string of symbols over the same alphabet. E.g. a string A=”000000…” containing 1000 “0″-s can be described with far fewer symbols than its own length. In contrast, pi computed to 1000-digit accuracy admits no description shorter than itself. So its complexity is higher: K(pi) > K(A).
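    Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough, computable upper bound; a minimal sketch (my own illustration, using random bytes as a stand-in for an incompressible string like the digits of pi):

```python
import os
import zlib

# K is uncomputable, but compressed length is a rough upper bound:
# highly ordered strings compress far better than random-looking ones.
ordered = b"0" * 1000          # like the string A = "000..." above
randomish = os.urandom(1000)   # stand-in for an incompressible string

print(len(zlib.compress(ordered)))    # a handful of bytes
print(len(zlib.compress(randomish)))  # close to 1000 bytes, or more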

    We should be careful in distinguishing between impossible and implausible because scientifically almost nothing is impossible :)

    Everyday experience tells us that high enough information complexity is indicative of intelligent agency. We can say by induction that observable life is complex enough and consequently it has been designed. We cannot test this assertion but it is falsifiable.

  19. William J. Murray

    Yes, as long as there is the bare chance that incoming solar energy and cosmic rays could miraculously decrease organizational entropy into spectacular accomplishments of increased informational and material order like spaceships and encyclopedias, that’s all the darwinist/materialist requires to feel satisfied that the 2LoT isn’t being violated in any manner that requires serious examination.

    And so, they can just brush aside Sewell’s work with their double-standard catch-phrase: “If you haven’t proven evolution impossible under the 2LoT, shut the hell up”.

  20. semi OT:

    New Research on Epistatic Interactions Shows “Overwhelmingly Negative” Fitness Costs and Limits to Evolution
    http://www.evolutionnews.org/2.....47151.html

    i.e. Another Day, Another Extremely Bad Day For Neo-Darwinists, Genetic Entropy,,, wins 10^10^123,,, loses 0 !!! :)

  21. H’mm:

    By that standard, I have a perpetual motion machine [second kind] to sell to you . . .

    NOT.

    That is, there is a point where the statistical results on accessible microstates in clusters accumulate until the outcome is practically certain.

    GEM of TKI

  22. Granville Sewell

    I would like to point out that my article does not actually mention ID or Darwinism, and does not even conclude that the second law has definitely been violated here, it just makes the rather obvious point that if you want to believe it has not, you have to be honest enough to admit that what you believe is that what has happened here is not really extremely improbable. You can’t hide behind the absurd “compensation” argument as Asimov and many others do.

  23. The excuse to continue to not publish the article [HT: J G West, ENV]:

    ________________

    >> Unfortunately, Applied Mathematics Letters did not agree to reinstate Sewell’s article, although it did grant Dr. Sewell the right to continue to post online the digital pre-print version prepared by the journal. The explanation for not reinstating Sewell’s article is hard to credit. The journal insists that the editor “concluded that the content was more philosophical than mathematical and, as such, not appropriate for a technical mathematics journal.” That’s right, weeks after Sewell’s article had been peer-reviewed, accepted for publication, and published online, the journal editor suddenly had an epiphany that the article he had accepted was outside the subject area of his journal . . . >>

    ________________

    Oh, sure . . .

  24. Kairosfocus,

    I would like to hear what you think about this.

    http://orthodoxchristian-blogg.....ation.html

    Ta.

  25. So yes, there is some probability that somehow the right molecules “gathered” spontaneously at the right place and at the right time and that created the first proto-cell. It’s just so unlikely that from all practical purposes it’s impossible.

    Well and good, except that that is not anything like how I have understood the subject. I may be entirely wrong, but to me it appears that the opinion is that abiogenesis must have been an extensive chain of events, possibly both in outer space and here on earth.

    I believe that the idea of a ‘spontaneous coming together at the right place at the right time’ is something that one will not find in any science book. I have often encountered similar claims from creationist sources, though. I just wonder why they are so persistent.

    I have a habit of searching for scientific sources when I want to know what science has to say about a particular subject. For the very interesting subject of OOL research, I last read “The Emergence of Life on Earth” by Iris Fry. Would anyone need 283 pages of text to describe a coming-together event? With 16 pages of bibliography, it makes me suspect a lot more has been written too.

    I am confident that much more has been learned and a lot more has been written about the subject since its publication in 1999 too.

  26. LOL

    Following this story around the net through the various links, I come across DvK’s review of Meyer’s Sig in the Cell.

    What a hoot.

    Hey David…

    “Q has to be followed by U” ??

    How absurdly weak! Wow David, you really got to the heart of the matter. Powerful stuff.

  27. While the journal clearly violated its own rules in cancelling the publication at the last moment I can understand why they still refused to publish. The article is silly and can be easily contradicted by obvious examples.

    Consider a tornado. How likely is it that air molecules would spontaneously form themselves into such a self-sustaining structure? By itself a tornado clearly violates the 2LOT, but the exchange of energy in the atmosphere between the sun’s heat and the condensing water vapor provide a sink for entropy so that the law is not violated.

    In effect a tornado is a “dissipative structure” which transfers energy and creates entropy. That Sewell seems to be ignorant of the literature on this topic is what so embarrassed the journal; being “mere” mathematicians they were ignorant of it as well.

    The mind boggles. Who supposedly reviewed this paper? Did Sewell go out of his way to pick people who were ignorant of the topic?

  28. ES:

    Quick skim.

    Typically Russian!

    (NB: I have a shelf or so of Russian tech, math and sci books in English, courtesy MIR. Loved Zeldovich [the janitor turned physicist], he of physics of explosions.)

    Interesting, and reflective of the (is it five?) years of physics and four years of calculus in high school that all Russian students were subjected to back then; dunno if still so. I am not so sure that IC is the MAIN ID argument, but that is minor. I actually think the IC and the FSCI arguments are closely linked once we see that multipart co-tuning at an operating point is going to run past 1,000 bits of information in yes/no questions, very fast.

    His discussion on rolling into a pit, inverting the usual topology for hill climbing, is instructive. (The inversion makes the hill-climbing problem of local vs global maxima very clear — typically Russian again. All you gotta add in is flat, frictional plains . . . to model seas of non-function and finite resource “energy” to move about. [I used to talk about leaky rafts floating at random on the seas . . . ])

    Worth developing further.

    GEM of TKI

  29. PS: I should have looked at the profile! The flavour of thinking about minima and analysis is so typically Russian, I think it is stamped in in HS [!], but of course we are looking at the work of a PhD in Physics!

  30. It always bemuses me how science is no longer conducted within the laboratory, but in the courtroom.

  31. BTW, the paper won’t be reinstated, so it looks like the only winners are the lawyers. Again.

  32. And, Dr ES:

    My basic point in App. I of my note is to look again at the sub-components in Clausius’ chief example.

    The entropy on balance of the isolated system is at least constant, but the closed systems interacting through heat exchange show the key point as to how injection of energy not coupled to an energy conversion device ALSO tends to increase entropy.

    In this context shaft work can be seen as going to either order or organisation, depending on how it is coupled — a cam-bar with a stepper would do a program, as say Paley’s self-replicating watch from Ch II of his Nat Theol would exemplify. (His “baffles” suggest Coanda effect fluidic logic.)

    Order, here, is specific and symmetrical or periodic, but not complex and aperiodic as organisation tends to be. Organisation then leads to tightly identified, narrow zones in configuration spaces [which are of course cut down from phase and state spaces], and the inference to design on being in a special zone is closely parallel to being in the special zone of phase space from the classic example of unmixing the O2 in a roomful of air; undoing diffusion.

    If you look at the online TMLO [download here], you will see that Thaxton et al develop the thought of in effect diffusion in a linear space — a long chain molecule. Bradley then did a version that moved to CSI and the information paradigm as opposed to the thermodynamics one. But the link between the two should not be denigrated, following Jaynes et al. Growing in recognition in recent years.

    The key issue is that we are now looking at an observed event E from a zone T in a space S, where S is sufficiently large that solar-system or even cosmos-scope resources will not suffice to sample enough of the config space, by random-walk-driven processes feeding into trial and error, to make it credible that we will ever reach the zone T, or even T1, T2, . . . Tn.

    In the case of 1,000 bits of stored info, there are 1.07*10^301 possible configs, which makes the ~10^150 states reachable by 10^80 atoms changing state every Planck time for 10^25 s a hopelessly small fraction of the space.
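    The arithmetic here is easy to check (a quick sketch; the comparison against a squared resource bound is my own addition):

```python
# 1,000 bits of stored information give 2^1000 possible configurations.
configs = 2 ** 1000
print(f"{configs:.2e}")  # 1.07e+301, matching the figure quoted above

# Even squaring the ~10^150 cosmic-resources bound falls short:
print(configs > (10**150) ** 2)
```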

    So, if we see E from T in S, we are better advised to infer to design, even if we have no other direct evidence on whodunit or how tweredun.

    But of course, as that cuts across the strong beliefs of many, it is controversial.

    But it should not be: we are saying little more than that if you see a blog post, you infer to a blogger, not lucky noise.

    Or, if you see a program in action that works, you infer to intentional and knowledgeable programming not lucky noise.

    Which of course brings us to the reduced version of the Dembski metric:

    Chi_500 – I*Z – 500, specific [Z = 1 if so, 0 otherwise] bits beyond the threshold of sufficient complexity.

    GEM of TKI

  33. OOPS:

    Chi_500 = I*Z – 500, specific [Z = 1 if so, 0 otherwise] bits beyond the threshold of sufficient complexity

  34. semi OT: One thing that has been made clear to me by Dr. Sewell’s trials, and many other ID advocates trials, is how dogmatic, and deceptive, neo-Darwinists can be in protecting their atheistic theory. For me this is a sure clue that there is far more going on here than meets the eye.,,, As John Lennox states in this video at 6:15 minute mark;

    John Lennox – Science Is Impossible Without God – Quotes – video remix
    http://www.metacafe.com/watch/6287271/

    “Yes Ladies and Gentlemen, what we are discovering is this; that there is a battle for our minds, and there are two worldviews.”

    ,,,I would go even farther than John Lennox did, and state that it not only is a battle for our minds, but is a battle for our eternal souls as well!!!

    Skillet – Awake and Alive – music video
    http://www.youtube.com/watch?v=2aJUnltwsqs

    Flyleaf – Chasm – music video
    http://www.youtube.com/watch?v=O-BvOuE7wfw

  36. Kairosfocus,

    Many Thanks for your comments and for your time. Of course, there is nothing new there, I just wanted to summarise for myself all that I considered important in that most controversial issue.

  37. @Cabal: The idea behind the word “gather” is that all the events leading to the creation of the first proto-cell are random and, like the molecules in a glass of water, are governed by the laws of physics. In order to break the second law’s speed limit, you need either a new law, not yet discovered, that would be able to produce genetic information (but then law and information are two contradictory processes) or an intelligent agent.

    @Eugene S: Yes I live in London. Do you live in Europe?

  38. Kyrilluk,

    I live in Ipswich, Suffolk, and work for a private company delivering logistics solutions and services. I have an academic past and I have recently become interested in Intelligent Design.

  39. Gordon Davisson

    I’ve now read Sewell’s AML paper, and while I’m not a proper expert on thermodynamics, I see some pretty serious problems with it. I’ll take a quick stab at explaining them here.

    1: Identifying order with negative entropy doesn’t really work.
    In his paper, Sewell defines order as negative entropy. This definition of order sounds plausible, since entropy is (sort of) a measure of disorder, and order is (sort of) the opposite of disorder, but negative entropy winds up not being much like our intuitive sense of what order is.

    For one thing, entropy has an absolute value (defined by the third law of thermo), and for real systems the entropy is always positive. This means that order (per Sewell’s definition) is always negative.

    For another, the entropy of several objects together is (generally) the sum of their individual entropies. For example, the entropy of a rock and a watch (nods at Paley) is the sum of the rock’s entropy and the watch’s entropy. This means that the entropy of rock+watch is greater than that of the rock alone, which means that the order of rock+watch is less than the order of the rock alone.

    Worse than that: a typical human has trillions of bacteria living in them, which means the entropy of a human is the sum of trillions of bacteria’s entropies plus the entropy of their non-bacterial parts. Which means that by Sewell’s definition, a human has far lower order than a single bacterium. This does not match at all with what I mean when I speak of “order”.

    (Intuitively, it seems to me that what’s going on here is that a human is both more ordered and more disordered than a bacterium — there’s more to a human than there is to a bacterium, so this isn’t a contradiction. Similarly, rock+watch should be considered both more ordered and more disordered than the rock alone.)

    2: The mathematical analysis only applies to things (heat, etc) diffusing through a solid.
    If anything other than that is involved — liquids, gases, chemical reactions, etc. — the math in the paper simply doesn’t apply. This would be survivable if the conclusions were true in general (although a better analysis would be needed to justify them), but they clearly are false in general. Let me give a simple example of a situation where Sewell’s results (the ones near the middle of page 3) don’t hold:

    Consider a jar containing nitrogen gas and some powdered graphite (a form of carbon). Let me start with the jar’s contents thoroughly mixed: the graphite is uniformly scattered throughout the volume inside the jar, and it and the gas are all at the same temperature.

    What happens if the jar is isolated (except for gravity), and just left to sit for a while? All of the graphite will settle to the bottom; its arrangement is then more ordered, and in fact the carbon entropy within the jar has decreased. This does not, however, violate the second law of thermodynamics (as I’ll explain in a bit).

    Something even more interesting happened to the distribution of thermal energy within the jar. As the graphite particles settle to the bottom, their gravitational energy is converted to kinetic energy (their downward motion), and then that’s converted to heat (both by friction as they fall, and the inelastic collisions when they hit the bottom of the jar). More of this heat is released near the bottom of the jar than the top, so the jar’s contents will become warmer at the bottom than the top. Intuitively (to me at least), this means that the heat has become more ordered. But the thermal entropy went up because there is more heat than at the beginning, and this outweighs the entropy decrease from nonuniformity.

    (As with human vs. bacterium, it seems to me that the best way to think of this is that since there’s more heat than there was at the beginning, it’s not a contradiction that the heat can be both more ordered and more disordered than it was at the beginning.)

    (BTW, the heat will eventually even out, removing the thermal order and increasing the thermal entropy even further.)

    So how does this fit with the second law of thermodynamics? It turns out that the increase in thermal entropy is larger than the decrease in carbon entropy, so the total entropy has increased, and the second law is satisfied. Sewell claims that the second law applies separately to each different kind of entropy, but this is not true in general. The only reason it works out that way in Sewell’s math is that there’s no coupling between the distribution of heat and carbon when they’re diffusing through a solid; anytime there’s any coupling between them, you can have conversion from one form of entropy to another.
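    The jar's bookkeeping can be sketched numerically (all numbers below are illustrative choices of mine, not from the comment): thermal entropy gained as the released gravitational energy becomes heat, versus configurational entropy lost as the particles are confined near the bottom.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Illustrative numbers: N graphite particles of mass m fall an average
# height h inside a jar held at temperature T.
N = 1e18    # particles
m = 1e-15   # kg per particle
h = 0.05    # metres fallen on average
T = 300.0   # kelvin
g = 9.81    # m/s^2

# Thermal entropy gained: the gravitational energy ends up as heat.
dS_thermal = N * m * g * h / T

# Configurational entropy lost: each particle confined from the whole
# jar volume to ~1% of it near the bottom.
dS_config = -N * k_B * math.log(100)

print(dS_thermal + dS_config > 0)  # True: total entropy still increases
```

    With these numbers the thermal gain (~1.6 J/K) dwarfs the configurational loss (~10^-4 J/K), which is the sense in which the settling does not violate the second law.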

    Tradeoffs between different forms of entropy are actually very common. Crystallization is an obvious example: a crystal has low entropy, but also low energy; as long as the energy difference (released as heat) corresponds to a large enough entropy increase to compensate for the inherent entropy decrease of crystallization, then crystallization will be thermodynamically spontaneous. The forms of the second law framed in terms of free energy are based on this tradeoff.

    (I should also note that it’s not always possible to define some specific types of entropy. As I discussed in this earlier comment, splitting the total entropy into separate components [thermal, carbon, etc] depends on separating the system’s degrees of freedom into categories [again, thermal, carbon, etc] and calculating the entropy relating to those degrees of freedom. If the system’s degrees of freedom can’t be separated and categorized, you can’t define the components of entropy. For example, if the jar in my example contained normal air rather than just nitrogen, the carbon entropy would’ve had to include the entropy of the carbon in carbon dioxide [about 0.04% of normal air]; but because that carbon can’t move independently of the oxygen, you can’t separate the carbon-related degrees of freedom from the oxygen-related degrees of freedom, so the carbon-entropy can’t be separated from the oxygen-entropy. Actually, even without CO2 the degrees of freedom are still not entirely separable, so the definitions of carbon-, and thermal-entropy are a bit fuzzy if you look too close. But don’t worry, the total is still well-defined, and that’s all that really matters thermodynamically.)

    Now, let me consider a small change to my example: rather than being isolated, suppose the jar is able to exchange heat with its surroundings (which I’ll assume are a heat reservoir — basically something big and inert that can absorb heat but not complicate matters by doing anything else). After the carbon settles to the bottom of the jar, the heat it released flows from the jar into its surroundings, leaving the jar’s thermal entropy back where it started. This doesn’t change things much, the only difference is that now the compensating increase in thermal entropy happens in the surroundings rather than the jar. This is essentially what Sewell’s equation #5 describes, although that specific equation only applies to heat transfer via diffusion through a solid; if the heat is transferred some other way, you need a different formula for the entropy flux. For near-equilibrium heat flows, it’s actually very simple: the entropy flux is the heat flux divided by the absolute temperature.

    BTW, there’s a very simple and (IMHO) intuitive way to describe this form of the second law: entropy can be produced, and can move around and change form, but cannot be destroyed. This isn’t technically correct, since entropy isn’t actually a thing (it’s a property), but as long as you remember it’s just a metaphor, it works quite well.

    3: The tautology section confuses improbability with second-law violations.
    First, I should note that the tautology itself (“if an increase in order is extremely improbable when a system is closed, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable”) is basically correct (that’s what a tautology is: something so trivially true that it doesn’t actually tell you anything), but I have some issues with how it’s phrased: it implicitly assumes that the only form of interaction the system might have with its surroundings is that something enters the system. In my example of the jar, nothing entered the jar at all; heat left it, and even then heat isn’t an actual “thing”, so nothing actually entered or left the jar. To be strictly true, the statement needs to take into account all kinds of interactions between the system and its surroundings.

    But Sewell also makes a serious logic error in thinking about the consequences of the tautology: he assumes that if something is wildly improbable, it violates the second law of thermodynamics. Only the reverse is true: if something violates the second law, it is wildly improbable (effectively impossible), but Sewell’s version does not hold. For example, consider hydrogen fusion at low temperatures: it’s strongly favored by thermodynamics (even more so than at high temperatures), but doesn’t actually seem to happen at all (at least, under normal conditions). When Sewell says:

    Thus, unless we are willing to argue that the influx of solar energy into the Earth makes the appearance of spaceships, computers and the Internet not extremely improbable, we have to conclude that the second law has in fact been violated here.

    His conclusion simply does not follow from the premise. To support the conclusion, you’d have to calculate the Earth’s net entropy flux (sunlight in vs. thermal radiation out), and compare that to the entropy change needed for spaceships, computers, etc to appear.
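    For scale, here’s a crude back-of-envelope sketch of that flux (my own rough figures, not Sewell’s, and it ignores refinements like the 4/3 factor for radiation entropy): sunlight arrives at roughly the Sun’s surface temperature, and the same power is re-radiated at the Earth’s much lower effective temperature, so the Earth exports far more entropy than it imports.

    ```python
    # Rough, assumed figures for the Earth's radiative entropy budget.
    P = 1.2e17       # solar power absorbed by the Earth, in watts (approximate)
    T_sun = 5778.0   # effective temperature of incoming sunlight, in kelvin
    T_earth = 255.0  # Earth's effective radiating temperature, in kelvin

    # Entropy flux = heat flux / absolute temperature for each stream.
    S_in = P / T_sun     # entropy carried in by sunlight, W/K
    S_out = P / T_earth  # entropy carried out by thermal radiation, W/K

    net_export = S_out - S_in  # positive: room for local entropy decreases
    print(f"in: {S_in:.2e} W/K, out: {S_out:.2e} W/K, net export: {net_export:.2e} W/K")
    ```

    The point of the sketch is only that the net flux is enormous; comparing it to the entropy change of any particular process is the calculation the quoted argument skips.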

    Sewell also makes a similar mistake near the end of the video, when he infers that information loss is a consequence of the second law of thermodynamics. Actually, information’s relationship with entropy is not that simple (see my discussion here). In fact, information erasure actually corresponds directly to an entropy decrease, and can only occur if it’s coupled to a larger increase (e.g. in thermal entropy, as per Landauer’s principle).
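    For the curious, here’s a small sketch of Landauer’s bound (my own illustration of standard physics, not something from Sewell’s paper): erasing a bit reduces informational entropy by k_B ln 2, so at least that much thermal entropy, and hence at least k_B T ln 2 of heat, must be produced.

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def landauer_entropy_increase(bits):
        """Minimum thermal entropy produced when erasing `bits` of information:
        at least k_B * ln(2) per bit, compensating the informational decrease."""
        return bits * k_B * math.log(2)

    def landauer_heat(bits, temperature):
        """Minimum heat dissipated at the given absolute temperature (kelvin)
        for the same erasure."""
        return temperature * landauer_entropy_increase(bits)

    # Erasing one bit at room temperature (300 K) costs roughly 2.9e-21 J.
    print(landauer_heat(1, 300.0))
    ```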

  40. Now reported in THE Times Higher Education:
    Second thoughts result in payout

    The apology, which has been posted on the journal’s website, confirms that Professor Rodin withdrew the article without consulting Professor Sewell after concluding that its content was “more philosophical than mathematical”. The journal and Professor Rodin “provide their sincere and heartfelt apologies to Dr Sewell for any inconvenience or embarrassment”, it says.

  41. Gordon Davisson

    The heart of Sewell’s analysis is clarifying that “order” must cross the boundary into an open system for “order” (complex specified information) to appear within that system. That is in contrast to the formation of “order” not being possible within a closed system.

    Appreciate your effort. Now, does it have substance?

    If you have no comment on Sewell’s mathematics, then we take that as correct.

    Are you disputing in any way Sewell’s conclusions based on this math? Or only the semantics?

    “there heat isn’t an actual “thing”, so nothing actually entered or left the jar.”
    That is but sophistry and equivocation over the definition of entropy as non-negative and the common usage of “order” as also positive. Sewell clearly addresses thermal entropy and heat transfer. See Qt in equation (3). Heat flow is a transfer of energy. Physics treats energy as a “thing” as real as matter.
    For clear mathematical discussion Sewell explicitly distinguishes between the common use of “order” as positive, and the technical definition of “thermal order” and “X-order” as necessarily negative because entropy is by definition non-negative.

    Your comment “by Sewell’s definition, a human has far lower order than a single bacterium” is a similar equivocation addressing semantics rather than substance.
    Using “lower” rather than “greater” is but mixing the common meaning of “order” and “more” versus Sewell’s mathematical/physics meaning of “X-order” he defined as being negative and increasing in the negative direction.

    See Charles comments addressing some of the weaknesses in your comments on information.

    So I encourage you to grapple with the substance of Sewell’s development that external “order” (information) is necessary for such “order” to develop in an otherwise closed system.

  42.

    DLH:

    Are you disputing in any way Sewell’s conclusions based on this math? Or only the semantics?

    Both. I’m obviously much more concerned about the substance of his conclusions, but there are a few places where I think the semantics are confused enough to be important. I’ll try to pin down my position. Issues of pure substance first:

    * Is Sewell’s math correct? Yes, but it doesn’t apply to the relevant situations, and hence cannot support any relevant conclusion. (More specifically: it only applies to diffusion through a solid; if there’s anything other than that involved, it’s inapplicable.)

    * Does the second law allow entropy decreases to be compensated by unrelated entropy increases somewhere else? No, Sewell is correct here. His specific formula is only narrowly applicable, but my understanding is that the more general principle is both well established and well known.

    I’m used to seeing this in the form dS = dS_i + dS_e, where dS is the (infinitesimal) entropy change of the system, dS_i is the entropy produced inside the system, and dS_e is the entropy flux into the system (entropy in minus entropy out). In this formulation, the second law simply says that dS_i cannot be negative. Sewell cites an equivalent version at 12:29 to 13:05 in the video.
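    A toy bookkeeping example (my own numbers) may make the dS = dS_i + dS_e formulation concrete: the system’s entropy can fall even though internal production is non-negative, provided enough entropy flows out across the boundary.

    ```python
    # Toy numbers: entropy changes over some small interval, arbitrary units.
    dS_i = 0.3   # entropy produced inside the system; second law: dS_i >= 0
    dS_e = -0.5  # net entropy flux into the system (negative: more leaves than enters)

    dS = dS_i + dS_e  # total change in the system's entropy

    assert dS_i >= 0  # the only constraint the second law imposes
    print(dS)  # -0.2: the system's entropy decreases, with no violation
    ```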

    * Does the entropy flux (/order/whatever) into/out of the system have to be the right kind to allow a particular kind of entropy decrease (/order increase/whatever) in the system? No, Sewell is wrong here. I gave an example of carbon entropy being converted into thermal entropy (and then leaving the system in that form); this sort of thing is entirely normal, and not restricted at all by the second law.

    * Does Sewell’s paper provide an argument against naturalistic evolution? Not that I can see. The tautology part doesn’t tell you anything you haven’t assumed going in: assume that the energy flows into & out of the Earth aren’t sufficient to allow evolution, and it tells you that they can’t allow evolution. Assume they are sufficient, and it tells you they can allow evolution. Without some way to tell what’s sufficient and what isn’t, there’s no way you can really justify either position. If Sewell had been right about different kinds of entropy being separate (see above) you could make a case here, but that part he’s wrong about.

    Now, for the mostly-semantic issues:

    * Can order be identified with negative entropy? I don’t see a way to make this work, for the reasons I gave (I’ll reply to your discussion on this in a bit). I think this is actually somewhat important, because calling negative entropy “order” is actively confusing and misleading. (Note: your identifying order with complex specified information confuses things even further; crystals are ordered, but that doesn’t mean they have CSI.)

    * Can different kinds of entropy (thermal entropy, carbon entropy, etc) be defined and reasoned about? This works under some circumstances, but not others; any reasoning based on separate kinds of entropy is going to fall apart under many circumstances.

    * Is “something … entering” all that can affect probabilities (/entropy/order/whatever) inside a system? My issue with energy not being a “thing” is an unimportant quibble, and if “thing” is meant broadly I have no problem. Ignoring what’s leaving the system is, however, more important (energy entering a system almost never allows an entropy decrease; heat leaving generally does).

    * Can an improbability argument justify a claim of second law violation? I don’t think so. As Sewell pointed out, he later in the paper phrased this as “…violates the underlying principle behind the second law…”, but I think even that takes it too far. The second law is an extremely well-tested law of nature, and if you want to claim its support for your argument, your argument should follow from the version of the second law that’s been well-tested, not just something kind of similar.

    Finally, let me take a look at your response to my objections about “order” vs. entropy:

    For clear mathematical discussion Sewell explicitly distinguishes between the common use of “order” as positive, and the technical definition of “thermal order” and “X-order” as necessarily negative because entropy is by definition non-negative.

    I don’t see anywhere that Sewell does this. He does talk about its rate of change (i.e. entropy is increasing, therefore order is decreasing), but that’s different from its sign.

    Your comment “by Sewell’s definition, a human has far lower order than a single bacterium” is a similar equivocation addressing semantics rather than substance.
    Using “lower” rather than “greater” is but mixing the common meaning of “order” and “more” versus Sewell’s mathematical/physics meaning of “X-order” he defined as being negative and increasing in the negative direction.

    First, as I read Sewell, X-order is just the common-meaning order of some particular part or property of the system (e.g. carbon-order, oxygen-order, thermal order, etc.). X is just a placeholder for whatever specific kind of order is under consideration.

    As for the direction of increase, I’m afraid I can’t make sense of your explanation. Let me give a simple situation with some numbers attached, and see if that clarifies things. Suppose we have a rock that’s warm on one side and cold on the other, and that it has an entropy of 100 cal/K. If we isolate it and leave it for a while, the heat will even out. Obviously, this’ll increase its entropy; let’s say its final entropy is 101 cal/K.

    What can we say about the rock’s order? As I read Sewell, he’d say that it has become less ordered (by 1 cal/K), which is fine. But his definition also implies that its order is negative (-100 cal/K at the beginning, -101 cal/K at the end).
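    For concreteness, here are the same numbers as a trivial calculation, with order taken as negative entropy (per my reading of Sewell):

    ```python
    # The rock example: entropy rises as the heat evens out, so "order"
    # (defined as negative entropy) falls -- and is negative throughout.
    S_initial = 100.0  # entropy of the unevenly heated rock, cal/K
    S_final = 101.0    # entropy after the heat evens out, cal/K

    order_initial = -S_initial  # -100 cal/K
    order_final = -S_final      # -101 cal/K

    delta_order = order_final - order_initial
    print(delta_order)  # -1.0: the rock became less ordered by 1 cal/K
    ```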

    If I read you right (‘“X-order” he defined as being negative and increasing in the negative direction’), you seem to imply that since the rock’s “X-order” increased in the negative direction, it became more ordered as it evened out. Am I reading you right here?
