Since a commenter on my last post brought up thermodynamics to support evolution, I decided to offer my personal answer in a dedicated post, although UD has already dealt with this issue. As is well known, the 2nd law of thermodynamics (SLOT, also called the "entropy law") states that in an isolated system (often loosely called a closed system) the overall entropy S never decreases spontaneously (i.e. without external intervention). Example: in a room (considered an isolated system), a hot cup of coffee on a tabletop, losing heat, decreases in entropy, –ΔSc (negentropy). The environment around the table, absorbing that heat, increases in entropy, ΔSe, in such a way that the overall entropy of the room, ΔSr, does not decrease. In this example SLOT can be expressed with the formula:
ΔSr = ΔSe – ΔSc = ΔQe/Te – ΔQc/Tc ≥ 0 (measured in joules per kelvin, J/K)
where the ΔQx are amounts of heat and the Tx absolute temperatures.
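The coffee-and-room example can be checked numerically. Here is a minimal sketch; the temperatures and the amount of heat transferred are my own illustrative assumptions, not values from any real measurement:

```python
# Illustrative (assumed) values: a cup of coffee at Tc = 350 K
# transfers Q = 100 J of heat to a room environment at Te = 300 K.
Q  = 100.0   # heat transferred, in joules (assumed)
Tc = 350.0   # coffee temperature, in kelvin (assumed)
Te = 300.0   # environment temperature, in kelvin (assumed)

dS_coffee = -Q / Tc            # entropy change of the coffee (negative)
dS_env    =  Q / Te            # entropy change of the environment (positive)
dS_total  = dS_env + dS_coffee # overall entropy change of the room

print(f"dS_coffee = {dS_coffee:.4f} J/K")
print(f"dS_env    = {dS_env:.4f} J/K")
print(f"dS_total  = {dS_total:.4f} J/K")
```

Because heat flows from the hotter body (Tc > Te), the environment's entropy gain outweighs the coffee's entropy loss, so dS_total comes out positive, as SLOT requires.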
In statistical mechanics another definition of entropy is also used. The statistical entropy H of a system, given the number W of its microscopic configurations (or microstates), can be written as:
H = log2 W (measured in bits; log2 is the base-2 logarithm).
To correlate the two definitions of entropy, S (thermodynamic entropy) and H (statistical entropy), one writes the entropy in Boltzmann's form:
S = k * ln W (k is Boltzmann's constant, 1.38 × 10^-23 J/K; ln is the natural logarithm)
The constant k is introduced to match the units of measure (J/K instead of bits).
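The relation between the two forms of entropy can be made concrete with a small sketch. The number of microstates W here is an assumed toy value, chosen as a power of two so that the bit count comes out exact:

```python
import math

k = 1.380649e-23  # Boltzmann constant, in J/K

W = 2**10         # assumed number of microstates (toy value)

H_bits = math.log2(W)     # statistical entropy, in bits
S_jk   = k * math.log(W)  # Boltzmann's form, in J/K (natural log)

print(f"H = {H_bits} bits")
print(f"S = {S_jk:.3e} J/K")
```

The same count of microstates yields two numbers in different units; the factor k (together with the change of logarithm base) is what converts between them.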
If a system A is more improbable (more complex) than a system B, it means that the microstates Wa consistent with the specification of A (chosen from a given universe of microstates U) are fewer than the microstates Wb that meet the specification of B (chosen from the same universe U):
Wa < Wb
As a consequence the statistical entropy of A is less than the statistical entropy of B:
Ha < Hb
A smaller probability signifies more information (because the information of a sequence of characters or bits is inversely proportional to its probability of occurring), so system A contains more information than system B. In this sense statistical entropy is a measure of the lack of information (or of ignorance). According to another, similar point of view, since the more microstates there are, the more disorder, system A is said to be more ordered and system B more disordered. Along this line, statistical entropy becomes a sort of measure of disorder (and statistical negentropy a measure of order).
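The relations Wa < Wb ⇒ Ha < Hb and "smaller probability ⇒ more information" can be verified together in a short sketch. All counts below (U, Wa, Wb) are assumed toy values of my own, picked as powers of two so the logarithms are exact:

```python
import math

# Assumed toy universe of U equally likely microstates, of which
# Wa match specification A and Wb match specification B.
U  = 2**20   # total microstates (assumed)
Wa = 2**4    # microstates consistent with A (assumed)
Wb = 2**12   # microstates consistent with B (assumed)

Ha = math.log2(Wa)   # statistical entropy of A, in bits
Hb = math.log2(Wb)   # statistical entropy of B, in bits

pA, pB = Wa / U, Wb / U   # probability of meeting each specification
Ia = -math.log2(pA)       # information content of A, in bits
Ib = -math.log2(pB)       # information content of B, in bits

print(f"Ha = {Ha} bits, Hb = {Hb} bits")   # Ha < Hb
print(f"Ia = {Ia} bits, Ib = {Ib} bits")   # Ia > Ib
```

With these numbers A has fewer matching microstates, hence lower statistical entropy (4 vs 12 bits) and, being less probable, higher information content (16 vs 8 bits), which is exactly the inverse relation described above.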
Given the above scientific scenario, evolutionists have elaborated a flawed argument that sounds something like this: while, according to SLOT, the overall entropy of a closed system doesn't decrease, there can nevertheless be downward entropy fluctuations somewhere in the system. In short, while entropy doesn't decrease globally, it can decrease locally. Since, according to the statistical definition of entropy, a decrease of entropy implies more information and complexity, evolution (understood as a continuous process increasing both) is possible locally on the planet, even if the global entropy of the universe increases. However, to justify the continuity of evolution, they need something more than a few rare fluctuations; they need an endless sequence of negentropies. They believe that the cause or origin of this series of negentropies could well be the Sun, which continually emits heat towards the Earth, allowing the continuous biological evolution of organisms (from simple forms to ever more complex forms). Daniel F. Styer's paper "Entropy and evolution" makes this argument quantitative and shows that a very small part of the entropy flux from the Sun is sufficient to allow the evolution of all the organisms that have arisen on Earth. However, the author himself admits that his calculations allow or permit evolution; they do not require it. As we will see below, this article is not at all a proof of evolution, because it considers evolution only from the energy viewpoint and not from the organization viewpoint (which is the essential one).
Below I will provide an explanation that is at the same time a disproof of the above evolutionist argument and a statement of the ID argument from thermodynamics against evolution.
First off, a decrease of entropy, despite the fact that it can be measured in bits, is not at all what Intelligent Design Theory (IDT) calls "Complex Specified Information" (CSI). The order of negentropy is not CSI, which is organization itself. Whenever and wherever there is organization, only CSI can have produced it. This is a basic statement which Norbert Wiener, just before the rise of ID, expressed thus: "The amount of information in a system is a measure of its degree of organization" ("Cybernetics", Introduction). When we deal with organization we always have CSI. But entropic order is not true organization, and as such cannot account for the complexity of organisms, which are highly organized systems.
A misunderstanding that causes the evolutionist's error is that statistical entropy, Shannon information and CSI can all be measured in bits. But the simple fact that two things can be measured in the same unit doesn't mean that they are the same thing or do the same work. A scientist and a porter are both paid in dollars, but their jobs are different. Analogously, negentropic bits are not bits of organization, but rather bits of simple order. One can still speak of information when dealing with entropy expressed in bits, but these bits have nothing to do with the CSI of organization. As such they cannot account for the spontaneous generation of CSI systems such as organisms. As an example, the immense organization of a biological cell has nothing to do with the simple order of a crystal (whose formation implies a decrease of entropy).
ID theory says that organization is different from a simple decrease in thermodynamic entropy, because the former implies CSI while the latter doesn't. In fact CSI is not simple information but information that is both complex and specified. The question is: can the information related to a decrease-in-entropy event on the Earth be complex and specified? If this event were complex and specified, its cause would have to be so too (as a general principle, what is contained in the effect must be potentially in its cause). The cause, according to evolutionists, is the heat coming from the Sun. Hence the question becomes: can the information related to an energy flux from the Sun be complex and specified? We can admit that this energy flux is complex, but with the kind of complexity of a long random sequence. In fact we could, for example, convert the measured analog data stream of the solar energy flux by means of an ADC (Analog-to-Digital Converter). Likely the sampled sequence of bits (obtained by the analog-to-digital conversion) is complex (of low probability). But surely this data stream is not specified, in the sense in which IDT understands specification (predefined patterns). No particular predefined pattern (of the kind we see in biological systems) is recognizable in an energy transfer from the Sun. To claim otherwise would mean that the energy transfer is somehow "modulated" or "codified" according to pre-specified patterns (as radio/TV transmissions or the sound waves of a speech are): a clear absurdity. Lacking specification, in no way does the energy flux from the Sun convey CSI to the Earth. To put it bluntly, the Sun sends energy, not organization. As a consequence the Earth-is-not-a-closed-system evolutionist objection (raised to escape the ID argument from thermodynamics) is not valid.
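The claim that an ADC-sampled noise stream, however complex, matches no predefined pattern can be illustrated with a toy sketch. Everything here is my own illustrative assumption: the "ADC output" is simulated as random bits, and the "predefined pattern" is an arbitrary 16-bit alternating sequence standing in for a specification:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

# Toy stand-in for an ADC-sampled noise stream (assumed model,
# not real solar-flux data): 10,000 random bits.
stream = [random.randint(0, 1) for _ in range(10_000)]

# An arbitrary predefined pattern playing the role of a
# "specification" (purely illustrative choice).
pattern = [0, 1] * 8  # 16 bits, alternating

# Count occurrences of the pattern in the stream, and compare with
# the number expected by pure chance for a random 16-bit window.
positions = len(stream) - len(pattern)
hits = sum(stream[i:i + len(pattern)] == pattern for i in range(positions))
expected = positions / 2**len(pattern)

print(f"pattern hits: {hits} (chance expectation ~{expected:.2f})")
```

The pattern turns up only at (or near) the chance rate, i.e. the stream carries no recognizable specification even though, as a long random sequence, it is highly improbable. This is only a cartoon of the distinction the text draws, not a measurement of anything.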
Another way to consider SLOT from an ID "no free lunch" perspective is this: SLOT states that order cannot come from nothingness; order must always have a source or counterpart. It is also in this sense that SLOT supports ID and denies evolution. In fact, if mere order needs a source, then a fortiori organization (which has a higher rank than order) does. In a system, organization can increase only if the system is not closed and an external CSI source inserts CSI into it. This must be the case for the ID origin of life and of species on the Earth: an external intelligent source provided the CSI.
In a sense, the evolutionist "Sun argument" means that CSI can be paid for with simple energy. But energy cannot create CSI. To argue by absurdity: if energy provided CSI then, for example, software houses could stop paying expensive computer programmers and buy power plants instead; publishers wouldn't pay writers, they would buy power supplies; and so on. Thermodynamics states that in any energy conversion the quantity of energy is the same before and after the conversion, but the quality decreases. No energy conversion is 100 percent efficient. In thermodynamic processes there is quality degradation (entropy) of energy. But if even energy quality decreases, then a fortiori organization, which is far more qualitative than energy, cannot spontaneously increase.