Shannon Information, Entropy, Uncertainty in Thermodynamics and ID

This essay is intended to give a short overview of the textbook understanding of Shannon Information, Entropy, and Uncertainty as they relate to thermodynamics and Intelligent Design. Technical corrections are welcome.

The phrases “Shannon Information”, “Shannon Uncertainty”, and “Shannon Entropy” refer to essentially the same quantity. The most familiar everyday usage of the notion of Shannon Information is in the world of computing and digital communication, where the fundamental unit is the bit.

When someone transmits 1 megabit of information, the sender is sending 1 megabit of Shannon information, the receiver is getting a reduction of 1 megabit of Shannon uncertainty, and the ensemble of all possible 1-megabit configurations has 1 million binary degrees of freedom (2^1,000,000 possible configurations), as described by 1 megabit of Shannon entropy.

In everyday usage the word “Information” is preferred over “Uncertainty” and “Entropy”, even though, as can be seen, all three yield essentially the same number.

The information measure and the probability of any given configuration (microstate) can be converted into one another by the following formula:

2^(-I) = P

conversely

I = -log2(P)

Where I is the information measure in bits and P is the probability of finding that particular configuration (microstate).

Examples:
The Shannon information of a configuration of 500 fair coins (say, all heads) is 500 bits, and the probability of any given configuration of the 500 coins is

2^(-I) = 1 out of 2^500
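
To make the conversion concrete, here is a minimal Python sketch of the formula above (my own illustration; the function names info_bits and probability are not from the original post or its references):

import math

def info_bits(p):
    # Shannon information (surprisal), in bits, of an outcome with probability p
    return -math.log2(p)

def probability(i_bits):
    # Probability of an outcome carrying i_bits of Shannon information
    return 2.0 ** (-i_bits)

print(info_bits(0.5))          # one fair coin flip: 1.0 bit
print(info_bits(0.5 ** 500))   # 500 fair coins, all heads: 500.0 bits
print(probability(500))        # back to 1 out of 2^500, roughly 3.05e-151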

But there are subtleties with the 500-coin example. We usually compute the Shannon entropy in terms of the heads/tails configuration, but there are other symbols to which we could apply Shannon metrics.

For example, we could look at the 4-digit year on a coin and consider the Shannon entropy associated with the year. If we use an a priori assumption that all 10,000 possible 4-digit years are equally probable, I calculate the Shannon entropy as:

log2(10^4) = 13.28771 bits

But this assumption is clearly too crude since we know not all years are equally probable!

Further, we could consider the orientation angle of a coin on a table, rounded to a whole number of degrees relative to some reference point. Each coin then has 360 possible orientation microstates, corresponding to 360 degrees. The number of bits is thus:

log2(360) = 8.49185 bits

Notice the problem here: we could choose an even finer resolution of the orientation angle, say 0.1 degree, and thus get 3600 possible microstates! In that case, the Shannon information of any given orientation is:

log2(3600) = 11.81378 bits

If, for example, I focused on all the possible configurations of the orientations (rounded to a whole number of degrees) of a system of 500 coins, I’d get:

log2(360^500) = 500 log2(360) ≈ 4246 bits
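
For readers who want to check these numbers, here is a minimal Python sketch recomputing the alternative entropies above under the same equal-probability assumptions (my own illustration; the helper name entropy_bits is not from the original post):

import math

def entropy_bits(num_equally_likely_states):
    # Shannon entropy, in bits, of a uniform distribution over the given number of states
    return math.log2(num_equally_likely_states)

print(entropy_bits(2 ** 500))    # heads/tails microstates of 500 coins: 500.0 bits
print(entropy_bits(10 ** 4))     # 4-digit year on one coin: ~13.29 bits
print(entropy_bits(360))         # orientation to the nearest degree: ~8.49 bits
print(entropy_bits(3600))        # orientation to the nearest 0.1 degree: ~11.81 bits
print(500 * entropy_bits(360))   # orientations of all 500 coins: ~4246 bits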

The point of these examples is to show there is no single Shannon entropy that describes a system. It depends on the way we choose to recognize the possible microstates of the system! 😯

If we choose the energy microstates as the microstates for calculating Shannon entropy, the result is the thermodynamic entropy of the system, expressed in Shannon bits rather than in the traditional thermodynamic units of Joules/Kelvin.

Consider a system of 500 pure copper pennies. The standard molar thermodynamic entropy of copper is 33.2 Joules/(Kelvin·mol). A pure copper penny contains about 0.0498 moles of copper, thus the thermodynamic entropy of 500 coins is roughly on the order of:

S = 500 × 33.2 J/(K·mol) × 0.0498 mol = 826.68 J/K

where S is the thermodynamic entropy.

We can relate the thermodynamic entropy S_Shannon expressed in Shannon bits to the traditionally-expressed thermodynamic entropy S_Boltzmann in Joules/Kelvin. Simply divide S_Boltzmann by Boltzmann’s constant (k_B) and further divide by ln(2) (to convert from natural-log to log-base-2 information measures):

S_Shannon = S_Boltzmann / k_B / ln(2) =

(826.68 J/K) / (1.381 × 10^-23 J/K) / 0.693147 = 8.636 × 10^25 Shannon bits
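
As a cross-check, here is a minimal Python sketch of the same conversion, using the values assumed above (33.2 J/(K·mol) for copper, 0.0498 mol per penny, and k_B ≈ 1.381 × 10^-23 J/K); the variable names are my own:

import math

K_B = 1.381e-23           # Boltzmann's constant, J/K (value used above)
LN2 = math.log(2)         # ~0.693147, converts nats to bits

molar_entropy_cu = 33.2   # standard molar entropy of copper, J/(K*mol)
mols_per_penny = 0.0498   # moles of copper in one pure copper penny
num_pennies = 500

s_boltzmann = num_pennies * molar_entropy_cu * mols_per_penny   # ~826.68 J/K
s_shannon = s_boltzmann / K_B / LN2                             # thermodynamic entropy in bits

print(s_boltzmann)   # 826.68 J/K
print(s_shannon)     # ~8.636e25 Shannon bits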

Conclusion:
Q. How much Shannon Entropy is in a system of 500 fair pure copper pennies?
A. It depends on the way you choose to recognize the microstates and the associated probability of each microstate of the system! 😯
500 bits, 4246 bits, 8.636 × 10^25 bits, etc. would all be correct answers, depending on how the observer chooses to recognize the microstates and the associated probability of each microstate!

How does this all relate to ID? I’ll let the readers decide….. 🙂

But my personal suggestion: treat information theory and thermodynamics with caution. If you can’t comfortably do calculations like those shown above, you might want to reconsider using information-theoretic and thermodynamic arguments. Expect your opponents in debate to demand that you provide similar calculations to defend your claims of ID. That’s why I’ve stressed that ID proponents use basic probability arguments, not information theory and thermodynamic arguments. For that matter, I recommend Jonathan Wells’ Humpty Dumpty argument. Keep It Simple, Soldier (KISS).

NOTES

0. There has been debate about the validity of using Boltzmann’s constant, which is usually introduced in the context of ideal monoatomic gases, to determine the degrees of freedom in solids or non-monoatomic gases. If temperature is the energy per degree of freedom, then it seems reasonable to use one constant for estimating the degrees of freedom from temperature and energy alone. A diatomic gas, for example, will have more degrees of freedom, but as far as temperature and energy are concerned, you’ll get a given number of empirically measured microstates, so it seems one can generally use Boltzmann’s constant. At least that is my reading of the literature.

1. Standard Molar Entropies
2. Mole Calculations
3. Boltzmann’s Constant
4. Nat Measure of Information
5. Creation Evolution University: Clausius, Boltzmann, Dembski

Comments
[…] – For a brief introduction to ID and information theory, go here. Bill Dembski, who founded this blog by the way, discussed Conservation of Information in Seattle (later in August at U Chicago). See also: Shannon Information, Entropy, Uncertainty in Thermodynamics and ID […]July 2014: Events that made a difference to ID | Uncommon Descent
December 30, 2014 at 10:00 AM PDT
Eric:
All it means is that when we consider a system we need to carefully define what system we are talking about. The same holds true for thermodynamic systems, biological systems, informational systems, and so on.
Hubert P. Yockey:
...the word entropy is the name of a mathematical function, nothing more. One must not ascribe meaning to the function that is not in the mathematics. - Information Theory, Evolution, and the Origin of Life. p. 29
Hubert P. Yockey:
To use Shannon entropy as a measure of uncertainty and choice, we must first say about what we are uncertain. - Information Theory, Evolution, and the Origin of Life. p. 30
I think I made that last point in a prior thread, but it seems to fit in here as well.Mung
July 27, 2014 at 05:01 PM PDT
R0bb @16: I have extreme respect for Bill Dembski, and his work has obviously been pivotal to the design paradigm, in particular his efforts to further the concept of complex specified information. For the most part I agree with the thrust of his efforts. I have noted over the years a few inconsistencies and nuances that I believed merited attention, including those that I outlined in a painfully-long and somewhat-esoteric critique I wrote about a decade ago which can be accessed here (http://www.researchgate.net/publication/26409309_Irreducible_Complexity_Reduced_An_Integrated_Approach_to_the_Complexity_Space).* It was in fact on Bill's kind personal suggestion and encouragement that I finalized my critique into a formal essay and submitted it for publication. In my assessment, in the past many people have misunderstood Bill's efforts to come up with a rigorous, mathematically-sound basis for the complexity underlying design, and have mistakenly assumed that he was attempting to quantify design per se. I have heard rumblings and rumors that he is proceeding down the path of complete quantification, but I will withhold judgment until I have had a chance to personally read his latest work, including his upcoming book. I am obviously not in any position, either by personal qualifications or position in the design movement, to demand that Bill accede to any particular viewpoint of mine. That said, if Bill is in fact departing from his earlier efforts and is claiming that design can be completely subsumed in a mathematical calculation -- some kind of algorithm whereby we can drop an artifact into a calculator that will spit out a calculated numeral definitively pointing to design or non-design -- if he is going down that path, then, yes, I will respectfully disagree with him, and vocally so. However, for now, my experience has been that Bill is more nuanced and careful than his false-caricature critics would like to believe, so I am hopeful that whatever efforts he is making toward quantifying the design inference will in fact at the end of the day be understood as only a part of the design inference process and will not, I trust, disagree with anything I have said on the topic. ----- * Incidentally, part of the thrust of my critique focused on an issue that is similar to what Sal is driving at in the OP above, namely, properly and adequately defining the system in question is critical to our assessment of its complexity, specification and, therefore, the inference to design.Eric Anderson
July 26, 2014 at 08:02 PM PDT
Eric, good points. But like you say, you've been away. It was actually Salvador who was demanding calculations, from me. Which is the reason for the sarcastic comment I made @1, the only comment of mine in the thread, by the way, that Sal has not deleted.Mung
July 26, 2014 at 03:42 PM PDT
Eric:
The problem arises when an ID proponent thinks that complex specified information can be calculated. It cannot.
It's a real problem when the ID proponent is named William Dembski and he says things like "There is a calculation to be performed. Do the calculation. Take the numbers seriously." And he provides mathematical formulas for calculating the specificity and the specified complexity, as if these things could be calculated. Hopefully you can set him straight, Eric.R0bb
July 26, 2014 at 01:43 PM PDT
But my personal suggestion: treat information theory and thermodynamics with caution. If you can’t comfortably do calculations like those shown above, you might want to reconsider using information-theoretic and thermodynamic arguments. Expect your opponents in debate to demand that you provide similar calculations to defend your claims of ID.
You make an interesting point, and certainly raise a valuable note of caution. I would argue, however, that the real issue is not that some stubborn ID opponent might demand real calculations (gasp!), but that the calculations are not the real issue. One need not be an expert in thermodynamics or information systems to be able to do a simple back-of-the-envelope calculation of the probabilities. It doesn't matter if the calculation is off by several orders of magnitude for the kinds of systems we are talking about. The point of talking about, say, the odds of a particular nucleotide sequence arising by chance is -- contrary to the silly assertion by some outspoken (and misguided) opponents like Elizabeth Liddle -- not that we must have a definitive, be-all-and-end-all calculation that everyone can agree on before we can even start to consider the design inference. No, the point of doing a basic probability calculation is to give us a sense of the scope of the problem, and to draw a line in the sand for what the probabilistic resources of the known universe can reasonably be expected to accomplish. All the nitpicking around the edges amounts to nothing more than a rounding error against the awful probabilities that beset a naturalistic creation scenario. The problem arises when an ID proponent thinks that complex specified information can be calculated. It cannot. True, at some level complexity can be calculated - via the types of Shannon and thermodynamic calculations you describe,* but not the specified aspect of CSI, which is where the real meat is. So the issue is not that we should avoid talking about calculations because someone might demand that we actually provide them, but rather that the relevant calculations are trivial and are, frankly, largely beside the point. I have yet to see a single individual who has any genuine interest in understanding ID who demands hyper-technical calculations of various microstates of nucleotides or amino acids or other constituent building blocks of life. Nor have I ever seen a single anti-ID proponent who has come to the table and walked away saying, "Gee, I just can't quite come to grips with ID. If that probability calculation had another zero on the end, then I'd be swayed. I think the odds of this particular biological system coming about via undirected natural causes is 1x10^-100, but if you can show me that it is actually 1x10^-105, well then you'll have me -- I'll agree that design is the best explanation and I'll become an ID proponent." It just doesn't happen that way. And the reason it doesn't, is that the real issues that keep people from accepting ID have nothing to do with particular calculations or mathematical proofs. Those calculations and proofs are important to provide legitimacy to what ID is talking about, but they are already far beyond what is necessary or required to cause any reasonable individual to sit up and take notice. ----- Well, that is much too long. I think I agree largely with the thrust of your point about being cautious with wading into Shannon and thermodynamic calculations. I just wish to emphasize that this is not so much because ID proponents are incapable of defending good calculations when asked for them, but because an ID proponent who thinks that her position can be proved mathematically has missed a large part of what ID is about. Design is ultimately about function and intent and purpose and creativity and communication and interaction between beings than it is about any calculation of bits or joules. 
----- * Again, these need not be hypertechnical or overly-complicated. The kinds of systems we typically look at as examples of design in living systems surpass the universal probability bound by so many orders of magnitude that the "complexity" aspect is virtually a given. It is the specification where the rubber meets the road (and that, I argue, is not a calculable entity), coupled with the question of whether purely natural processes can produce such specification via chance or via some alleged filter like natural selection.Eric Anderson
July 26, 2014 at 12:32 PM PDT
Sal, thanks for posting this. Good topic. I just flew in this morning from a trip and have been away for a while. Hopefully I can parse this a bit more in the coming days. One thing jumped out at me on a quick glance:
The point of these examples is to show there is no single Shannon entropy that describes a system. It depends on the way we choose to recognize the possible microstates of the system!
This is true, but essentially trivially so. All it means is that when we consider a system we need to carefully define what system we are talking about. The same holds true for thermodynamic systems, biological systems, informational systems, and so on. Your point is well taken, but it is essentially just a call for clarity of discussion. When someone (like Dembski or Meyer) uses a coin toss example to help people understand Shannon Information, it is unhelpful for the critic to get all hyper-technical about all the many other aspects of a coin (year it was made, weight, material, deviation from ideal roundness, etc.). Those are red herring complaints that just distract from the real issues. [Note: I'm not suggesting you have ever taken this approach, but I have seen discussions descend into angels-on-the-head-of-a-pin arguments about which, if any, specific microstates should be taken into account for Shannon purposes. It is largely beside the point.] The primary thing we need to know about so-called Shannon Information is that it really isn't meaningful for purposes of design detection, or for ID generally.Eric Anderson
July 26, 2014 at 12:04 PM PDT
Groovamos: An energy burst impinging on a target can be characterized by an entropy measure. It is not easy to find on the web how to calculate this entropy of an energy signal. Anyone want to surmise an application related to this energy signal entropy determination? Davisson: * It’s also important to realize that Shannon entropy isn’t really a property of a particular signal, it’s a property of the probability distribution that governed its choice. Very little congruence there. I wonder why.groovamos
July 25, 2014 at 02:05 PM PDT
Yes, he’s my grandfather (and Owen Willans Richardson my great-uncle).
WOW!!!!!!!!!!!!!!!!!!!!!!!! The world of science is a better place for your grandfather and great-uncle's work. It would not be unusual for a 1st year science student studying Chemistry 101 to hear of Davisson-Germer! For the reader's benefit (from wiki):
Clinton Joseph Davisson (October 22, 1881 – February 1, 1958), was an American physicist who won the 1937 Nobel Prize in Physics for his discovery of electron diffraction in the famous Davisson-Germer experiment.
The Davisson–Germer experiment was a physics experiment conducted by American physicists Clinton Davisson and Lester Germer in the years 1923-1927,[1] which confirmed the de Broglie hypothesis. This hypothesis advanced by Louis de Broglie in 1924 says that particles of matter such as electrons have wave like properties. The experiment not only played a major role in verifying the de Broglie hypothesis and demonstrated the wave-particle duality, but also was an important historical development in the establishment of quantum mechanics and of the Schrödinger equation.
and of this Knighted scientist:
Sir Owen Willans Richardson, FRS (26 April 1879 – 15 February 1959) was a British physicist who won the Nobel Prize in Physics in 1928 for his work on thermionic emission, which led to Richardson's Law.[1]
Following J. J. Thomson's identification of the electron in 1897, the British physicist Owen Willans Richardson began work on the topic that he later called "thermionic emission". He received a Nobel Prize in Physics in 1928 "for his work on the thermionic phenomenon and especially for the discovery of the law named after him". In any solid metal, there are one or two electrons per atom that are free to move from atom to atom. This is sometimes collectively referred to as a "sea of electrons". Their velocities follow a statistical distribution, rather than being uniform, and occasionally an electron will have enough velocity to exit the metal without being pulled back in. The minimum amount of energy needed for an electron to leave a surface is called the work function. The work function is characteristic of the material and for most metals is on the order of several electronvolts. Thermionic currents can be increased by decreasing the work function. This often-desired goal can be achieved by applying various oxide coatings to the wire.
scordova
July 25, 2014 at 12:53 PM PDT
Yes, he's my grandfather (and Owen Willans Richardson my great-uncle). Unfortunately, both died before I was born, so I never got to meet them...Gordon Davisson
July 24, 2014 at 10:27 PM PDT
Gordon, I'm not aware of any current policy restriction on linking to TalkOrigins. I linked to your essay in a separate discussion here in gratitude for all of your helpful comments. https://uncommondescent.com/physics/gordon-davissons-talk-origins-post-of-the-month-october-2000/ PS A question I've been meaning to ask for a long time: Are you any relation to the Nobel Prize winner Clinton Davisson of Davisson-Germer fame? My professor James Trefil while teaching at UVa used the same office as Clinton Davisson. Trefil's desk still had the periodic table taped to it years earlier by Clinton Davisson.scordova
July 24, 2014 at 08:52 AM PDT
[…] I made the derivation in two places: Shannon Information, Entropy, Uncertainty in Thermodynamics and ID […]Gordon Davisson’s Talk Origins Post of the Month (October 2000) | Uncommon Descent
July 24, 2014 at 08:45 AM PDT
Even atheists themselves, who break ranks with the Darwinian ‘consensus’ party line, are severely castigated by Darwinian atheists. There was even a peer-reviewed paper in a philosophy journal by a materialist/atheist that sought to ostracize, and limit the free speech of, a fellow materialist/atheist (Jerry Fodor) who had had the audacity, in public, to dare to question the sufficiency of natural selection to be the true explanation for how all life on earth came to be. Darwinian Philosophy: "Darwinian Natural Selection is the Only Process that could Produce the Appearance of Purpose" - Casey Luskin - August, 2012 Excerpt: In any case, this tarring and feathering of Fodor is just the latest frustrated attempt by hardline Darwinians to discourage people from using design terminology. It’s a hopeless effort, because try as they might to impose speech codes on each another, they can’t change the fact that nature is infused with purpose, which readily lends itself to, as Rosenberg calls it “teleosemantics.” http://www.evolutionnews.org/2012/08/blind_darwinian063311.html Also see the atheistic response to Nagel's 'Mind and Cosmos', The Heretic – Who is Thomas Nagel and why are so many of his fellow academics condemning him? – March 25, 2013 http://www.weeklystandard.com/articles/heretic_707692.html These are not isolated cases of intimidation, but is a general, systematic, trend in Academia by Darwinists for censorship of opposing views (especially censorship of design views): “In the last few years I have seen a saddening progression at several institutions. I have witnessed unfair treatment upon scientists that do not accept macroevolutionary arguments and for their having signed the above-referenced statement regarding the examination of Darwinism. (Dissent from Darwinism list)(I will comment no further regarding the specifics of the actions taken upon the skeptics; I love and honor my colleagues too much for that.) I never thought that science would have evolved like this. I deeply value the academy; teaching, professing and research in the university are my privileges and joys… ” Professor James M. Tour – one of the ten most cited chemists in the world https://uncommondescent.com/intelligent-design/a-world-famous-chemist-tells-the-truth-theres-no-scientist-alive-today-who-understands-macroevolution/ EXPELLED - Starring Ben Stein - video http://www.youtube.com/watch?v=P-BDc3wu81U Slaughter of Dissidents - Book "If folks liked Ben Stein's movie "Expelled: No Intelligence Allowed," they will be blown away by "Slaughter of the Dissidents." - Russ Miller http://www.amazon.com/Slaughter-Dissidents-Dr-Jerry-Bergman/dp/0981873405 In the court system, we find the same pattern of censorship of free speech by Darwinists: On the Fundamental Difference Between Darwin-Inspired and Intelligent Design-Inspired Lawsuits - September 2011 Excerpt: *Darwin lobby litigation: In every Darwin-inspired case listed above, the Darwin lobby sought to shut down free speech, stopping people from talking about non-evolutionary views, and seeking to restrict freedom of intellectual inquiry. *ID movement litigation: Seeks to expand intellectual inquiry and free speech rights to talk about non-evolutionary views. http://www.evolutionnews.org/2011/09/on_the_fundamental_difference_050451.htmlbornagain77
July 24, 2014 at 05:35 AM PDT
Gordon Davisson you said: "I seem to recall that links to the talk.origins archives are forbidden here" Not to my knowledge. They, talk.origin cites, are just viewed with the same suspicion as say a Wikipedia citation on Intelligent Design would be. So cite away,,, As to censorship in general, instead of free intellectual inquiry, that would be the typical Darwinian modus operandi to suppress dissent: For example: An Interview with David Noble - Peer Review as Censorship by SUZAN MAZUR - 2010 Excerpt: SUZAN MAZUR: I’ve been focusing on abuse inside the peer review system in recent articles for CounterPunch. The system seems to have spiraled out of control – to the extent that at the low end we now find virtual death squads on Internet blogs out to destroy scientists who have novel theories. They pretend to be battling creationism but their real job is to censor the free flow of ideas on behalf of the science establishment. The science establishment rewards bloody deeds like these by putting the chief assassin on the cover of The Humanist magazine, for example.,, http://www.counterpunch.org/2010/02/26/peer-review-as-censorship/ The censorship against ID many times will even extend, past internet blogs and Wikipedia, down into peer review and academia itself. The following is very informative for exposing that 'systematic bias' by Darwinists within peer review: "The Problem With Peer-Review" - Casey Luskin - February 2012 - podcast http://intelligentdesign.podomatic.com/entry/2012-02-28T10_10_16-08_00 How the Scientific Consensus is Maintained – Granville Sewell (Professor of Mathematics University of Texas – El Paso) – video http://www.youtube.com/watch?v=vRLSwVRdNes Censorship Loses: Never Forget the Story of Biological Information: New Perspectives Casey Luskin - August 20, 2013 http://www.evolutionnews.org/2013/08/censorship_lose075541.html ID theorist Mike Behe was refused a response in Microbe - September 22, 2013 https://uncommondescent.com/irreducible-complexity/id-theorist-mike-behe-was-refused-a-response-in-microbe/ The Letter that Science Refused to Publish - November 8, 2013 Excerpt: Stephen Meyer sought the opportunity to reply, in the pages of Science, to UC Berkeley paleontologist Charles Marshall, who reviewed Darwin's Doubt in the same publication. Without explanation, the editors refused to publish the letter. We offer it for your interest. See more at: http://www.evolutionnews.org/2013/11/the_letter_that078871.html etc.. etc.. If silencing by intimidation, or censorship, does not work, Darwinists simple 'EXPEL' anyone who disagrees with them: EXPELLED - Starring Ben Stein - video http://www.youtube.com/watch?v=P-BDc3wu81U Slaughter of Dissidents - Book "If folks liked Ben Stein's movie "Expelled: No Intelligence Allowed," they will be blown away by "Slaughter of the Dissidents." - Russ Miller http://www.amazon.com/Slaughter-Dissidents-Dr-Jerry-Bergman/dp/0981873405bornagain77
July 24, 2014 at 05:35 AM PDT
If you don’t object to the way I did the conversion from J/K to Shannon bits, then that is reassuring.
Actually, I'm fairly comfortable going even further, and treating them as just different units for the same basic thing. From my earlier essay:
The choice of a base for the logarithm in Shannon's formula is essentially arbitrary, except that it determines the units of the result. Base 2 is traditional, because it gives the result in bits, which are the most popular units for measuring information. But they're not the only legitimate unit. If you happen to want the entropy in trits, just use base 3; for decimal digits, use base 10; for nats, use base e (natural log, just like the Boltzmann formula). And if you want the result in Joules per Kelvin you can use base e^(7.243e22), or take the result in nats and multiply by k. Either way, you'll get the same result you would've from Boltzmann's formula. (Actually, if you want me to do this units stuff properly, let me claim that information, temperature, and energy are dimensionally related: temperature = energy / information; 1 Kelvin = 9.57e-24 Joule/bit; and Boltzmann's constant is properly written k = 1.38 e-23 J/K*nat = 9.57e-24 Joule/Kelvin*bit = 1. If you allow that, I can use the same units for information and thermo-entropy without blinking.)
...but then, I'm also one of those people who think a light-year is just a year measured in an unusual (spacelike) direction, and thus that the speed of light in vacuum is 1. No units, just 1.Gordon Davisson
July 23, 2014 at 11:15 PM PDT
Excellent comments Gordon, and thank you for the feedback. I concur with your comments and corrections. If the information sent is the same as that received (true at least in terms of the coin system, where transmission errors are not a consideration between sender and receiver), then it seems the amount of information transmitted by the sender is the amount of uncertainty reduced for the observer, and in this special case would also equal the amount of Shannon entropy. I'll make that qualification in the future, and since this was intended mostly as a pedagogical essay, I'll point out that these simplifying assumptions are provided to help conceptualize the main ideas. The assumption is not unreasonable, since in everyday experience, we expect e-mails received to have the data that was actually sent, even though at the level of the physical connection, bits were lost and corrupted and had to be corrected or retransmitted to give the receiver the message intended by the sender.
* Finally, I’m not familiar with any debate about generalizing Boltzmann’s constant from ideal gasses to … pretty much anything. It works normally (at least as far as I’m aware) for calculating the residual entropy of disordered solids at absolute zero. For example carbon monoxide, where each molecule can be in either of two orientations (carbon-oxygen or oxygen-carbon), and the thermodynamic entropy comes out to approximately K_B * ln(2) * the number of molecules. (I remember hearing it’s actually a little less than that, which would indicate a weak correlation between the orientation of neighboring molecules). Similarly, Landauer’s principle indicates that the entropy cost of erasing a single bit of data (i.e. one bit of Shannon entropy) was K_B * ln(2). (And Charles Bennett’s proposed information-powered heat engine implies that the conversion is actually reversible.) All this is pretty far from the ideal gas case…
Most textbooks I'm familiar with don't even delve into the question, since in laboratory practice there doesn't seem to be much advantage in converting J/K to Shannon bits. The debate over K_B was mostly between me and a friend in a casual discussion, so I expressed the situation inaccurately in my essay. If you don't object to the way I did the conversion from J/K to Shannon bits, then that is reassuring. But if anyone has an objection to the above conversion method using Boltzmann's constant, please say so, and provide the conversion procedure you think is correct, and why. The way I did it seems to agree with the wiki article on Boltzmann's constant (you just have to do a little manipulation of the equation for S' in dimensionless units). But I'm open to hearing if there is something wrong in the way I calculated the Shannon bits.scordova
July 23, 2014 at 11:00 PM PDT
BTW, I seem to recall that links to the talk.origins archives are forbidden here, so I'll just mention that an enterprising person could find a posting of mine on this subject (titled "Information and Thermo-Entropy") there.Gordon Davisson
July 23, 2014 at 10:08 PM PDT
I mostly agree with this, but have a few comments: * I'm a little uncomfortable identifying Shannon entropy = Shannon information. Shannon's theory (and its various extensions, properly referred to as statistical information theory) defines a number of measures relating to information; entropy is only one of them. For example, in a standard communication channel, the entropy of the source is appropriately thought of as a measure of how much information is being sent, and the entropy of the received signal is the amount received. But if you wanted to know how much of the received information was the same as that transmitted, you'd use the joint information of the transmitted and received signals (essentially a measure of statistical correlation), not entropy. And then you also have conditional entropies of one signal given the other, and then there's the surprisal associated with a particular signal... I'll also note that conditional entropies are particularly useful for measuring information you don't have, i.e. uncertainty. For that reason, some people think of it as the opposite of information; I prefer to think of it (when used this way) as the opposite of knowledge -- but still information. Are information you have and information you don't have opposites, or the same thing looked at from different perspectives? Anyway, the point is that I think "Shannon information" is a somewhat ambiguous term, and should be avoided in favor of something more specific. * It's also important to realize that Shannon entropy isn't really a property of a particular signal, it's a property of the probability distribution that governed its choice. One way I've seen this put is that Shannon entropy has more to do with what it could have been, than with what it actually is. Note that in the analogy with Boltzmann-Gibbs entropy, the system's macroscopic state ("macrostate") plays the role of the probability distribution (and therefore entropy is a property of the macrostate, not the system's actual specific state -- the microstate). There is some theoretical ambiguity in how one should define macrostates, and thus in how one should calculate their entropies. Note that this is different from the ambiguity you raised (which is essentially about how fine you should slice your microstates). * Speaking of which, AIUI the microstate ambiguity you describe is actually resolved by quantum mechanics; since that defines a smallest-possible-difference-between-(basis)-states, it therefore gives an unambiguous "correct" definition of a microstate. On the other hand, it complicates the living bejabbers out of the math (you have to work with density matrices instead of "simple" probability distributions. I've... not gotten the hang of them). * Finally, I'm not familiar with any debate about generalizing Boltzmann’s constant from ideal gasses to ... pretty much anything. It works normally (at least as far as I'm aware) for calculating the residual entropy of disordered solids at absolute zero. For example carbon monoxide, where each molecule can be in either of two orientations (carbon-oxygen or oxygen-carbon), and the thermodynamic entropy comes out to approximately K_B * ln(2) * the number of molecules. (I remember hearing it's actually a little less than that, which would indicate a weak correlation between the orientation of neighboring molecules). Similarly, Landauer's principle indicates that the entropy cost of erasing a single bit of data (i.e. one bit of Shannon entropy) was K_B * ln(2). 
(And Charles Bennett's proposed information-powered heat engine implies that the conversion is actually reversible.) All this is pretty far from the ideal gas case...Gordon Davisson
July 23, 2014 at 09:58 PM PDT
An energy burst impinging on a target can be characterized by an entropy measure. It is not easy to find on the web how to calculate this entropy of an energy signal. Anyone want to surmise an application related to this energy signal entropy determination?groovamos
July 23, 2014 at 09:47 PM PDT
I can't even do a simple entropy calculation, so I am obviously not qualified to comment.Mung
July 23, 2014 at 06:48 PM PDT
