
When designed errors are the perfect design

Shannon’s legendary paper, A Mathematical Theory of Communication, unwittingly lends support to the philosophical notion that designs perfect in one dimension must of necessity be imperfect in other dimensions.

We intuitively understand that we communicate much better with someone in a quiet room than in a noisy room. Shannon’s genius was to quantify this notion by relating the maximum data transmission rate to the signal-to-noise ratio. The result of the paper was the now famous noisy-channel coding theorem.
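Shannon’s quantification can be sketched in a few lines of Python. The channel figures below (a 3 kHz voice-band channel at 30 dB and 0 dB SNR) are my illustrative assumptions, not numbers from Shannon’s paper:

```python
import math

def channel_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), with the SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# The same 3 kHz voice-band channel supports roughly ten times the
# data rate in a "quiet room" (30 dB SNR) as in a "noisy room" (0 dB SNR).
quiet = channel_capacity(3000, 30)   # about 29,900 bits/sec
noisy = channel_capacity(3000, 0)    # exactly 3,000 bits/sec
```

The quiet room supports roughly ten times the rate of the noisy one over the identical channel, which is exactly the intuition Shannon made precise.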

To make his argument, he defined measures of information relevant to communication, including the famous notion of the “bit”. The “bit” plays a central role in ID theories, but ironically it wasn’t the focus of Shannon’s legendary paper!

How does Shannon’s paper raise the philosophical notion that perfection in one dimension of necessity implies imperfection in others? Before answering, consider more accessible examples such as those I posted in The Shallowness of Bad Design Arguments:

The existence of bad design, broken design, and cruelty in the world inspires some of the strongest arguments against the Intelligent Design of life and the universe. I consider the “bad design” argument the most formidable of the anti-ID arguments put forward, but in the end it is shallow and flawed. I will attempt to turn the “bad design” argument on its head in this essay.

The “bad design” arguments have at least two major themes:

1. An Intelligent Designer like God wouldn’t make designs that are capable of breaking down

2. God (as the Intelligent Designer of Life) doesn’t exist because of all the cruelty and evil in the world

To address the first point, consider the design of computer languages like Java, C, C++, Ada, Pascal, Basic, FORTRAN, COBOL, Jovial, PL/I, Modula-2, LISP, Prolog, etc.

The designers of these languages admit the possibility of syntax and semantic errors in the uninterpreted/uncompiled source code that programmers present to a computer. Is it possible, even in principle, to implement a computer language that is both non-trivial and capable of expressing meaning while simultaneously being impervious to software developers making errors (especially semantic errors)? I’d say no. And by extension, can there be a meaningful design without the potential for breakdown? Every example of engineering is vulnerable to breakdown. So the hypothesis “An Intelligent Designer like God wouldn’t make designs that are capable of breaking down” is rooted in pure theology, not in any engineering experience. The potential for breakdown is the norm for intelligent design.
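The point about language design can be made concrete with a minimal sketch (Python is my arbitrary choice of example language): the same parser that accepts meaningful programs must necessarily be able to reject broken ones.

```python
def is_valid_source(source: str) -> bool:
    """Return True if `source` is syntactically valid Python."""
    try:
        compile(source, "<string>", "exec")
        return True
    except SyntaxError:
        return False

# A language expressive enough to carry meaning necessarily admits nonsense:
print(is_valid_source("x = 1 + 2"))   # True: well-formed program
print(is_valid_source("x = 1 +"))     # False: programmers can also write broken input
```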

Furthermore, there is a rather peculiar property of reality: it seems appreciation for what is good is made possible by the existence of what is bad. Consider the Super Bowl, in which the National Football League’s 32 teams compete for the coveted title of Super Bowl Champions (the title went to the Saints a few years back, God bless them). But would such a title have any meaning if there were no losers in the NFL? Football is an intelligently designed sport. It would be a flawed argument to say “the competitions leading to the Super Bowl are not intelligently designed because they result in losing teams,” yet the same sort of illogic is used by Darwinists to argue against ID.

How can we say an Intelligently Designed world would not admit the capacity for some to be at the losing end of a Divine Drama? We may not like it, but not liking something is not a justification for rejection of truth. I’ve often speculated the evil in this world might make meaningful the good in another world. This is not far from the thoughts of one insightful thinker who said almost 2000 years ago:

“For our light affliction, which is but for a moment, is working for us a far more exceeding and eternal weight of glory”

Paul of Tarsus
2 Cor 4:17

What Shannon realized is that the best way to achieve maximum data rates is to allow the communication channel to transmit errors! Yes, that’s right: let it transmit errors, and then correct them later. One could transmit a message very slowly across a communication channel so that it arrives with fewer mistakes than if it were transmitted quickly, but the price is that the communication is slow or the representation of the data isn’t usably compact.

The best approach for sending data through Internet wires or radio waves is to push it through the channel, allow noise to introduce errors (more errors than would occur if it were sent slowly), and then use error-correcting codes at the receiving end to fix the mistakes. In the case of DVDs and CDs, data is deliberately allowed to be recorded with a modest amount of corruption in order to gain maximum compactness. When the discs are played back, the playback machines employ Reed-Solomon error correction to cleanse the errors away. The result: perfection is gained through imperfection!
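The record-with-errors-then-cleanse idea can be illustrated with the simplest single-error-correcting code. Real CDs and DVDs use the far more powerful cross-interleaved Reed-Solomon codes; the Hamming(7,4) sketch below is my simplified stand-in for the same principle:

```python
def hamming_encode(d):
    """Encode 4 data bits as 7 bits (Hamming(7,4)); positions 1, 2, 4 are parity."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1          # cleanse the error
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming_encode(data)
codeword[5] ^= 1                         # "scratch the disc": flip one bit
assert hamming_decode(codeword) == data  # playback recovers the data exactly
```

The design deliberately tolerates corruption on the medium because a cleansing step at playback is guaranteed to undo it.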

But these facts don’t preclude shallow philosophies from being used against ID. Jerry Coyne argues that something isn’t designed because it is “imperfect”. His reasoning sounds like arguing that an airplane is not an intelligent design because it’s not as thermodynamically efficient as a train or a boat. Would he say DVDs and CDs are not intelligent designs because they are deliberately designed with errors? Would he say deliberate designs with errors are unwise?

I’ve heard Darwinist biologists effectively assert, “an all-powerful Designer would not incorporate errors or the potential for errors in his designs”, but such viewpoints are pure theological speculation, not based on math or anything in engineering experience. Needless to say, the abundance of such shallow thinking in evolutionary biology makes me for once agree with Coyne:

In science’s pecking order, evolutionary biology lurks somewhere near the bottom, far closer to phrenology than to physics

And in science and technology’s pecking order, evolutionary biology is far closer to phrenology than to engineering and to Shannon’s legendary paper, which showed that designs perfect in one dimension must of necessity be imperfect in others.

NOTES:

1. “When designed errors are the perfect design” seems to echo Paul’s words and various other Christian themes as to why perfect Design could still be consistent with God allowing sin (errors) into the world and later correcting it (through Christ’s atonement), letting “momentary light affliction” build “a weight of glory.”

2. Thanks again to Denyse for encouraging me to post more frequently for one week.


16 Responses to When designed errors are the perfect design

  1. Thanks Sal. Worth thinking about.

  2. The Shannon-Hartley theorem is stated here in its simplest form, and there are some assumptions in it, such as that the noise spectral density is flat over the bandwidth of the channel. There is nothing about “maximum bandwidth” in this discussion. The discussion is all about determining the maximum information transmission rate, measured in either bits/sec or symbols/sec. The foregoing assumes a FIXED bandwidth and a given maximum power output at the source, and is concerned with the conditions required to achieve this maximum information rate. The Shannon-Hartley theorem can take more advanced forms to account for non-uniform power spectral density of both signal and noise over a band of interest.

    I posted more about the wild attributes of maximum information rates at #7 on the following thread: http://www.uncommondescent.com.....formation/

  3. Here is a link to the first paper Shannon wrote, which indicates the revolutionary nature of what he was thinking. He laid out the most fundamental parts that appeared in the famous paper referenced in the OP. This paper is almost as famous and, because of the war, was not published until after the more famous one. It is much shorter, and if you want to get up to speed fast it might be best to read this one first: http://nms.csail.mit.edu/spinal/shannonpaper.pdf

    I should mention that Shannon may not be the best example of how engineers constantly deal with trade-offs in designs. There are much simpler examples to focus on, such as: stronger structural members in vehicles add weight and reduce fuel efficiency. Invoking Shannon in this debate is like invoking Freud when reasoning with kids.

  4. I pointed out groovamos’s errors in basic math here:

    http://www.uncommondescent.com.....ent-427566

    Hey Groovamos,

    Are you still having problems understanding that the symbolic phrase

    CdT

    includes the exact differential dT? You’re making it seem as if you have little comprehension of basic math notation.

    Do you not understand that CdT means the heat capacity multiplied by the exact differential dT? :-)

    If I inserted an extra dT as you suggested, then the equation would be wrong. Are you going to confess your error, or will I have to keep repeating it until you recant and admit you’re the one with analytical problems, since apparently you can’t discern that “CdT” means “C” multiplied by “dT”?

    Can you explain to the UD readers how you made such an error? How were you proposing to amend the equation? Were you going to suggest using CdTdT? :-)

    Oh I know, you might be thinking of the Java naming convention where different words are delimited by capitalization….yeah right.

    Unfortunately for you, I pointed out that C is heat capacity, so you have no excuse to say you read it as “Cd” multiplied by “T”. Not to mention that this equation can be found in thermodynamics texts, and not to mention that it was sourced from the linked MIT website.

    So are you going to fess up to your egregious error, or will I have to keep reminding readers about it? :-)

    I mean, it’s unbelievable that you couldn’t notice that the integrand CdT/T has the term “dT” right there to the right of “C”. How could you miss it? Do you not understand basic first-semester calculus?

    How can you claim you didn’t see it, when you supposedly scrutinized the equation closely enough to make the erroneous claim that “dT” wasn’t in the equation when it clearly was?

    How can you say that the symbolic phrase “CdT” does not include “dT”? Do you not understand basic math notation? How can you say you didn’t see the “dT” following the C? That’s awfully hard to miss. Are you feeling OK? Maybe you shouldn’t be driving if your vision has such problems of omission that you can’t see the “dT” only millimeters to the right of “C” in the phrase “CdT”!

    Hey Groovamos,

    Have you been able to figure out where that “dT” is in the first equation? You know, the one you claim is missing but is still right there under your nose. This ain’t like the hunt for the God Particle, you know. You just need the ability to see. If you don’t have that, then you shouldn’t be driving.

    Acknowledge, please. :-)

    and at comment 43 (scordova, August 9, 2012 at 8:25 am):

    Hey Groovamos,

    Have you been able to figure out where that “dT” is in the first equation?

    Acknowledge, please.

    Still having problems admitting your mistakes, eh? You might try to get your math right before attributing your errors to me.

  5. The “bad design” arguments have at least two major themes:

    1. An Intelligent Designer like God wouldn’t make designs that are capable of breaking down

    2. God (as the Intelligent Designer of Life) doesn’t exist because of all the cruelty and evil in the world

    This is part of Hume’s theodicy argument, but it is a useless argument against ID. It is only an argument against the Judeo-Christian God, who is supposed to be all-powerful and all-good. So this argument is not against ID but against a certain conception of God.

    So whoever uses this argument is trying to undermine the Judeo-Christian God, not ID, which doesn’t presuppose this type of God. It is a non sequitur.

    From a different perspective, all organisms must be defective in some way, or else the ecology that supports them could not survive. So a good design is one that puts limitations on all the elements of an ecology; what is bad design from one perspective is actually good design from another.

    The third part of the theodicy argument is that evil exists, which is assumed to be true; and if true, then one of the other two concepts cannot be true. As a personal point of view, I do not believe most of what we see in this world is really evil. There are a lot of unpleasant things, but not really evil. I used to pose a question here frequently: “What is evil?” No one really answered it.

  6. With respect to the second “bad design” argument (which influenced Darwin to an extent): selfishness is a key factor in the cruelty in the world today. The seeds sown decades ago by the so-called me generation have produced a society in which the majority are concerned primarily with themselves. Many will do whatever it takes to get their own way, often resulting in cruel acts. This is true not only of individuals but also of entire nations.

    The lives of fellow humans no longer seem to matter. Some people even enjoy being cruel. They find it entertaining, much like criminals who confess that they harm others just for the thrill of it. And what about the millions whose preference for films featuring violence and cruelty encourages the motion picture industry to cash in on such themes? Constant exposure to brutal acts through entertainment and the news media desensitizes many.

    While a religious person would attribute cruelty to the Devil (1 John 5:19), it’s worth pointing out that much of the cruelty today has its roots in humanity. And any behavior that is learned (see the example of media desensitization above) can be unlearned.

  7. Or, supposed “mistakes” of design have often turned out to be advantageous to an organism. It is only the hubris of man that presumes error before understanding all of the facts.

    The biggest irony of all is that such an argument against design is also an argument against Darwinian evolution. After all, isn’t the magical process supposed to weed out the bad? If it can turn a cow into a whale, can’t it reroute the pharyngeal nerve?

  8. Hey bud, I contributed a link to your thread; do I get a thank-you for that if you get an apology for my middle-aged eyes not being quick enough? Is it a deal? Are you interested in a compressed treatment of statistical communications like the one I linked today? Seems like the bitterness just won’t let go.

  9. Groovamos has posted drivel at UD before:
    http://www.uncommondescent.com.....is-drivel/

  10. Hey groovamos, “bud”, you’ve spewed some uncharitable snark on my threads in the past, and you’re getting a taste of it.

    Data rate is the accurate term, but colloquially (though formally incorrectly) “bandwidth” is understood as data rate or capacity. The importance of Shannon is that he demonstrates the necessity of errors plus corrections.

    Colloquial usage is nothing like the error you made and never recanted nor admitted. Worse, you faulted me for something untrue. I gave you plenty of time to retract. You didn’t.

  11. To prevent further issues, I changed the phrase “bandwidth” (the colloquial notion) to “data rates”. “Bandwidth” probably got into popular culture because increasing bandwidth is one of several avenues to increase data rates; the notion of increasing bandwidth to increase data rates became so deeply entrenched that the two terms became erroneously synonymous.

    Here is Wiki statement of Shannon’s theorem:

    Stated by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. Shannon’s theorem has wide-ranging applications in both communications and data storage.
    ….
    The Shannon theorem states that given a noisy channel with channel capacity C and information transmitted at a rate R, then if R < C there exist codes that allow the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly without error at any rate below a limiting rate, C.

    That means that as long as the actual data rate R is less than the theoretical limit C, one can construct an error-correction mechanism to cleanse out the errors due to noise.

    Early modems were very slow because the modulation schemes and error correction mechanisms weren’t very sophisticated. Once the error correction got more sophisticated, more errors could be corrected, and hence more data pushed through a channel or more data stored on a memory device. The more errors the system can tolerate and correct, the more data can be pushed through.

  12. The importance of Shannon is that he demonstrates the necessity of errors plus corrections.

    Early modems were very slow because the modulation schemes and error correction mechanisms weren’t very sophisticated.

    OK, here we go (I still don’t know what in this thread got the OP angry at me): error-correction codes are almost a side issue in Shannon’s work. The Reed-Solomon cross-interleave code was invented in the ’70s and is still very popular. So no, Shannon’s work did not show the necessity of errors + correction. Errors are a given; error correction is a requirement and a basis for a system parameter.

    Also, modulation schemes have not changed very much since the ’70s; QAM remains the most popular type after several decades. One application that does not use QAM is Ethernet.

    Going back to the dial-up modem example: the 2400 bps modems in the early nineties were not slower because of unsophisticated modulation or unsophisticated error correction. (Dial-up uses QAM, and DSL uses a highly “smart” segmented QAM.)

    I don’t think the OP was interested in the link I provided above to my post on statistical communications several days ago. If the OP had followed the link and read what it was about, he would have seen that the big breakthrough was the use of randomizing codes: e.g. trellis codes (which got us to 14.4 kbps) and finally turbo codes, which RANDOMIZE THE BIT STREAM. This is the core of what Shannon’s work showed had to be done to optimize throughput at the Shannon limit. There was no big breakthrough in modulation or error correction that got us there. The OP would have read that such a randomized bit stream has the effect of ‘massaging’ the modulated signal into bandlimited ‘white noise’, as would be shown on a spectrum analyzer over the signal bandwidth. The higher the data rate, the wider the bandwidth of the ‘white noise’ signal and of the required channel. In the above, the reference to ‘massage’ was intentional, lest someone think I meant ‘message’.
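    The “randomize the bit stream” point can be illustrated with a simple additive LFSR scrambler. The polynomial (x^7 + x^4 + 1) and the all-ones seed below are my illustrative choices, not what trellis or turbo codes actually use, but the whitening effect is the same:

```python
def scramble(bits, state=0b1111111):
    """Additive scrambler: XOR the data with the output of a 7-bit LFSR
    (taps for x^7 + x^4 + 1), whitening long monotone runs."""
    out = []
    for b in bits:
        fb = ((state >> 6) ^ (state >> 3)) & 1   # feedback from bits 7 and 4
        out.append(b ^ fb)
        state = ((state << 1) | fb) & 0x7F
    return out

def descramble(bits, state=0b1111111):
    # XORing with the same pseudo-random sequence undoes the scrambling.
    return scramble(bits, state)

monotone = [0] * 32                       # worst case: a long run of zeros
whitened = scramble(monotone)
assert descramble(whitened) == monotone   # perfectly reversible
assert 0 < sum(whitened) < 32             # no longer a monotone run
```

    The receiver runs the identical LFSR, so the whitening costs no extra bits on the wire.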

    And so we get to the core issue. Shannon’s work was not concerned with engineering trade-offs so much as with optimization: establishing the universal upper limit of information throughput for a given set of system parameters. How the engineer chooses the parameters to deal with the trade-offs was of little concern to Shannon, which is why I posted to the thread in the first place.

    Now if anyone wants another link to a quick treatment of the core of Shannon’s work written by another author in a few pages here it is: http://www.posterwall.com/blog.....1318295130

  13. Colloquial usage is nothing like the error you made and never recanted nor admitted.

    This is not about colloquial usage. Since all modern networks run near the Shannon limit, bandwidth can be a rough benchmark for data throughput, and it is a term readily used by engineers. The reason it is “rough” is that without specification of signal power and noise power (which is variable and not always controllable), it cannot substitute for actual data rate. Did the OP search Shannon’s famous papers for the term “maximum bandwidth”? What is “maximum bandwidth”? To me it’s kind of like the term ‘maximum square footage’.

    Now as far as the “error (I) made” a year ago, that should be obvious, and apparently the OP needs kudos for catching me on it. But he needs more than that: he needs to make sure you know I post drivel. In fact he started a new thread, “Even after a year, groovamos has not recanted his drivel”:

    http://www.uncommondescent.com.....ent-461448

    Now this thread is not visible from the UD home page. I wonder why not. We have a real meeting of minds on these two threads. I just posted there again on the accusations. Gently enough, I hope.

  14. So no, Shannon’s work did not show the necessity of errors + correction. Errors are a given, error correction is a requirement and a basis for a system parameter.

    I beg to differ. Shannon’s work showed there EXISTS, in principle, a coding scheme that can correct errors at any data rate up to the Shannon limit; hence a consequence of the theorem is that if you want faster communication, you’ll have to accept more errors plus more correction. If speed is the dimension of more perfect communication, then one should be willing to accept more errors in the data stream and to add more correction at the receiving end; a consequence of Shannon’s theorem is that this is a good route to go, because there exists a coding mechanism to cleanse out the errors. The challenge is that the theorem doesn’t tell you how to find it. This contrasts with the notion of “getting it right the first time”.
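    This consequence can be sketched with the crudest code imaginable, a 3x repetition code. It is nothing like the sophisticated codes whose existence the theorem guarantees, but it shows the accept-errors-then-cleanse pattern directly:

```python
def rep3_encode(bits):
    """Send every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def rep3_decode(bits):
    """Majority vote over each triple corrects any single flip per triple."""
    out = []
    for i in range(0, len(bits), 3):
        triple = bits[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

message = [1, 0, 1, 1, 0]
sent = rep3_encode(message)
sent[1] ^= 1                          # channel noise flips a bit...
sent[9] ^= 1                          # ...and another one
assert rep3_decode(sent) == message   # the receiver cleanses both errors
```

    Of course repetition wastes two-thirds of the channel; the theorem’s point is that far cleverer codes exist that approach capacity while still cleansing errors.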

    What I meant by “Shannon demonstrated” might be better phrased as “a consequence of Shannon’s theorem”. But you seem incapable of trying to read the sense of what I wrote; even when I’m right, you find ways to misread it, as with “CdT”.

  15. Also, modulation schemes have not changed very much since the ’70s; QAM remains the most popular type after several decades. One application that does not use QAM is Ethernet.

    Going back to the dial-up modem example: the 2400 bps modems in the early nineties were not slower because of unsophisticated modulation or unsophisticated error correction. (Dial-up uses QAM, and DSL uses a highly “smart” segmented QAM.)

    But digital data communication goes farther back than the ’90s, to Morse code and baseband antipodal modulation. If you consider those, you’ll understand the point I was making, versus just focusing on the ’90s and beyond. Those were the problems Shannon was trying to solve, because it was an open question how fast you can push data through a channel, and it is a good thing his theorem demonstrates that in principle a coding scheme is possible. Hence if speed is the criterion for perfection, you must accept more errors. If you didn’t have to accept more errors to gain speed or compactness, there would be no need for ever more sophisticated error-correction schemes.

    The consequence of Shannon’s work:
    accept more errors, if you can find the coding scheme that his theorem predicts in principle must exist

  16. The first official modem is listed as this one:
    http://inventors.about.com/lib.....lmodem.htm

    It used Frequency Shift Keying. They did not have the computational firepower and VLSI back then to implement more sophisticated modulation schemes.
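    FSK is simple enough to sketch directly. The mark/space tones below are the Bell 103 originate-side pair (1270/1070 Hz); the sample rate and the correlation-based demodulator are my illustrative assumptions:

```python
import math

FS = 8000                  # samples/sec (assumed, for illustration)
BAUD = 300                 # Bell 103 signaling rate
MARK, SPACE = 1270, 1070   # originate-side tone frequencies, Hz

def fsk_modulate(bits):
    """Emit one sine-tone burst per bit: MARK for 1, SPACE for 0."""
    spb = FS // BAUD   # samples per bit
    samples = []
    for i, b in enumerate(bits):
        f = MARK if b else SPACE
        for n in range(spb):
            t = (i * spb + n) / FS
            samples.append(math.sin(2 * math.pi * f * t))
    return samples

def fsk_demodulate(samples):
    """Non-coherent detection: compare each bit slot's energy at the two tones."""
    spb = FS // BAUD
    bits = []
    for i in range(0, len(samples), spb):
        chunk = samples[i:i + spb]
        def energy(f):
            re = sum(s * math.cos(2 * math.pi * f * (i + n) / FS) for n, s in enumerate(chunk))
            im = sum(s * math.sin(2 * math.pi * f * (i + n) / FS) for n, s in enumerate(chunk))
            return re * re + im * im
        bits.append(1 if energy(MARK) > energy(SPACE) else 0)
    return bits

bits = [1, 0, 1, 1, 0, 0, 1]
assert fsk_demodulate(fsk_modulate(bits)) == bits
```

    Each bit gets its own tone burst, one bit per symbol; that, rather than any lack of error correction alone, is a long way from QAM packing many bits into every symbol.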

    In 1962, the first commercial modem was manufactured – the Bell 103 by AT&T. The Bell 103 was also the first modem with full-duplex transmission, frequency-shift keying or FSK, and had a speed of 300 bits per second or 300 bauds.

    Modems were slow because of the lack of ability to implement sophisticated modulation, demodulation, and error correction, contrary to what groovamos suggested by focusing on the relatively recent past. It’s irritating that I have to point this out to defend my points.
