
Why Scientists Should NOT Dismiss Intelligent Design

MSNBC has an article titled “Why Scientists Dismiss ‘Intelligent Design’” (go here). In it, Ken Miller argues against ID, and specifically against my claim that undirected natural causes cannot generate specified complexity in biological systems. His argument focuses on the evolution of nylonase:

The nylon problem

There is a way to settle this, however, because like Behe’s irreducible complexity, the concept of specified complexity can also be tested.

“If Dembski were right, then a new gene with new information conferring a brand new function on an organism could never come into existence without a designer because a new function requires complex specified information,” Miller said.

In 1975, Japanese scientists reported the discovery of bacteria that could break down nylon, the material used to make pantyhose and parachutes. Bacteria are known to ingest all sorts of things, everything from crude oil to sulfur, so the discovery of one that could eat nylon would not have been very remarkable if not for one small detail: nylon is synthetic; it didn’t exist anywhere in nature until 1935, when it was invented by an organic chemist at the chemical company DuPont.

The discovery of nylon-eating bacteria poses a problem for ID proponents. Where did the CSI for nylonase—the actual protein that the bacteria use to break down the nylon—come from?

There are three possibilities:

  1. The nylonase gene was present in the bacterial genome all along.

  2. The CSI for nylonase was inserted into the bacteria by a Supreme Being.

  3. The ability to digest nylon arose spontaneously as a result of mutation. Because it allowed the bacteria to take advantage of a new resource, the ability stuck and was eventually passed on to future generations.

Apart from simply being the most reasonable explanation, there are two other reasons that most scientists prefer the last option, which is an example of Darwinian natural selection.

First, hauling around a nylonase gene before the invention of nylon is at best useless to the bacteria; at worst, it could be harmful or lethal. Secondly, the nylonase enzyme is less efficient than the precursor protein it’s believed to have developed from. Thus, if nylonase really was designed by a Supreme Being, it wasn’t done very intelligently.

The problem with this argument is that Miller fails to show that the construction/evolution of nylonase from its precursor actually requires CSI at all. As I develop the concept, CSI requires a certain threshold of complexity to be achieved (500 bits, as I argue in my book No Free Lunch). It’s not at all clear that this threshold is achieved here (certainly Miller doesn’t compute the relevant numbers). Nor is it clear that anything like pure neo-Darwinism was operating in the evolution of nylonase. Instead, we see something much more like what James Shapiro describes as “natural genetic engineering” (go here). And how do systems that do their own genetic engineering arise? According to Shapiro, Darwinism (whether neo or otherwise) offers no insight here.

Let’s look at nylonase a bit more closely. Nylonase appears to have arisen from a frame-shift in another protein. Even so, it seems to be special in certain ways. For example, the DNA sequence that got frame-shifted is a very repetitive sequence. Yet the number of bases repeated is not a multiple of 3 (in this case, 10 bases are probably the repeating unit).

What this means is that the original gene consisted of repeats of these 10 bases, and since 10 is not a multiple of 3, the repeat was translated in all three possible reading frames (the second repeat was offset by one base for translation relative to the first repeat, the next was offset by one more base, and so on). Moreover, none of those reading frames gave rise to stop codons. Since the 10-base repeat was translatable in any reading frame without producing any stop codons, the sequence was able to undergo an insertion that could alter the reading frame without prematurely terminating the protein.

Actually, the mutation did cause a stop codon; but the stop codon was due not to the frame shift but to the sequence introduced by the inserted nucleotide. Simultaneously, the mutation introduced a start codon in a different reading frame, which now encoded an entirely new sequence of amino acids. This is the key aspect of the sequence: it could tolerate any frame shift, owing to the repetitive nature of the original DNA sequence. Normally in biology, a frame shift causes a stop codon and either truncation of the protein (due to the premature stop codon) or destruction of the aberrant mRNA by the nonsense-mediated decay pathway. In this case, however, the reading frame that gave rise to nylonase contained no stop codons, so it was able to encode a novel, functional protein.
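The frameshift-tolerance property just described can be illustrated with a short sketch. The sequences below are invented for the example (they are not the actual nylonase gene); the point is only that because 10 is not a multiple of 3, successive copies of a 10-base repeat are read in all three frames, so a repeat unit free of stop codons in every frame survives any frameshift without truncating the protein:

```python
# Toy illustration of frameshift tolerance in a repetitive sequence.
# The sequences are invented for the example; they are NOT the real
# nylonase gene. Stop codons in the standard genetic code:
STOP_CODONS = {"TAA", "TAG", "TGA"}

def frames_with_stops(seq):
    """Return the set of reading-frame offsets (0, 1, 2) in which `seq`
    contains at least one stop codon."""
    hits = set()
    for offset in range(3):
        for i in range(offset, len(seq) - 2, 3):
            if seq[i:i + 3] in STOP_CODONS:
                hits.add(offset)
                break
    return hits

# A made-up 10-base unit containing no T cannot form a stop codon in any
# frame, so an arbitrarily long repeat of it stays open under any frameshift:
tolerant = "GCCGCAGCAG" * 6
print(frames_with_stops(tolerant))   # set(): no frame is interrupted

# A typical unit containing a stop codon: because 10 mod 3 == 1, the
# repeat drifts through all three frames, and every frame hits a stop.
fragile = "GCTTAAGCAG" * 6
print(frames_with_stops(fragile))    # {0, 1, 2}: any frameshift truncates
```

This is why the repetitive host sequence is unusual: for a typical gene, as with the `fragile` sequence above, a frameshift quickly runs into a stop codon in the new frame.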

Most proteins cannot do this. For instance, most genes in the nematode have stop codons if they are frame-shifted. This special repetitive nature of protein-coding DNA sequences seems really rare; one biologist with whom I’ve discussed the matter has never seen another example like it. Maybe it’s more common in bacteria. Thus, contrary to Miller, the nylonase enzyme seems “pre-designed” in the sense that the original DNA sequence was preadapted for frame-shift mutations to occur without destroying the protein-coding potential of the original gene. Indeed, this protein sequence seems designed to be specifically adaptable to novel functions.

There is something very special about the nylonase host gene that isn’t true of most genes and that gives it much greater evolvability. As an aside, the function of the original gene (before it mutated into a nylonase gene) appears to be unknown (I’d be grateful for any insight here). The original paper suggested that the host gene was unlikely to encode a functional enzyme, on account of its lacking the amino acids normally found in active enzymes, so maybe it played some structural role that was not critical for the cell (no mention was made of whether the host gene was a duplicate).

Here is a reference to the original paper: “Birth of a unique enzyme from an alternative reading frame of the pre-existed, internally repetitious coding sequence”, Susumu Ohno, Proc. Natl. Acad. Sci. USA, Vol. 81, pp. 2421-2425, April 1984.




27 Responses to Why Scientists Should NOT Dismiss Intelligent Design

  1. If you haven’t already, please send your rebuttal to both Miller and MSNBC’s editor.

    They need to be made aware of this, in the event that they do not peruse your site.

  2. more ignorance from one side of the issue. bill isn’t a “scientist” according to MSNBC. his ideas would be “the death of science”.

    and this sentence that is dripping with elitist arrogance:

    “Or it could be sheer explanatory power, which was what allowed evolution to become a widely accepted theory with no serious detractors among reputable scientists.”

    so bill isn’t a “reputable scientist” and his ideas and the ideas of others who share similar views aren’t “serious”- heck, bill and others like him aren’t “serious” themselves.

    i’m so sick and tired of hearing that those who question something like this aren’t really true scientists and they’re not “reputable” and such. it makes you sick to read this every day, as if the msnbc.com writer is the end-all and be-all of truth.

  3. i also like this part:

    “Yet no true examples of irreducible complexity have ever been found. The concept is rejected by the majority of the scientific community.”

    so, that’s that…this writer proclaims that no true example of IC has EVER been found. and how do we know that? well, because “the majority of the scientific community” rejects it! and why do they reject it? could it be that evolution from one form to another has become a dogma among most scientists? surely not, considering the writer proclaims that it’s fact that no example has “ever been found,” and it must end there. then, cue ken miller proclaiming behe wrong (no defense is offered of behe’s ideas from those who support that his ideas AREN’T wrong). and they say that neo-darwinists are open minded and interested in the truth! (ha!)

    and bill, according to the article (as i’m sure you saw) it says DI is a “christian think tank”- is that true? i’ve never heard anyone label it this way before, and i don’t think it’s actually the case- many christians are fellows, i know, but some of the DI fellows aren’t religious at all from what i’ve read. could you shed some light on this aspect?

  4. Their tactic is constant repetitive misrepresentation. If they keep repeating ad infinitum the same lies, maybe the public will eventually buy into it.

  5. i just can’t stop wondering how in the world the majority of scientists got to this point where they seem to not want to follow the evidence. it’s almost as if ken miller and others like him would attack ID or anything else that questions his views no matter what evidence was involved. it’s like a knee jerk reaction today- someone questions bio. evo. and you go on the attack and misrepresent them and their ideas, paint them as anti-science, etc. these are the same scientists that welcome views on wormholes, time travel, alien life in space, and other such ideas. intelligent design and purpose in the universe- NO. wormholes and time travel- let’s look into it and welcome the ideas!

    what has happened to science in general?

  6. I think there are numerous reasons (all of which stem from a root cause: human pride). You have people like Ken Miller who, I believe, desperately wants to gain the approval of his peers and gurus. And then you have people like Dawkins who worship at the altar of secular humanism and whose pride simply won’t allow for the possibility of Divinity and who has bought into a modern western prejudice against the supernatural / metaphysical. Some have associated their personal value and identity with Darwinism for so long that they would be absolutely shattered if their “religion” was debunked. Hence the emotion we see when the address this issue.

  7. the = they.

  8. The Ken Miller rant was just boring. It reminded me of a liberal whining over and over again about “paying your fair share”. The more this response gets recycled, the more numb people will get to it…


  9. kind of related to the topic of bias: science and its major dogmatic thinking that all things must be natural in origin and anything outside isn’t true science (because we said so!), etc.

    Michael Crichton:

    “I regard consensus science as an extremely pernicious development that ought to be stopped cold in its tracks. Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled. Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you’re being had.
    Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus.
    There is no such thing as consensus science. If it’s consensus, it isn’t science. If it’s science, it isn’t consensus. Period.”

  10. Dr. Dembski, I have a serious question (actually a series of questions). I’m not trying to attack you or catch you in some trap or something–I very honestly want to know the answer to this.

    Is the 500 bits threshold derived, or is it arbitrary?

    If it is derived, what is the source for that figure? Is there a statistical significance, such as a gap in the distribution, around the 500-bit mark?

    If it is arbitrary, is an arbitrary value for a CSI threshold strong enough to support a tenable distinction? Why or why not?

    [Read my paper titled "Specification: The Pattern That Signifies Intelligence" at http://www.designinference.com. It addresses your question. --WmAD]

  11. earthenvesselmz wrote: “Is the 500 bits threshold derived, or is it arbitrary?”

    I want to know that too. I’ve read a bit of Dembski, mostly online material, but I don’t remember that limit; then again, I haven’t read “No Free Lunch” yet.

    Also, looking at the James Shapiro “natural genetic engineering” site — the first thing that came to my mind was a “Turing Machine.” (need a link for that?)


    It looks like Darwinism isn’t your real metaphysical foe any more Dr. Dembski, but rather “Turingism” is. If you want to do battle with “metaphysical naturalism,” then your real problem is the naturalization of intelligence.

    By the way, I’m a Turingist and a Darwinist.

    [Even Turing does not allow for a full naturalization of intelligence, since the programs that a Universal Turing Machine runs are not explainable within the Turing formalism. But you are right that the naturalization of intelligence is my main target. --WmAD]

  12. If anyone’s interested, I have my own discussion of Nylonase here:


    Seems to be a case of natural genetic engineering to me, which is not a Darwinian process. Producing a change that big in that short a time frame would require a mutation rate so large that the organism would quickly fall to error catastrophe. Instead, it seems that the response is generated from the environment itself, and that the organism has manufactured a new gene specifically for this purpose. Neo-Darwinists make a big hoopla about it being non-deterministic, but that’s really irrelevant. It still must be a directed process.

  13. i’ve read two articles in the past month where they did studies that showed bacteria that gain resistance to antibiotics already have the info. preexisting in their genes. they’ve actually found ancient bacteria that they’ve been able to study and found that these strains have the same resistance genes, and they HAD to have been preexisting because these bacteria were millions of yrs old (according to the scientists who studied them), and it’s not as if they had antibiotics millions of yrs ago!

    not familiar with all the science, but when i first heard this nylon bug thing, that’s the first thing that came to my mind. it’s merely adaptation to a new thing in the environment, which is hardly the same as evolution from one form to another. plus, if this was evolution in that sense, wouldn’t it worry neo-darwinists that it happened so suddenly? i thought change was gradual…new traits (being able to digest nylon) would have taken more time to come about, no? mutations, even in populations with very high rates, wouldn’t change things that fast it seems!

  14. There is an interesting admission here:

    “There is a way to settle this, however, because like Behe’s irreducible complexity, the concept of specified complexity can also be tested.”

    Miller is clear that ID, whether expounded by Behe or Dembski, is a testable scientific hypothesis. Since much of the ‘scientific community’ that rejects ID, of which Miller speaks, does so on the opposite basis, arguing that ID is unscientific because it is unfalsifiable and untestable, it is clear that they are contradicting each other. Many of the same scientists who argue that ID is untestable will quote Miller in their ‘refutations’.

    It seems to me that there is a strange logic here – any argument is scientific, even contradictory ones, so long as they denounce ID. Darwinism wants to have its cake and eat it. A negative argument perhaps, but as a non-biologist, one of the factors that has pushed me towards ID has been the poverty of some (if not most) of the arguments ranged against it. There is an overwhelming sense that ‘anything goes’, which surely is the hallmark of dogma.

    Meanwhile, I think Miller is actually doing ID an unintentional service, by helping establish ID as a legitimate science and by proposing the kind of ways in which it might be ‘tested’. Whether he is right or wrong can be debated (and thankfully for ID, he appears to have a knack for choosing shaky ground upon which to fight), but the debate is (finally) about the science.


  15. Hey Ben, evidently Miller is arguing with himself. Here’s a quote from an AP article today (“he” refers to Miller):

    On the other hand, he said, “Intelligent design is not a testable theory in any sense and as such it is not accepted by the scientific community.”

  16. For those too lazy to read Dembski’s work… ;)

    “Is the 500 bits threshold derived, or is it arbitrary?”

    The 500 information bits he mentioned were derived from his Universal Probability Bound of 10^-150 using:

    Information(Event) = -log2 Probability(Event), or I(E) = -log2 P(E)

    If I remember correctly, his UPB is based upon the maximum possible number of physical events in the universe (number of particles, duration of the universe, etc.). An event whose probability falls below the UPB is considered by statisticians to be one in which chance is precluded. In fact, the mathematician Borel actually suggested a UPB of 10^-50, so Dembski is even “nicer” to naturalistic philosophers, since his UPB of 10^-150 gives them more wiggle room.

    I’m sure he’ll correct me if I’m wrong…

    Anyway, Miller must not have read Dembski’s work carefully, since if he had he’d realize that CSI can only be applied when the probability of the event falls below the Universal Probability Bound (UPB). In short, Miller’s example is good evidence for micro-evolution, but it does not meet the specifications for CSI.
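The arithmetic in the preceding comment can be checked directly; a minimal sketch, assuming the universal probability bound of 10^-150 the comment cites:

```python
import math

UPB = 1e-150           # universal probability bound cited above (10^-150)

# I(E) = -log2 P(E): information, in bits, of an event at the bound.
# Equivalently, 150 * log2(10) ≈ 150 * 3.32.
bits = -math.log2(UPB)
print(round(bits, 1))  # 498.3 bits, conventionally rounded to 500
```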

  17. Out of curiosity, what IS the probability for the event where the bacterial nylonase enzyme evolved?

  18. Gumpngreen:



    “Thus, if only 6 of these 47 mutations were essential for the evolution, the probability of achieving it in 30 years is about 3 x 10^-35. So, if the evolution could not be random, then it would have to be nonrandom, and as I have suggested in my book, they would be triggered by the environment. That is, the capability is built into the bacterium and the environment triggers the mutations.”

    That’s the probability for it occurring in 30 years. If I recall correctly, this occurred in 9 DAYS! Another article which may be of interest to you is this one:


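Reading the quoted figure as a probability of 3 x 10^-35 (a probability cannot exceed 1, so the exponent must be negative), it converts to information bits via I(E) = -log2 P(E), the formula given in an earlier comment. Note that even this figure corresponds to far fewer than the 500 bits the original post names as the CSI threshold:

```python
import math

# Probability quoted above for six coordinated mutations in 30 years,
# read as 3 x 10^-35 (a probability cannot be greater than 1).
p = 3e-35

bits = -math.log2(p)   # I(E) = -log2 P(E)
print(round(bits, 1))  # ~114.7 bits, well short of the 500-bit threshold
```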
  19. johnnyb

    I read the article on your blog about nylon eating adaptation. Great work. I often wonder myself if the intelligence behind evolution is part and parcel of the cell itself. Amazing computational tasks (predicting how proteins will fold, for example) can theoretically be accomplished with trivially small bits of quantum computing hardware that would easily fit in bacterial DNA. This doesn’t answer the question of how the DNA quantum computer came to exist in the first place but it explains a lot of what happened later.

    Anyhow, it dawned on me that since the nylon-eating metamorphosis is so well characterized (I wasn’t aware it was so well detailed) it would be an ideal candidate to run through the explanatory filter math to see whether or not a design inference is warranted.

  20. Well, unless I’m misreading Dembski’s work, if indeed the nylonase enzyme came about by self-modification, the CSI would be contained within the original information and the mechanism for self-modification, and not in the event itself that generated the enzyme. In short, chance entered these organisms’ developmental pathways and modified their already existing CSI through “inheritance with modification”.

  21. Sounds to me like the enzymes (plural!) that enable digestion of nylon are going to be found to warrant a design inference. The source of the design is another question. If it’s an intelligent mechanism residing in the cell itself that’s fine. ID will get a gold star by detecting design vs. random action in the creation of the enzymes. It will prove its merit by initiating a search for the source of the design whereas standard evolutionary theory would just write it off as a random event and not look further.

  22. Flipping through Dembski’s books*…

    He has three categories that account for CSI inherent in biological systems: (1) Inheritance with modification (2) Selection (3) Infusion

    The first one appears to be the relevant one for this topic.

    “Inheritance is thus merely a conduit for already existing information.
    By modification I mean all the instances where chance enters an organism’s developmental pathway and modifies its CSI. Modification includes–to name but a few–point mutations, base deletions, genetic crossover, transpositions and recombination generally.”

    Personally I think–assuming they exist–that self-modification through built-in functions that work based upon detecting certain environmental factors should have its own category and not be lumped in with “inheritance with modification”. Especially if it turns out the Latent Library idea has any merit to it (the idea that much of the non-coding DNA that is highly conserved in species is some kind of programmed change engine with a latent library of genus plans and other feature mods).

    *Collect them all! ;)

    On a humorous note, if you want to quote mine Dembski:

    “I look at a blade of grass and it speaks to me. In the light of the sun, it tells me that it is green.”

    I was reading that the other day and it just struck me as funny if you think of it out of context.

  23. [...] I receive a mention next to one of the slides — apparently the emergence of nylonase is supposed to provide empirical disconfirmation of my theoretical work on specified complexity (Miller has been taking this line for years). For my response about nylonase, which the critics never cite, go here. [...]

  24. [...] mutations I go for my own intepretation of the facts in contrast of what evolution claim, Why Scientists Should NOT Dismiss Intelligent Design | Uncommon Descent 01/07/30 – ICBP 2000 __________________ "I took my stand in the midst of the world, and in [...]

  25. [...] no reason to cruise that a nylon-eating ability requires any novel information. As William Dembski explains: Nylonase appears to have arisen from a frame-shift in another protein. Even so, it seems to be [...]

  26. [...] William Dembski writes that: the nylonase enzyme seems “pre-designed” in the sense that the original DNA sequence was preadapted for frame-shift mutations to occur without destroying the protein-coding potential of the original gene. Indeed, this protein sequence seems designed to be specifically adaptable to novel functions. [...]
