
A Realistic Computational Simulation of Random Mutation Filtered by Natural Selection in Biology

All computational evolutionary algorithms artificially isolate the effects of random mutation on the underlying machinery: the CPU instruction set, operating system, and algorithmic processes responsible for the replication process.

If the blind-watchmaker thesis is correct for biological evolution, all of these artificial constraints must be eliminated. Every aspect of the simulation, both hardware and software, must be subject to random errors.

Of course, this would result in immediate disaster and the extinction of the CPU, OS, simulation program, and the programmer, who would never get funding for further realistic simulation experiments.


30 Responses to A Realistic Computational Simulation of Random Mutation Filtered by Natural Selection in Biology

  1. I think it is very possible to come up with specified complexity using “natural” or rather simulated natural selection in computers. I know it’s a kind of guided evolution, but still possible.

    However, whether the intermediate forms are always viable in nature is questionable. You require a degree of uniformity for gradual, naturally selected change to occur. The existence of seemingly unreachable “critical points” in the evolution of any complex structure would, in my mind, throw the whole model into doubt.

    There is also the matter of whether nature is self-regulating – in other words, programmed to be resistant to change. For example, DNA can repair itself, thus preventing mutation – why such a feature would originate via mutation itself is a mystery.

  2. “Every aspect of the simulation, both hardware and software, must be subject to random errors.”

    This makes *exactly* as much sense as requiring that a supercomputer simulating a hurricane blow over tables and chairs, drench the operator, and cause widespread power outages.

    You have forgotten what Turing demonstrated: the power of computation lies in the independence of computational algorithms from the physical substrate upon which the computation is instantiated.

  3. One engineer wrote that computational evolutionary algorithms were nothing but a fancy title for “trial and error” programs. Trial and error is nothing new.

  4. This makes *exactly* as much sense as requiring that a supercomputer simulating a hurricane blow over tables and chairs, drench the operator, and cause widespread power outages.

    I think it’s more along the lines of how much sense it makes to have a big ole meteor come crashing down and smash your simulator to smithereens like happens occasionally during real evolution.

    But that’s just me. I’m a stickler for realism and Darwinians at their Conway’s Game of Life v2.0 don’t really bother much with modeling of reality.

  5. I could see this as a legitimate experiment, but we must remember that not all organisms share the same ‘underlying machinery’, as you call it. Surely if there were but one copy, as in a computer, a single harmful mutation could bring down the entire system. This, of course, is not the case in biology. If we made a copy of said machinery for each simulated organism, and allowed those copies to undergo mutation, I don’t think the results would be as disastrous as you claim.

  6. Reciprocating Bill wrote:

    This makes *exactly* as much sense as requiring that a supercomputer simulating a hurricane blow over tables and chairs, drench the operator, and cause widespread power outages.

    Absolutely right. Furthermore, allowing random errors anywhere in the simulation would be tantamount to allowing the laws of physics, geography, climate, etc., to change instantaneously, which of course does not happen in reality.

    The selective environment is not random, and so a realistic simulation of a selective environment cannot be random either.

  7. Teaching computer science students from the undergraduate to the doctoral level, I encountered quite a few who were excellent programmers, but who could not begin to comprehend the notion of a model. The concept is simply too abstract for some people. They never catch on to it.

    A simulation model of evolution executes on a computer, but the computer, its operating system, and the runtime system of the programming language in which the simulation was written are not part of the model. Their function is to execute precisely the evolutionary model specified by the programmer. Any environmental cataclysms are simulated by the program itself, and are not a matter of failure of the computer hardware or the software operating system. That is, the environment is simulated by a properly functioning computer. The computer itself is not the simulated environment.

    I say categorically, as someone who has worked in evolutionary computation for 15 years, that Gil does not understand what he is talking about. This is not to say that he is trying to mislead anyone. It is simply clear that he has never grasped the nature of a simulation model. His comments reflect the sort of concrete thinking I have tried to help many students grow beyond, often without success.
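    The separation described above – the modeled environment inside the program, the machinery outside it – can be made concrete with a toy sketch (Python; all names and parameters are hypothetical, not from any published simulation). Note that the catastrophe is just a change to the model's state; the interpreter executing it runs correctly throughout.

    ```python
    import random

    def run_model(generations=100, cataclysm_at=50, seed=1):
        """Toy evolutionary model: the environment, including any
        cataclysm, lives entirely inside the model's state."""
        rng = random.Random(seed)
        population = [rng.random() for _ in range(20)]  # simulated organisms
        for gen in range(generations):
            if gen == cataclysm_at:
                # Simulated catastrophe: most of the *modeled* population dies.
                population = population[:2]
            # Reproduction with small random mutations, clamped to [0, 1],
            # then selection back down to the environment's capacity.
            offspring = [min(1.0, max(0.0, p + rng.gauss(0, 0.05)))
                         for p in population for _ in range(2)]
            population = sorted(offspring, reverse=True)[:20]
        return population

    # The host computer and interpreter behave normally the whole time;
    # only the modeled world suffered the catastrophe.
    survivors = run_model()
    print(len(survivors))
    ```

    The point of the sketch is only that the cataclysm is a variable assignment, not a hardware failure.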

  8. DaveScot wrote:

    I think it’s more along the lines of how much sense it makes to have a big ole meteor come crashing down and smash your simulator to smithereens like happens occasionally during real evolution.

    Dave is making the same mistake as Gil: confusing the simulator with the thing being simulated (i.e., the model).

    To simulate a meteor strike, you don’t hit your computer with a hot rock moving at thousands of miles per hour. You don’t even scramble your operating system or rewire the chips on your motherboard. Instead, you simply write some software that models the meteor strike by altering the appropriate variables in the simulated environment: the simulated temperature rises, a simulated tsunami is generated (assuming the meteor hits the ocean), simulated water boils off, a simulated dust and vapor plume rises and then spreads, simulated sunlight is blocked by the simulated dust, etc.

    The simulator hasn’t “been blown to smithereens” at all. The computer and its software continue to run correctly the entire time. It’s the model, not the simulator, that is disrupted by a simulated meteor strike.

    Another example. As a fellow engineer, Dave is aware that we use computers to simulate other computers all the time. In particular, we use existing computers to simulate computers that have not yet been built. This allows us to find and correct errors in the design before going through the trouble and expense of building an actual, physical prototype.

    Not only can one computer simulate another; it can also simulate manufacturing defects in the other computer. For example, it can simulate the effects of a logic circuit that is unintentionally shorted to ground. By simulating such defects, engineers can guarantee that the diagnostic software they write will be able to detect these defects when they occur on the production line. The defective computer can then either be fixed or discarded.

    The point is this: to simulate a shorted logic circuit, you don’t stick a piece of bare wire into the computer that’s running the simulation. Instead, you write a piece of software that simulates the effect of a short circuit by altering the behavior of the simulated computer. The “real” computer and its software continue to run normally the entire time.
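    A minimal sketch of this kind of fault simulation (Python; the gate and diagnostic names are hypothetical illustrations, not any real EDA tool's API): the “short to ground” is a flag inside the simulated circuit, and the machine running the simulation operates normally throughout.

    ```python
    def and_gate(a, b, stuck_at_ground=False):
        """Simulated AND gate; stuck_at_ground models an output
        unintentionally shorted to ground (a stuck-at-0 fault)."""
        if stuck_at_ground:
            return 0  # the short exists only in the *simulated* circuit
        return a & b

    def diagnose(gate):
        """Toy production-line diagnostic: a healthy AND gate must
        output 1 for inputs (1, 1); otherwise flag it as defective."""
        return "defective" if gate(1, 1) == 0 else "ok"

    healthy = lambda a, b: and_gate(a, b)
    shorted = lambda a, b: and_gate(a, b, stuck_at_ground=True)
    print(diagnose(healthy), diagnose(shorted))
    ```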

  9. Sounds like GilDodgen wants to simulate natural disasters to an extreme degree where the environment itself becomes incapable of supporting life… Even in the case of simulating worldwide catastrophe, I don’t see the need for the simulator program to break down to the point that it affects the computer itself. Personally, I think that a breakdown of the simulation (aka death of all simulated life WITHIN the model) would be more than enough. Of course, this would require a higher level of abstraction in comparison to most simulations, which “protect” certain functionality. Obviously the physical laws being simulated should never break down, but why should a replication engine be artificially sheltered from the adverse effects of mutations?

    Anyway, why even bring this up? It’s not like the highly constrained programs written so far even remotely resemble biological reality.

  10. Patrick wrote:

    Sounds like GilDodgen want to simulate natural disasters to an extreme degree where the environment itself becomes incapable of supporting life…

    If the environment becomes incapable of supporting life, life dies. That is true whether life evolved or was designed.

    What Gil wants to show is quite different. He wants to show that evolutionary processes are impotent because evolutionary simulations can be derailed by disrupting the hardware and software they run on.

    His error is in thinking that if he disrupts the simulator and it doesn’t recover, that the model itself has proven impotent.

    As previous comments by Bill, Tom, and me show, the simulator and the model are two different things. Gil and Dave are conflating them.

  11. I could be wrong, but I think his primary point is that certain functionality of the model shouldn’t be protected, and he’s making an extreme example (too extreme and vague, in my opinion)… perhaps in an attempt to be humorous? Gil has discussed such programs well before all of you even came onto UD (unless you’ve snuck in with new pseudonyms), and I know he comprehends the difference.

    Ah, well. I’ll let him respond.

    EDIT: Gil, is the reason you want the CPU instruction set and OS to be affected that you feel the limitations of pseudo-random generators running on deterministic computers inherently limit the realism of the simulation? If that’s the case, and you weren’t simply joking, then I would think that using a quantum computer would be a much better solution than relying on system failure. ;)

  12. Gil is perfectly correct. In nature, simulator and model aren’t two different things. In nature, “simulator” and “model” simply don’t exist, because there is only the biological reality, pure and simple.

    And organisms really do contain the “CPU instruction set, operating system, and algorithmic processes responsible for the replication process”. All this hardware/software stuff is under the law of entropy. So Gil is right when he says this isn’t error-free. Evolutionists cannot claim that some information processing machinery is guaranteed fault-tolerant, as when they are playing with their evolutionary algorithms (which, by the way, are DESIGNED).

  13. “Every aspect of the simulation, both hardware and software, must be subject to random errors.”

    It seems the basic problem here is that if the entire population is on the same computer, then altering the CPU etc. would be equivalent to altering the environment, or even the physical constants.

  14. The programs for ontogeny and phylogeny were written eons ago by the BFL(s) [big front loader(s) in the sky]. If anyone thinks they are going to simulate those programs with a computer, they belong in a rubber room. Look at what happened to Avida. What a joke that was, and what has replaced it on the gag list? Nothing of value, I say, and I have no idea what the current rage is. I think the whole fiction died on the vine, didn’t it?

    Gag is a wonderful word with two meanings, a joke and something you do before throwing up, both very appropriate to the subject of computer simulated evolution, especially as presented by the worshippers of the Great God Chance with which this forum is so inordinately blessed.

    When we figure out the program for ontogeny we will at the same time learn about the program for phylogeny. They are intimately related and only the ontogeny program remains extant.

    A large fraction of the developmental program has already been expressed in the cytoplasm of the egg prior to fertilization. The diploid oocyte nucleus is incredibly active during all of oogenesis. Reasoning by analogy, this provides one more element of support for the Prescribed Evolutionary Hypothesis.

    “Evolution is in a great measure an unfolding of pre-existing rudiments.”
    Leo Berg, Nomogenesis, page 406

    Substitute was for is and I will be in full agreement. One may also substitute embryogenesis for evolution in Berg’s cogent summary.

    It is hard to believe isn’t it?

    I love it so!

    “A past evolution is undeniable, a present evolution undemonstrable.”
    John A. Davison

  15. Karl

    You’re wrong. The simulator is comparable to the environment. Gil’s proposal is a little radical, as many random mutations of the hardware/OS etcetera will bring the whole simulation to a screeching halt, never to start again. However, there should periodically be wholesale disruption of the environment that wipes out a large fraction of the life therein, to simulate what happens in the real world.

    But as I said, these simulations aren’t even attempting to model the real world. They’re not based on the laws of physics and chemistry. They’re based on a make believe world of odd little digital automatons. They’re Conway’s Game of Life v2.0.

    Wake me up when it becomes more than a game and really models the real world.

    zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz…

  16. Karl

    In recent decades we began simulating chips before the first mask is made, but I have never heard of simulating whole computers. My experience is limited to things as complex as Intel 80x86 CPUs, which are pretty complex but, I suppose, maybe not the most complex hardware simulations. The simulations always miss things too, but you usually get something that, as we say, wiggles.

  17. Consider this: One may run an evolutionary simulation on a virtual Apple II, emulated (simulated) by a PC, emulated by a Mac running Virtual PC – and so on ad infinitum, substituting at each level any computational device powerful enough to be a Turing machine (given enough time and tape).

    At what level are you going to introduce the random hardware failures? All of them? Into the Apple II emulator alone? Just at the bottom?

    Answer: NOWHERE, as such failures are not part of the computed evolutionary model.

    Which is not to say that we couldn’t simulate Gil’s suggested experiment!

  18. Gil: “Of course, this would result in immediate disaster and the extinction of the CPU, OS, simulation program, and the programmer, who would never get funding for further realistic simulation experiments.”

    Depends on at what level of nature you are simulating.

    DaveScot: “Wake me up when it becomes more than a game and really models the real world.”

    Simulations will become interesting to me if they can come up with self-replicating entities, with a DNA-like system, that have 1% of the sort of complexity and function of the simplest bacterium, all based on a “few simple” fitness algorithms.

  19. Quotemining: From http://www.physorg.com/news78675847.html

    “Error correction coding is a fundamental process that underlies all of information science…,”

    Go figure that information integrity requires error correction coding. Why can’t QM scientists utilize RM&NS?

    What other reference data and regulation may be required? Thinking…

    Oh, hey! Another wrasscally wabbit? Gene Control by Large Noncoding RNAs: http://stke.sciencemag.org/cgi.....552006pe40
    Gotta love non-coding organogenesis capability. “Evf-2 is the first example of lncRNA directly involved in organogenesis in vertebrates.”

    And another recent article on peering into the wabbit hole: http://www.physorg.com/news78676185.html, With record resolution and sensitivity, tool images how life organizes in a cell membrane. In the journal Science today.

    And there seems to be a post missing from last night of mine. Or maybe it went down the wabbit hole too.

    Well, well, non-coding RNA directly involved, it appears the simulation has more to include.

    I am beginning to think this wabbit has as many dwelling places as G_d.

  20. Gil,

    Scientists should be aware of all possible sources of error in an experiment. If the OS allowed memory leaks, or the hardware was located in Chernobyl, these could be possible sources of randomness that could affect a simulation. Any scientist who thought they were significant to the results of the simulation should report them.

    Unfortunately, your if-then is wildly off the mark, as other commenters have noted.

  21. DS:
    Wake me up when it becomes more than a game and really models the real world.

    Simulating physics and chemistry isn’t really the point of this kind of simulation. (There are, of course, other simulations for which that IS the point, but we shouldn’t conflate them.) For many experiments the result is simply descriptive, and the description is basically statistical. The big question is whether “life as we know it” exhibits the same statistical behavior. IMHO, this should be a very interesting set of questions for ID theorists, and unrelated to the level of abstraction in the model. If I were an ID theorist, my research program would center on whether some aspect of life is or is not statistically similar to some class of evolutionary algorithms.

  22. these simulations aren’t even attempting to model the real world. They’re not based on the laws of physics and chemistry… Wake me up when it becomes more than a game and really models the real world.

    As David said, “Simulating physics and chemistry isn’t really the point of this kind of simulation.” From the standpoint of real-world evolution (if that’s what we decide we want to model), the point would be to show that systems can be organized by the unknowing forces of RM+NS, not to perfectly simulate the real world. Besides, actually simulating the real world is outside our computational power and will be for a very long time. Just look at the processing power being put into the protein-folding problem (google “Folding at Home”). We don’t have the kind of processing power needed to calculate solutions to that in a reasonable time. A real-world evolution simulation would need to simulate the billions of protein foldings in each of billions of organisms over billions of years. Like I said – a real-world model is far outside our computing power.

  23. David vun Kannon,

    I am curious, what do you think of the quote from Physorg above, “Error correction coding is a fundamental process that underlies all of information science…,”

    The only question remaining is: do you believe DNA, RNA, etc., represent information? We already know there are observed error correction mechanisms within the cell.

    Information Science is a cognitive field of reason acted upon by intelligent agency. And what is very interesting about that one statement is – “Error correction… underlies all…”

    In other words, without error correction, information science does not progress; it breaks down. Error correction is crucial to maintaining data integrity. This supports what we observe in the cell for conservation.

    One can hardly allow randomness to enter the underlying constructs of error processing, or data is lost forever and information retrieval or duplication is potentially lost forever, contributing to catastrophic consequences or, in life forms, extinction.

    Population genetics and drift, and the rudimentary simulations of natural selection, hardly take these issues into account; they therefore trivialize the process and are misleading in their conjecture.

    The non-coding RNA regions I quoted likewise lend more evidence against the simple-mindedness of the simulation.
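    For concreteness, here is a minimal sketch of what error correction coding does (Python; a triple-repetition code, the simplest ECC – chosen here for illustration, not because any cited article uses it): each bit is transmitted three times, and a majority vote recovers the message even after a single-bit channel error.

    ```python
    def encode(bits):
        """Triple-repetition code: transmit each bit three times."""
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        """Majority vote over each triple corrects any single-bit
        error within that triple."""
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    message = [1, 0, 1, 1]
    sent = encode(message)
    sent[4] ^= 1                     # a single random channel error
    assert decode(sent) == message   # the original data survives
    ```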

  24. M7 – I agree with the quoted sentence. I’m not sure what your point is. ECCs are important because random errors _do_ happen. A simulation without random perturbations wouldn’t be anything like the real world.

    There are situations in which turning off random changes might be useful to explore a specific point. For example, in David Goldberg’s introductory text on GAs, Genetic Algorithms in Search, Optimization and Machine Learning, he builds up a simple GA without random mutation at first, to demonstrate that crossover is responsible for much of the power of the algorithm. But this is only for educational purposes.

    So I can’t agree with your statement on population genetics and “rudimentary” simulations.
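    This is not Goldberg’s actual code, but the pedagogical idea can be sketched in a few lines (Python; all names and parameters are hypothetical): a toy GA on the “count the ones” problem whose mutation rate can be set to zero, isolating what selection plus crossover achieve on their own.

    ```python
    import random

    def simple_ga(mutation_rate=0.0, generations=40, seed=3):
        """Minimal GA maximizing the number of 1-bits in a 16-bit string.
        With mutation_rate=0, only selection and crossover operate."""
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(16)] for _ in range(30)]
        for _ in range(generations):
            def pick():  # binary tournament selection
                a, b = rng.sample(pop, 2)
                return a if sum(a) >= sum(b) else b
            nxt = []
            while len(nxt) < len(pop):
                p1, p2 = pick(), pick()
                cut = rng.randrange(1, 16)        # one-point crossover
                child = p1[:cut] + p2[cut:]
                # Optional per-bit mutation (a no-op when the rate is 0).
                child = [bit ^ (rng.random() < mutation_rate) for bit in child]
                nxt.append(child)
            pop = nxt
        return max(sum(ind) for ind in pop)

    print(simple_ga(mutation_rate=0.0))  # crossover alone already climbs
    ```

    The sketch only illustrates the structure of such a demonstration; Goldberg’s book develops the analysis properly.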

  25. the point would be to show that systems can be organized by the unknowing forces of RM+NS

    I don’t think that anyone would debate that Intelligence+RM+NS can organize systems in tightly constrained environments under certain artificial conditions of replication, variation, and selection. The question is what can RM+NS do under the much broader constraints of nature without intelligence being involved.

  26. David vun Kannon,

    “A simulation without random perturbations wouldn’t be anything like the real world”

    That is not what I am saying… Obviously you must account for the real world. What I’m stating is that the simulations do not account for enough of the real world. They are shielded and therefore limited.

    I do not disagree with random perturbations affecting life forms, such as radiation. But I do disagree with the extent of the claim, for purposes of macro-evolution, of unfolding novel structures or entire new life forms by gradual change over time, or in great leaps, without prior planning by intelligent agency.

    Error correction goes against the claim of unintelligent formation of life. That is why I quote-mined the Information Sciences statement. It shows just how critical Error Correction is to intelligent applications. I do not believe EC arises randomly, from mutations, atmospheric perturbations or from high temperature vents in the ocean.

    I believe Error Correction subsists from intelligence.

    I believe what we observe in life is intricate code, a combination of both intelligent planning for random mutations in error correction, and in reaction to environmental pressures within known boundaries.

    And to expand on the evolutionary programs themselves: I do not think them trivial, as in unintelligent, of course not. But I do think them trivial as small layers being applied as solutions to all diversity, as an answer for Darwinists. And I do oppose such simulations being cited as evidence for Evolution if that includes macro-evolution. And I do believe there must be a core instruction set which is subjected to random mutations that affect the top layers.

    Otherwise it is not a valid simulation to cite as evidence for ToE.

  27. I give up.

  28. Don’t give up, John. Why not write a book like Dr. Bill did?

    I am trying to read and understand what you wrote, from the links here. You use terminology that I will have to research to understand what you mean; Dembski’s book The Design Revolution explains his terminology; I will comment on that thread.

    (Hey, Human “Intelligent design”; the key to WordPress “fancy fonts” is the little “<” character.) Eureka!

    Only thing I don’t know how to do is insert a hyperlink as text that opens in a new window; however, maybe only moderators can do that, and it’s not important.

  29. Oh, it didn’t print the character on the “,” key. Shift plus comma; end the command with / before the shift plus comma.

  30. P. Phillips

    My Manifesto and all my papers were written for consumption by those with an undergraduate-level understanding of genetics and cytology. I had no trouble getting my junior and senior students to understand my papers. I am sure they didn’t all agree with me, though, because by the time I got them they had been well brainwashed by their previous exposure to the usual Darwinian pablum.

    I am confident that if I could control the first three books each student majoring in biology would read that the Darwinian fairy tale would disappear never to be heard from again.

    I don’t want to write a book because nobody takes them seriously anyway. Also, by not writing a book, I won’t have to worry about making a fool of myself as Gould, Mayr, Provine and Dawkins have all done to such perfection, especially Dawkins.

    What I want is to find a publisher for my Manifesto and my published papers, which have survived peer review. Even that is not really necessary. Mendel wrote no books and only thirteen papers, only three of which were biological.

    The main thing is to die before you are recognized. I am working on that.

    “You can lead a man to the literature but you can’t make him read it.”
    John A. Davison

    “I give you books, I give you books and all you do is eat the covers off them.”
    ibid

    “Study Nature not books.”
    Louis Agassiz

    “A past evolution is undeniable, a present evolution undemonstrable.”
    John A. Davison
