Uncommon Descent Serving The Intelligent Design Community

Neo-Darwinism Impeding Research… Again


Remember the dark days of vestigial organs? You know, back when there was a list of 180 vestigial organs? Or remember the days of junk DNA – when repetitive DNA, large regions of non-protein-coding DNA, and all sorts of mobile DNA were assumed to be non-functional simply because the investigators had assumed Darwinism rather than design?

And there’s lots more DNA that doesn’t even deserve the name pseudogene. It, too, is derived by duplication, but not duplication of functional genes. It consists of multiple copies of junk, “tandem repeats”, and other nonsense which may be useful for forensic detectives but which doesn’t seem to be used in the body itself. Once again, creationists might spend some earnest time speculating on why the Creator should bother to litter genomes with untranslated pseudogenes and junk tandem repeat DNA. … Can we measure the information capacity of that portion of the genome which is actually used? We can at least estimate it. In the case of the human genome it is about 2% – considerably less than the proportion of my hard disc that I have ever used since I bought it. [Quoted via Research Intelligent Design, which cites: Richard Dawkins (1998), “The Information Challenge,” The Skeptic 18(4), Autumn 1998.]

Well, it seems that those people who “spent earnest time speculating on why the Creator should bother to litter genomes with untranslated pseudogenes and junk tandem repeat DNA” have been the real winners in the past (and likely upcoming) decade of genome research.

In any case, it seems that, despite repeatedly failed efforts to assign vestigiality to a range of structures, some people keep pursuing the case.

What can be more innocuous than gene counting? Well, it seems a set of researchers want to revise downward the number of genes in the human genome. I’m not big into counting genes, especially as regulatory regions (you know – “Junk DNA”) seem to be as important as the genes themselves. However, what is interesting is the method these people are using to determine that an open reading frame is not a gene:

Scientists on the hunt for typical genes… have traditionally set their sights on so-called open reading frames… This method produced the most recent gene count of roughly 25,000, but the number came under scrutiny after the 2002 publication of the mouse genome revealed that many human genes lacked mouse counterparts and vice versa. Such a discrepancy seemed suspicious in part because evolution tends to preserve gene sequences — genes, by virtue of the proteins they encode, usually serve crucial biological roles….

To distinguish such misidentified genes from true ones, the research team… developed a method that takes advantage of another hallmark of protein-coding genes: conservation by evolution. The researchers considered genes to be valid if and only if similar sequences could be found in other mammals – namely, mouse and dog

So, the reason that a given gene is suspected of not really being a gene is not because of an empirical analysis of the gene itself, but rather because it doesn’t fly with evolutionary theory!
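To make the object of the dispute concrete: an open reading frame (ORF) is simply a stretch of DNA running from a start codon to an in-frame stop codon. A minimal sketch of ORF scanning in Python, for illustration only (real gene-prediction pipelines also model splicing, codon bias, and much more):

```python
# Minimal open-reading-frame scanner (illustrative sketch, not the
# researchers' actual pipeline). Scans the forward strand only.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=100):
    """Return (start, end) positions of ORFs: an ATG followed by an
    in-frame stop codon, at least min_codons codons long."""
    seq = seq.upper()
    orfs = []
    for frame in range(3):           # check all three reading frames
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if start is None:
                if codon == "ATG":   # potential start of an ORF
                    start = i
            elif codon in STOP_CODONS:
                if (i + 3 - start) // 3 >= min_codons:
                    orfs.append((start, i + 3))
                start = None         # resume scanning for the next ATG
    return orfs
```

An ORF found this way is only a *candidate* gene; the whole argument above is about what further evidence should promote a candidate to a confirmed gene.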

Now, of course, they mention other possibilities:

the genes could be unique among primates, new inventions that appeared after the divergence of mouse and dog ancestors from primate ancestors. Alternatively, the genes could have been more ancient creations — present in a common mammalian ancestor — that were lost in mouse and dog lineages yet retained in humans.

And then we get:

If either of these possibilities were true, then the orphan genes should appear in other primate genomes, in addition to our own. To explore this, the researchers compared the orphan sequences to the DNA of two primate cousins, chimpanzees and macaques. After careful genomic comparisons, the orphan genes were found to be true to their name — they were absent from both primate genomes. This evidence strengthened the case for stripping these orphans of the title, “gene.”

So again, not additional empirical evidence about the structure/function of the gene itself, just more talk about evolution. If there is no evidence that it evolved, it can’t be a gene! This is yet another way that Darwinism is impeding research.

So, how many genes do they propose removing from the catalogs based on Darwinism? 1? 2? 10? 100? No, it turns out they want to remove 5,000. And not only that, “this work provides a set of rules for evaluating any future proposed additions to the human gene catalog.” Oh great. That’s just what we need – Darwinism to be the official rule book for analyzing the genome.

And of course, no research on Darwinism would be complete without tagging it with a little circular reasoning at the end:

the research reveals that little invention of genes has occurred since mammalian ancestors diverged from the non-mammalian lineage.

Let’s see, we’ll drop 5,000 reading frames from the gene list because they don’t match our evolutionary expectations (they are too innovative), and then come to the conclusion that there hasn’t been any innovation in mammals.

Comments
Genetics and biology are not the only areas of science where research is being impeded by blind faith in neo-Darwinism. It occurs in other areas of the soft sciences as well, such as neuroscience and psychology. I recently stumbled on some deceptive practices that have been happening in the field of neuroscience on the basis of Darwinian evolution: Neuroscience, Pseudoscience and the Curse of Darwinism.
Mapou
January 26, 2008 at 06:08 PM PDT
johnnyb #43: 'But note - they are not providing additional scrutiny for _all_ genes, just additional scrutiny for genes that shouldn’t be there by evolutionary presuppositions.'

That is not true. All gene candidates have to pass the same hurdles: presence of a start and a stop codon, ORF length > ~300, sequence complexity, GC-content, probably a reasonable exon/intron structure, etc., and in addition: similarity to known genes in other mammals or in humans, average length, literature citations. ~18,000 gene candidates get across all the hurdles; ~1,200 fail at the last ones. Only then do the authors ask for additional evidence of protein-coding function for the ~1,200 ORFans that stumbled on some of the hurdles. I think that is fair. It is like some of the Olympic swimming contests, where you have four qualifying heats and the two best swimmers from each heat make it to the finals, but in addition two swimmers who placed only third in their heats can still get into the final (and have a chance to win!) if they are the two fastest third-place finishers across all four heats.

'The lack of study by other researchers isn’t evidence.'

Well, it is not conclusive. However, it is striking that apparently most of the 18,000 genes passing the other tests showed up in some other study (it is too bad that the authors did not provide the actual number), while of the ~1,200 questionables only 12 did. And this is after maybe 25 years of extensive studies using, e.g., expressed sequence tags and microarrays, but also a lot of ordinary biochemistry (for a significantly longer time), in lots of different HUMAN tissues and cell cultures under lots of different conditions (e.g. drug treatments, different cancers, knock-down of other genes, etc.). And most of these candidates apparently were never suspicious enough to pique someone's interest.

As a side note: since these gene candidates were on the older gene lists, they were probably represented on microarrays designed for human gene-expression studies. And in these genome-wide studies researchers tend to look for anything that is conspicuous, e.g., anything that changes expression levels, or is expressed highly specifically in a certain cell type, or perhaps in all cells (as an important housekeeping gene). Basically their literature search is just a proxy for doing an actual expression or other genome-wide study for all of the gene candidates.

'The lack of research by scientists on a particular gene is not a behavior of the gene!!!!'

That's why I wrote 'behave'. However, in this case it means that the majority of these ORFans apparently do not bind to an important tumor suppressor protein, are never overexpressed in any type of cancer, are not really linked to one of the important signalling cascades, do not interact with transcription factors, are not part of metabolic pathways, etc. Otherwise one or another of the thousands of young assistant professors out there desperately looking for a novel project might have picked one up.

'And it will show once again that neo-Darwinism is a bad paradigm to guide research.'

But this paper is not a good illustration of this point, since it basically indicates that there are ~18,000 gene candidates that are closely related to known, 'normal' genes and ~1,200 that somehow differ from those and deserve extra scrutiny. The authors even discuss the possibility that these are novel human genes. (And they are not biased against the possibility that there are novel human genes, since they include ORFans that have human paralogs in their final lists.) In the end they reject this possibility for themselves, but this is hardly 'impeding research'. (There are a lot of people who would just love to prove Eric Lander wrong, for personal reasons alone.)
rna
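The simple sequence hurdles mentioned above (minimum ORF length, GC-content) are plain sequence statistics; a toy illustration in Python, with made-up thresholds rather than the study's actual cutoffs:

```python
# Toy gene-candidate filter based on length and GC-content only.
# Thresholds here are illustrative assumptions, not the study's cutoffs;
# the real analysis also used sequence complexity, exon/intron structure,
# cross-species similarity, and literature evidence.
def gc_content(seq):
    """Fraction of G and C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def passes_basic_hurdles(orf_seq, min_len_nt=300, gc_range=(0.35, 0.65)):
    """Return True if the candidate clears the length and GC hurdles."""
    if len(orf_seq) < min_len_nt:
        return False
    return gc_range[0] <= gc_content(orf_seq) <= gc_range[1]
```

The point of such filters is only to flag candidates that look statistically unlike known genes; they say nothing by themselves about function.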
January 24, 2008 at 08:17 PM PDT
johnnyb, "However, you cannot say a priori that a designer prefers elegant, modular designs." I suggest, based upon my experience as a designer of complexity, that complexity requires elegance and modularity. I contend that this is not a requirement of a designer, but is required by complexity itself.
bFast
January 24, 2008 at 09:48 AM PDT
Lars: "Is ID allowed to make assumptions (some would say theological assumptions) about the goals of a putative designer, while still retaining status as an empirical method?" Yes and no. On the one hand, we cannot say "the designer would do X" without making a lot of unwarranted assumptions about the designer. On the other hand, we _can_ say with a fair degree of certainty that "X requires a designer who is attempting to do it". So, if you have, say, an elegant, modular design, the simplified argument is that such a thing requires a designer looking for those attributes. However, you cannot say a priori that a designer prefers elegant, modular designs. We generally do, but we can't necessarily make that assumption of another.
johnnyb
January 24, 2008 at 07:50 AM PDT
larrynormanfan: There is a lot of evidence for this in general in nature, and some evidence in humans. In the bacterial world, you have things like environmentally-triggered transposons which reconfigure the genome to use different food sources. In the human genome, you have the immune system, which has the known basic ability to do this on a really small scale - somatic hypermutation tailors existing antibody genes to match the antigens, and some researchers think that this is reverse-transcribed back into the germ line. There are lots of little bits of evidence like that. We often forget how little we actually know about how the genome works, and how it changes generation-to-generation. There is quite a bit of evidence that on a small scale the genome knows how to change itself (see anything by Caporale or B.E. White, or most anything by Shapiro). There is some evidence that such changes can be transcribed back into the germ line (see the book Lamarck's Signature). The extent to which these can occur is unknown, and most biologists actually leave these ideas out of their conceptualizations entirely. (They assume, probably because it is so teleological, that if it occurs it is a minor part of living systems, not the major part.)
johnnyb
January 24, 2008 at 05:18 AM PDT
Great article; thanks for pointing this out. Good to have significant ID predictions, and awareness of how Darwinian thinking "confirms" itself (except when it does not). DLH:
Design Principles: An Intelligent Designer will likely combine robust design with efficient design.
Is ID allowed to make assumptions (some would say theological assumptions) about the goals of a putative designer, while still retaining status as an empirical method? By doing so, doesn't it open the door to the whole panda's-thumb argument, which relies on similar assumptions about the designer's goals, such as optimality? Just curious... is that the position taken by the ID movement in general: that ID predicts optimal design, or robust design, or efficient design, and would be disconfirmed by the opposite? rna (25), thanks for that clarification. Sounds like we should not be so quick to pronounce Darwinists as circular reasoners... (acknowledging johnnyb's caveat [29] that there is still a bias introduced). Sounds like it's more the press who are eager to confirm Darwinism no matter the basis, and the scientists who did the research fail to correct them.
lars
January 24, 2008 at 03:33 AM PDT
johnnyb,
If the genome is responsible for its own changes (i.e. it has the ability to deploy new proteins on the fly in response to need), then there would be no need, even in the case of common descent being true, that novel proteins need to arise gradualistically.
Experimental evidence for this supposed ability? Because it sounds an awful lot to me like "If a frog had wings, he wouldn't bump his butt when he hopped."
larrynormanfan
January 24, 2008 at 02:34 AM PDT
Paul Giem - your idea of using antibodies is close to PZ Myers's suggestion of screening an expression library (I'll be generous, and not cause the server to get conniptions by linking to Pharyngula). The problem with these suggestions is that the genes may not be expressed in those organs or at that time. If they are involved in development, say, or if they are induced by an environmental cue, then you might just miss them. I would agree that this is the sort of thing that ID scientists should be doing - even if the ID aspect were found to be false, you would still get some useful information. This always helps in the grant applications! Bob
Bob O'H
January 23, 2008 at 11:23 PM PDT
"I think what they presume is that if a gene is not found in the other mammalian genomes this gene assignment just warrants another look."

But note - they are not providing additional scrutiny for _all_ genes, just additional scrutiny for genes that shouldn't be there by evolutionary presuppositions.

"Second they look at the existing literature"

The lack of study by other researchers isn't evidence.

"The important point here is that the safely assigned genes again behave differently than the questionable candidates. The majority of those has been studied, whereas only a tiny fraction of the questionable genes was verified. So the questionable genes again 'behave' differently from the other genes."

The lack of research by scientists on a particular gene is not a behavior of the gene!!!! Has it occurred to you that the reason they aren't well-studied is because mice are much easier to study than humans? Thus, the correlation between ORFans in humans and lack of examination could simply be selection bias. Remember, these are _extra_ hurdles that ORFans have to go through that other genes do not.

"My prediction would actually be that many biologists jump on these genes"

I hope so. And it will show once again that neo-Darwinism is a bad paradigm to guide research.
johnnyb
January 23, 2008 at 08:03 PM PDT
This whole subject is fascinating. We have here a clear neo-Darwinian deduction: creating proteins de novo is very hard, and there is no way that we can have 5,000 (or is it 1,711?) new genes just suddenly pop up fully formed in humans. On the other hand, it is quite conceivable that an intelligent designer might decide that a different creature, with different genes, would be desirable, and figure out a way to add in the extra genes. This would not be mandatory, but certainly within the realm of possibility, and in some cases might be expected.

Furthermore, because of the underlying reasons, and now because of the article, Darwinian scientists will stop looking for functions for these ORFs. So anyone who believes in the possibility of intelligent design can look for functions for these ORFs without worrying about 50,000 molecular biologists breathing down his or her neck. I have a hard time thinking of a better way to do research. Here I agree with DLH (2) and not with Shaner74 (17); this is not terrifying, but rather exhilarating. (BTW, Bob O'H [13] and DaveScot [15], there's a way to do this without getting one's lab coat dirty; it's called graduate students ;) )

The knockout studies proposed by Bob and Dave will work with mice, but ethics problems prevent them from being used in humans. However, there is an easier way. Simply transcribe (either directly or by means of the code and protein synthesis) the ORFs (that's why they're called ORFs: one can actually read them), then create tagged antibodies to the resulting peptides and see whether those antibodies stain any tissue. If so, one has a clue as to where to find the protein expressed. Mapou (40), keep in mind that, in contrast to some parts of the genome, these ones can actually be translated into protein, or at least polypeptides.

This will serve several functions: 1. it will give the lie to the assertion that intelligent design does not suggest research, and therefore is not science; 2. if successful, it will strongly argue that ID is good science; 3. if successful, it will strongly argue that neo-Darwinism is in fact a "science-stopper", and do so in a way that is intuitively obvious to most intelligent high-schoolers. I say, let's go for it! Michaels7 (38, 39) has some great ideas for financing.
Paul Giem
January 23, 2008 at 02:35 PM PDT
johnnyb #29: I think what they presume is that if a gene is not found in the other mammalian genomes, this gene assignment just warrants another look. This is a reasonable assumption for establishing a baseline of safely assigned genes, because we share a considerable set of genes with other mammals.

To the remaining ~1,200 gene candidates they apply some tests. One shows, for instance, that the majority of these candidates would code for proteins significantly shorter than normal human proteins. So these gene candidates are somehow different from the 'safe' genes. Second, they look at the existing literature. The important point here is that the safely assigned genes again behave differently than the questionable candidates: the majority of those has been studied, whereas only a tiny fraction of the questionable genes was verified. So the questionable genes again 'behave' differently from the other genes. The result of this test was open. If the questionable genes had a similar length compared to the verified genes, and if many of them were already experimentally confirmed, then the occurrence of similar genes in other mammals would not be a useful measure in their study for verifying gene assignments.

Furthermore, they include a number of human-specific orphan genes in their revised catalogue if there is any other supporting evidence that these are real genes. So they are not trying to throw out candidate genes just because they are specific to humans. Finally, their conclusion basically is: there are these ~1,200 sequences that we do not think are genes; if you want to convince us to include them in our catalog, you had better be ready to show us that they result in the expression of a protein. And anyone is welcome to do so.

My prediction would actually be that many biologists jump on these genes, because if anyone can show that one of these genes codes for a human-specific protein, and maybe even show how this protein modulates or creates biochemical pathways in a uniquely human way, that is a paper in Nature, and all the research biologists I know would give their right arms for that. Just as an example: even the identification of human-specific mutations in an important gene such as FOXP2 is material enough for high-profile publications.
rna
January 23, 2008 at 02:31 PM PDT
"The genome is a massively parallel system, in which 'genes' (to be re-defined Post-ENCODE) and 'phenotypic elements' (e.g. obesity, along with uncounted other elements) are connected based not on a 'one-to-one' but on a 'many-to-many' principle, characteristic of massively parallel nonlinear systems, such as neural networks." [AJP, January 16, 2008]

This is all very interesting. It got me to thinking that any parallel system must encode a mechanism for communication (who talks to whom) and another for timing (when things happen). I suspect that a lot of the "non-coding" sequences in the genome may be dedicated to just these sorts of duties.
Mapou
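The "many-to-many" wiring described above is straightforward to model as a bipartite mapping; a toy sketch in Python, with entirely made-up gene and trait names, just to show the data structure:

```python
# Toy many-to-many gene-to-phenotype mapping (names are hypothetical).
# One gene can influence several traits, and one trait can depend on
# several genes -- a bipartite graph rather than a one-to-one lookup.
from collections import defaultdict

gene_to_traits = {
    "geneA": {"body_weight", "blood_pressure"},
    "geneB": {"body_weight"},
    "geneC": {"blood_pressure", "glucose_tolerance"},
}

def traits_to_genes(mapping):
    """Invert the mapping: for each trait, which genes influence it?"""
    inverted = defaultdict(set)
    for gene, traits in mapping.items():
        for trait in traits:
            inverted[trait].add(gene)
    return dict(inverted)
```

Inverting the mapping makes the many-to-many character obvious: querying a single trait returns a whole set of genes, which is exactly the point being made about knockout studies and body weight.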
January 23, 2008 at 01:58 PM PDT
Just one more follow-up from Pellionisz (AJP)...

"In 60% of strains, knocking out a gene produces mice that are nonviable; that is, the mouse cannot survive without the knocked-out gene. The Monell survey revealed that body weight was altered in over a third of the viable knockout strains; 31 percent weighed less than controls (indicating that the missing genes contribute to heavier body weight), while another 3 percent weighed more (contributing to lighter weight). Extrapolating from the total number of genes in the mouse genome, this implies that over 6,000 genes could potentially contribute to the body weight of a mouse. Tordoff comments, 'It is interesting that there are 10 times more genes that increase body weight than decrease it, which might help explain why it is easier to gain weight than lose it.' [This comment implies that 600 genes influence weight loss - AJP] Because body weight plays a role in many diseases, including hypertension, diabetes, and heart disease, the implications of the findings extend beyond studies of obesity and body weight. Gene knockouts reported to affect these diseases and others could potentially be due to a general effect of lowering body weight. The findings also hold clinical relevance, according to lead author Danielle R. Reed, PhD, a Monell geneticist. 'Clinicians and other professionals concerned with the development of personalized medicine need to expand their ideas of genetics to recognize that many genes act together to determine disease susceptibility.' Maureen P. Lawler also contributed to the study, which is published online in the journal BMC Genetics.

[This is the death knell of yet another dogma of Pre-ENCODE Genomics, the 'one gene, one phenotype, one billion dollar pill' myth. The genome is a massively parallel system, in which 'genes' (to be re-defined Post-ENCODE) and 'phenotypic elements' (e.g. obesity, along with uncounted other elements) are connected based not on a 'one-to-one' but on a 'many-to-many' principle, characteristic of massively parallel nonlinear systems, such as neural networks. January 16, 2008]" (email omitted)

http://www.junkdna.com/#mattick_goes_commercial
Michaels7
January 23, 2008 at 01:00 PM PDT
And I'd add that, depending upon environmental triggers, the main function may or may not reference the "database target distenders or enlargers." From mice gene-knockout experiments...

"The Monell survey revealed that body weight was altered in over a third of the viable knockout strains; 31 percent weighed less than controls (indicating that the missing genes contribute to heavier body weight), while another 3 percent weighed more (contributing to lighter weight)."

As well as the following from JunkDNA debunker and fractal frontier walker Pellionisz... I really enjoy how he goes after the darkened towers of Mordor, where orcs march lock-step without thinking.

"John Mattick, professor of Molecular Biology at the University of Queensland, added, 'It appears that we have misunderstood the nature of genetic programming in humans and other complex organisms. Most of the genome is transcribed, mainly into non-coding RNAs, which appear to comprise a hidden layer of gene regulation whose full dimensions are just beginning to be explored.' Invitrogen will commercialize these sequences over the next few years, allowing the company to expand its NCode™ microRNA microarray product line into the field of non-coding RNA profiling. Thus, for the first time, a commercial tool will be available to help scientists identify the large complement of non-coding RNAs and study their function."

Fun times indeed. Close to the Edge, Down by the River....

"[Perhaps the biggest of Venter's accomplishments is that by 'going commercial' he has proven the 300-year-old dogma that scientific research is led by governments. Government, by definition built on consensus, simply fails to support the leading edge of the 'PostModern Era' of genomics in an adequate manner. This became most evident in the USA by Venter - but in other global regions now it is similar. Prof. Akoulitchev left Oxford University for his own company in the UK - outsourcing to India, Singapore and Silicon Valley - and now Mattick (of Sydney, Australia) 'went commercial' for reasons he knows best. In the USA, this trend is well established beyond Venter. This columnist [AJP] never even tried the impossible, to run against the establishment on 'government support'; accumulated IP 'clean as a whistle', totally separate from any kind of entanglements. Prof. George Church (Harvard) 'went commercial' with his Knome (see news items below) - within 3 days of China's announcement of global sequencing services, outsourcing it to China. Now, Mattick does the same; going commercial with explosive R&D, establishing a vital Australia-California linkage. (email omitted) junkdna.com, January 17, 2008]"

Email is documented for any contact, but I edited it out. Go commercial, gentlemen. Go around the slow beast, the dinosaur that refuses to recognize its own extinction. Slay the serpent with dexterity and get into the patent game as quickly as possible. I'm thinking of a CEO that might fund such projects if the investment pays off. Non-coding areas have been shown to be responsible for disease too. How many are unknown due to a simple lack of consensus research? Not all 49ers struck a golden vein, but at least they staked their claims in the rush that inevitably made many rich. I'm not advocating post-genetic research as a way to get rich, but the reality is that to break through barriers you must do so by many different methods. I remember ID put forward a business lecture on entrepreneurs and ID. If you're able to break down the wall by patentable research that leads to disease control or eradication, then the fruits will enable greater research and more workers on the team. But you need pre-sales and sales. You've got the brains and the engineers, but you lack door-to-door because you're not allowed in the door to begin with. As the world grows more competitive, opportunities are even greater now for those willing to take the risk. I'm betting on functional design in non-coded areas. So are many other researchers...

http://www.junkdna.com/#mattick_goes_commercial
Michaels7
January 23, 2008 at 12:50 PM PDT
rna, thanks for clarifying the difference between the study and the report. It is clear to me that the scientists, as usual, are clearer thinkers than the guys who report on them. What to do with 5,000 segments of DNA whose purpose is not established is an interesting question. I think it absolutely reasonable to consider that many of these ORFs have no function. Let the ID researchers prove the veracity of their theory in the lab. Let's find function for these 5,000 ORFs, and inform the world of the practicality of the theory.
bFast
January 23, 2008 at 12:46 PM PDT
Mapou:
One of the things I learned from working with complex computer software applications over the years is that any variation in the code, no matter how slight, is an absolute no-no because the likelihood of it resulting in a catastrophic malfunction increases with complexity.
My experience with software is that the more clearly structured the code, the more amenable it is to modification. Ultimately complexity is in tension with complication: the less complicated the code, the more complexity can be achieved. Alas, the same must be true of DNA code: to support advancement it must have a disciplined structure. It is clear that DNA code does have some structural discipline. Yet in a way it also still seems "wired for sound". I predict that as we come to understand DNA better, we will find that the current appearance that it is "wired for sound" is simply the product of our current ignorance of its structure, which will prove elegant.
bFast
January 23, 2008 at 12:37 PM PDT
Mapou: Yes! Thank you.
johnnyb
January 23, 2008 at 12:23 PM PDT
JohnnyB: "However, a non-materialist is forced to only giving materialist answers, no matter how much of a stretch it is to do so." Thanks for the explanation. Did you mean to write "a materialist is forced to only giving materialist answers"?
Mapou
January 23, 2008 at 11:55 AM PDT
Knocking out code is the first step, but another function may be as an extender of size, depending upon environmental conditions and diet. Maybe even such functions as longer snouts in dogs, for example, longer legs, even bigger beaks. So if you knock out something, you do not lose the function altogether, but the extension of that function.
Michaels7
January 23, 2008 at 11:51 AM PDT
Bad wrote: "Redundancy is only useful for that purpose if it is conserved, rather than allowed to drift over time. It is, however, a good avenue for evolutionary innovation, since the old gene product is retained, playing its former function, while a similar but not quite the same copy is then freer to vary, possibly hitting upon a better functional fit or a new function."

One of the things I learned from working with complex computer software applications over the years is that any variation in the code, no matter how slight, is an absolute no-no, because the likelihood of it resulting in a catastrophic malfunction increases with complexity. This, of course, does not preclude variations in the application's behavior, i.e., its interaction with its environment. However, such variations, if any, must be carefully controlled and anticipated. I would expect that the same would be true of any complex mechanism, including the genome.
Mapou
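The redundancy-for-robustness idea from the information sciences that is under discussion here is classically implemented in engineering as triple modular redundancy: three copies compute independently and a majority vote masks a single fault. A toy sketch, offered purely as an engineering analogy and not as a claim about how genomes actually work:

```python
# Triple modular redundancy (TMR): run replicated computations and
# take the majority vote, so one faulty copy cannot corrupt the result.
# An engineering analogy only, not a model of genomic redundancy.
from collections import Counter

def tmr(*results):
    """Return the strict-majority value among replicated results."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise ValueError("no majority - too many faulty replicas")
    return value
```

Note that TMR only works if the copies are kept identical, which is exactly the conservation requirement Bad raises: redundant copies that are free to drift stop being a reliable voting pool.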
January 23, 2008 at 11:46 AM PDT
Mapou: "So, what you are saying is that ID theory makes no assumptions about design or no design. It goes where the data leads." I agree in general, but think that your wording might reflect a slightly myopic view of the situation. Many people in the design community are in fact predisposed towards design. But that is precisely why ID is developing empirical methods -- so that the inference of design moves from being a private judgment call to a publicly demonstrable item. Another thing to note is this -- non-materialists are always free to use materialistic explanations whenever they fit. Non-materialists do not deny the material world; they just don't think that the material world is all-inclusive. So the non-materialist is free to use a material explanation wherever it is warranted. However, a non-materialist is forced to only giving materialist answers, no matter how much of a stretch it is to do so. They have no freedom in this regard. Therefore, even though biases exist on both sides, non-materialists are in fact in a much more objective position to weigh the evidence, given that they do not have to change their entire worldview to choose different types of causes.
johnnyb
January 23, 2008, 10:55 AM PDT
Sorry, but is it possible that you could re-write the article in light of what vestigial actually means, as well as something at least partially resembling mainstream biology's take on junk DNA (for instance, acknowledging that the original "Darwinian" expectation for the genome would be that EVERY part was highly tuned and functional)? I think everyone would benefit.
ajl: that is exactly what I was thinking. Would it be difficult to just remove the orphaned genes from the mice and see what they produce after a few generations? If there is no difference, then I suppose the genes really do have no function.
Good idea. And in fact, these sorts of experiments, done more than a decade ago and before, were what helped establish the basics of the "we should probably demonstrate a discrete function before declaring it functional" school of thought that the term "junk DNA" actually represents in biology.
Mapou: JPCollado, is it possible that so-called “junk tandem repeat DNA” might be part of a quality control mechanism within the genome? In the information sciences, redundancy is often used in mission-critical applications as an effective way to increase robustness.
Redundancy is only useful for that purpose if it is conserved, rather than allowed to drift over time. It is, however, a good avenue for evolutionary innovation, since the old gene product is retained, playing its former function, while a similar but not quite the same copy is then freer to vary, possibly hitting upon a better functional fit or a new function.
Bad
January 23, 2008, 10:50 AM PDT
rna: There are multiple problems with the procedure.

First of all, an ORF is _presumed_ non-genic if it isn't in line with evolutionary explanations. This IS the exact problem we've had with vestigiality throughout biology. When something is presumed non-functional, few people bother to look. Look in particular at this quote: "we reviewed the scientific literature for published articles mentioning the orphans to determine whether there was experimental evidence for encoded proteins". The criterion was whether or not people had _already published_ on the ORF. This is just another way of assuming the conclusion. If we don't know whether an ORF encodes a protein, it is usually because it hasn't been studied! So they are using the fact that an ORF hasn't been studied as evidence that it is non-functional! This is particularly problematic with the human genome, as it is more difficult to test for functionality in complex cases where it would be unethical to do the required experiment.

On top of that, at the beginning of the paper, they actually _suggested_ that the medical community skip over the ORFs that they were excluding when researching genetic diseases. So the cataloguing actually makes a difference.

Now, what they could have done, which would have been much saner, is to say, "let's _only_ include genes which we have experimental evidence of functioning as genes in our catalog." But they didn't. They included ORFs for which there is no evidence of function in the catalog, provided that they are evolutionarily conserved! If you think hard you can see all sorts of problems with this. To begin with, the neo-Darwinistic view of the genome, where the protein-coding genes are the most important, is again rearing its ugly head. They are assuming that if it is evolutionarily conserved, it must be coding proteins. How preposterous in this day and age to assume that! Many important parts are in the regulation. But neo-Darwinism must live with simplistic systems, since it is only allowed simplistic mechanisms.
johnnyb
January 23, 2008, 10:47 AM PDT
larrynormanfan: This has nothing to do with common descent. If the genome is responsible for its own changes (i.e., it has the ability to deploy new proteins on the fly in response to need), then even if common descent is true, there is no need for novel proteins to arise gradualistically. Likewise, if the genome is programmed to change and diverge, there is no limit to the amount of creativity in a single generation that could be programmed in. Also, if you take the progressive approach (the designer had multiple design activities, but still used birth as the mechanism to go from one generation to the next), then there is no reason to assume that the changes must be small. The idea that the changes must be small and conserved through evolution ONLY arises through neo-Darwinistic assumptions.
johnnyb
January 23, 2008, 10:37 AM PDT
JPCollado, is it possible that so-called "junk tandem repeat DNA" might be part of a quality control mechanism within the genome? In the information sciences, redundancy is often used in mission-critical applications as an effective way to increase robustness.
Mapou
January 23, 2008, 10:33 AM PDT
When Dawkins writes that creationists should spend "some earnest time speculating on why the Creator should bother to litter genomes with untranslated pseudogenes and junk tandem repeat DNA" -- is he, perchance, seriously offering us a litmus test whereby creationism (sic), or a postulate thereof, would be open to vindication if the opposite is found to be true? If not, then why make a statement whose confirmation or refutation neither proves nor advances a point? I also wonder if statements like these would ever be retracted, in the name of professionalism and cordiality, with an earnest nod to "creationists."
JPCollado
January 23, 2008, 10:01 AM PDT
So this is what the paper (and not the press release) is actually saying:

'Experimental Evidence of Encoded Proteins. As an independent check on our conclusion, we reviewed the scientific literature for published articles mentioning the orphans to determine whether there was experimental evidence for encoded proteins. Whereas the vast majority of the well studied genes have been directly shown to encode a protein, we found articles reporting experimental evidence of an encoded protein in vivo for only 12 of 1,177 orphans, and some of these reports are equivocal. The experimental evidence is thus consistent with our conclusion that the vast majority of nonconserved ORFs are not protein-coding. In the handful of cases where experimental evidence exists or is found in the future, the genes can be restored to the catalog on a case-by-case basis.'

and:

'Specifically, we propose that nonconserved ORFs should be INCLUDED in the human gene catalog if there is clear experimental evidence of an encoded protein.' (capital letters mine)

I think that this is a rather sensible approach, and along the lines of what many people here have proposed. These guys tried to establish a lower limit on the number of genes with a high level of confidence, and to have a starting point for adding new genes. So the press release appears to be really distorting things.
rna
January 23, 2008, 09:50 AM PDT
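The inclusion rule rna quotes from the paper can be written out as a small decision function. This is only a sketch of the rule as quoted in the thread; the field names `conserved` and `experimental_evidence` are my own labels, not the paper's.

```python
def include_in_catalog(orf):
    """Sketch of the cataloging rule quoted above: conserved ORFs stay in
    the catalog, while nonconserved ORFs are included only if there is
    clear experimental evidence of an encoded protein (and can be restored
    case-by-case if evidence appears later)."""
    if orf["conserved"]:
        return True
    return orf["experimental_evidence"]

# Of 1,177 nonconserved orphans, only 12 had reported protein evidence:
orphan = {"conserved": False, "experimental_evidence": False}
evidenced_orphan = {"conserved": False, "experimental_evidence": True}
print(include_in_catalog(orphan))            # False -> dropped from catalog
print(include_in_catalog(evidenced_orphan))  # True  -> retained
```

The sketch also makes johnnyb's objection visible: a conserved ORF passes the first branch and enters the catalog with no experimental evidence at all, while a nonconserved ORF is excluded by default.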
ajl: I'm not a biologist, so this seems like a no-brainer to me (we do this in my field of science all the time -- knock out some function to see how sensitive the overall model is to that function).

The problem I see with this is that the orphan genes may normally be switched off by regulatory sequences. If so, one would observe no adverse effect from knocking out those genes. If it is found that the genes are not expressed, the next step would be to turn them on and see what happens. However, in such a case, I'm not sure how one would interpret the results, good or bad.
Mapou
January 23, 2008, 09:23 AM PDT
@ DaveScot: "knock out a bunch of orphan genes in a mouse and see what happens to the GM mice"

Dave, that is exactly what I was thinking. Would it be difficult to just remove the orphaned genes from the mice and see what they produce after a few generations? If there is no difference, then I suppose the genes really do have no function. But after the 3rd or 4th generation, I wonder if we would start to see anomalies in the mice. Another experiment would be to cross-breed the knock-out mice with regular mice that have the orphaned genes and see what happens. I'm not a biologist, so this seems like a no-brainer to me (we do this in my field of science all the time -- knock out some function to see how sensitive the overall model is to that function).
ajl
January 23, 2008, 09:03 AM PDT
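The knockout approach ajl describes, disabling one component and measuring how much the overall output changes, is the same idea as an ablation or sensitivity test in modeling. A toy sketch, with an entirely hypothetical model and numbers:

```python
def model(features):
    """Toy 'organism': the output depends on some inputs far more than others."""
    return 3.0 * features[0] + 0.5 * features[1] + 0.0 * features[2]

baseline_inputs = [1.0, 1.0, 1.0]
baseline = model(baseline_inputs)

# "Knock out" each feature in turn and see how far the output moves.
for i in range(len(baseline_inputs)):
    knocked = list(baseline_inputs)
    knocked[i] = 0.0
    effect = abs(model(knocked) - baseline)
    print(f"feature {i}: effect of knockout = {effect}")
# Feature 2 plays the role of the 'orphan': knocking it out changes nothing.
```

Mapou's caveat maps onto the sketch too: a zero effect only shows the component does nothing under the conditions tested, not that it does nothing under all conditions.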
JohnnyB, thanks for that clarification on ID. So, what you are saying is that ID theory makes no assumptions about design or no design. It goes where the data leads. By contrast, Darwinists always assume a naturalistic explanation for the data. That's a foolish way to do science.
Mapou
January 23, 2008, 08:38 AM PDT