Uncommon Descent – Serving The Intelligent Design Community

Cells process signals – by evolution or ID?


The 2005 signal processing review by Berryman, Allison, Wilkinson, and Abbott provides a fascinating insight into how cells operate and how similar they are to human and computer processing! 🙂 Or did all this functionality come about by “evolution”? 🙄 Consider:

Signal processing is the use of mathematical techniques to analyze any data signal. This data could be an image, a sound, or any other sequence of data, such as a sequence of nucleotides. The sequences of interest could be protein coding regions, repeating elements that may be associated with various diseases (such as Huntington’s disease [7]) or regions rich in some set of complementary bases, such as A and T, which can give information on evolutionary history including lateral gene transfer in bacteria [8]. . . .
Signal processing is not just a human enterprise – even individual cells process signals in the form of mRNA, protein, and more general chemical levels (for example sugars in the environment) [16, 17, 18, 19]. As with conventional computers, cells can be genetically programmed to process signals [20, 21, 22]. As in electrical circuits, switching elements can be built in, and positive and negative feedback loops are present, enabling a range of behaviours to be “programmed”, such as chemical oscillations of a predetermined frequency. Such engineered “gene circuits” could have important applications in gene therapies where we wish to modify the existing protein and cellular interactions in an organism. . . . (emphasis added)
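As an editorial aside, not something from the review: the quoted idea of feedback loops enabling “chemical oscillations of a predetermined frequency” can be made concrete with the textbook repressilator, three genes each repressing the next in a ring. This is a minimal sketch; the dimensionless equations are the commonly used Elowitz-Leibler toy form, and the parameter values are illustrative assumptions chosen to sit in the oscillatory regime reported in the literature.

```python
import numpy as np

# Toy repressilator: three genes, each repressing the next in a ring.
# Dimensionless equations in the commonly used Elowitz-Leibler toy form;
# all parameter values are illustrative, not taken from Berryman et al.
alpha, alpha0, beta, n = 216.0, 0.2, 5.0, 2.0

def step(m, p, dt):
    """One Euler step for mRNA levels m[0:3] and protein levels p[0:3]."""
    repressor = p[[2, 0, 1]]          # gene i is repressed by protein i-1
    dm = -m + alpha / (1.0 + repressor ** n) + alpha0
    dp = -beta * (p - m)
    return m + dt * dm, p + dt * dp

m = np.array([1.0, 2.0, 3.0])         # asymmetric start to break symmetry
p = np.zeros(3)
dt, steps = 0.01, 6000
trace = []
for _ in range(steps):
    m, p = step(m, p, dt)
    trace.append(p[0])

# Sustained oscillation shows up as repeated large swings in protein 1.
print("protein 1 range over the run:", min(trace), "to", max(trace))
```

Plotting `trace` shows the regular oscillation; switching elements plus negative feedback of exactly this kind are what such “gene circuits” are built from.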

2.4. Linguistics
Since DNA and amino acid sequences can be thought of as a type of language, there is interest in the use of techniques from computational linguistics to analyze genetic sequences. This theory of grammar in a computational sense was first developed by Chomsky [52, 53]. It has been applied to a wide range of applications in sequence analysis from determining gene structures [54] to RNA (ribonucleic acid) secondary structure [55]. Mantegna et al. have taken methods from statistical linguistics, along with information theory approaches, to consider differences between non-coding and coding DNA [56, 57, 58]. This reveals the presence of hidden information and extra redundancy in non-coding regions, perhaps due to lengthy promoter regions [59], or due to information left from now defunct coding regions. A good overview of linguistic techniques used can be found in Durbin et al. [13].
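To give the grammar connection a concrete face, here is a minimal editorial sketch, not code from the review, of the classic Nussinov dynamic program for RNA secondary structure: it computes the maximum number of nested complementary base pairs, and that nested long-range pairing is the standard example of context-free (rather than merely regular) structure in sequences.

```python
from functools import lru_cache

# Watson-Crick pairs plus the G-U wobble pair.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def max_pairs(seq, min_loop=3):
    """Nussinov-style DP: maximum number of nested base pairs in seq.
    min_loop = minimum number of unpaired bases inside a hairpin loop."""
    @lru_cache(maxsize=None)
    def best(i, j):
        if j - i <= min_loop:                 # too short to pair
            return 0
        score = best(i, j - 1)                # option 1: j stays unpaired
        for k in range(i, j - min_loop):      # option 2: j pairs with some k
            if (seq[k], seq[j]) in PAIRS:
                score = max(score, best(i, k - 1) + 1 + best(k + 1, j - 1))
        return score
    return best(0, len(seq) - 1)

print(max_pairs("GGGAAAUCCC"))   # prints 3: the G-C stem of a small hairpin
```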

The PROSITE database contains a large number of protein families (related sequences), and their patterns, or “motifs” [60, 61]. This database can be searched using PROSITE patterns; an example of a pattern is
[ACFI]-[QC]-G-[AF]
where the capital letters denote amino acids. . . .
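A hedged sketch of how such a pattern is searched in practice: PROSITE’s dashes and bracketed alternatives map almost directly onto regular expressions. The converter below handles only the subset of the syntax shown in the example above, and the sample sequence is made up.

```python
import re

def prosite_to_regex(pattern):
    """Convert a simple PROSITE pattern such as '[ACFI]-[QC]-G-[AF]' to a
    Python regex. Handles only literal residues and [...] alternatives."""
    parts = pattern.split("-")
    return "".join(p if p.startswith("[") else re.escape(p) for p in parts)

regex = prosite_to_regex("[ACFI]-[QC]-G-[AF]")   # -> '[ACFI][QC]G[AF]'
sequence = "MKTACGAFLLIQGFWT"                    # made-up protein sequence
for m in re.finditer(regex, sequence):
    print(m.start(), m.group())                  # position and matched motif
```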

3.3. Time series analysis of gene expression data
Of interest to geneticists is not only what happens in the expression levels of two different samples at a fixed point in time, but how the expression levels vary over a number of different points in time. These experiments must be designed properly to ensure statistically significant information can be derived [79]. A number of different signal processing techniques have been developed to analyze such data, as well as “gene clusters” (sets of related genes) in microarrays. An overview of gene clustering algorithms can be found in Moreau et al. [81], and below we discuss some time series approaches.
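As one concrete, editorial example of what clustering expression time series involves, the sketch below hierarchically clusters synthetic expression profiles under a correlation distance, which groups genes whose levels rise and fall together regardless of absolute level. All data are made up.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 12)    # 12 time points

# Synthetic data: two groups of genes following two temporal programs.
wave_a, wave_b = np.sin(t), np.cos(2 * t)
genes = np.vstack(
    [wave_a + 0.1 * rng.standard_normal(12) for _ in range(5)] +
    [wave_b + 0.1 * rng.standard_normal(12) for _ in range(5)])

# Correlation distance clusters profiles by shape, not by magnitude.
Z = linkage(genes, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)   # expect the first 5 genes in one cluster, the last 5 in another
```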

M. J. Berryman, A. Allison, C. R. Wilkinson, and D. Abbott, “Review of signal processing in genetics,” Fluctuation and Noise Letters, Vol. 5, No. 4, pp. R13-R35, 2005.

Can signal processing be explained exclusively by the four forces of nature? 😕

On Berryman et al.’s discussion of “Linguistics”, see

Prof. Dr.-Ing. Werner Gitt’s hierarchy of information in his book In the Beginning was Information (ISBN 3-89397-255-2). Gitt develops five levels of information:

  • Fifth level Apobetics: Intended purpose & achieved result
  • Fourth level Pragmatics: Expected and implemented actions
  • Third level Semantics: Ideas communicated and understood
  • Second level Syntax: Code employed and understood
  • First level Statistics: Signal transmitted and received

Berryman et al.’s review discusses the first level (Statistics) and the second level (Syntax), and possibly touches on some of the third level (Semantics).

What basis can Darwinism provide for ANY of Gitt’s five levels of information?
We have clear current and historic evidence of intelligent agents being the direct cause of signal processing and computer programs. I find Intelligent Design provides a more satisfactory explanation for the signal processing found in cells than the four stochastic forces of nature do!

I recommend ID practitioners explore the potential for using each of the methods reviewed by Berryman et al. to detect ID and distinguish it from stochastic processes or processes due to the laws of nature. Note:
  • Appendix A. Hidden Markov Models

In a hidden Markov model, the states are unknown and must be inferred from the data. . . .
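To show what inferring the hidden states looks like, here is a minimal editorial sketch, not the review’s Appendix A: Viterbi decoding of a toy two-state HMM that segments DNA into AT-rich and GC-rich regions. The model and every probability in it are illustrative assumptions.

```python
import math

# Toy 2-state HMM over DNA: state 0 = AT-rich region, state 1 = GC-rich region.
# All probabilities are illustrative, not taken from Berryman et al.
states = [0, 1]
start = [0.5, 0.5]
trans = [[0.9, 0.1],
         [0.1, 0.9]]
emit = [{"A": 0.35, "T": 0.35, "G": 0.15, "C": 0.15},   # AT-rich
        {"A": 0.15, "T": 0.15, "G": 0.35, "C": 0.35}]   # GC-rich

def viterbi(seq):
    """Most likely hidden state path, computed in log space."""
    V = [[math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states]]
    back = []
    for ch in seq[1:]:
        row, ptr = [], []
        for s in states:
            best_prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row.append(V[-1][best_prev] + math.log(trans[best_prev][s])
                       + math.log(emit[s][ch]))
            ptr.append(best_prev)
        V.append(row)
        back.append(ptr)
    # Trace back from the best final state.
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

print(viterbi("ATATATGCGCGCGCATAT"))   # expect 0s, then 1s, then 0s
```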

  • Appendix B. Fourier Transforms

Fourier transforms are used in a wide range of applications such as voice prints for evidence in criminal cases, compressing images, removing noise from music, and of course in DNA sequence analysis (Subsec. 2.2) . . .
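For the DNA case, a standard trick, sketched here editorially with made-up sequences, is to look for the period-3 component that protein-coding regions tend to show: turn each base into a 0/1 indicator sequence, take the FFT, and check the power at frequency 1/3.

```python
import numpy as np

def period3_strength(seq):
    """Crude coding-region indicator: summed FFT power at frequency 1/3
    of the 0/1 indicator sequence for each base."""
    n = len(seq)
    k = n // 3                                # index of the 1/3-frequency bin
    power = 0.0
    for base in "ACGT":
        x = np.array([1.0 if b == base else 0.0 for b in seq])
        X = np.fft.fft(x - x.mean())          # remove the mean so bin 0 is quiet
        power += abs(X[k]) ** 2
    return power / n

coding_like = "ATG" * 12                      # exaggerated codon repeat, period 3
random_like = "ATCGGATTACAGCTTAGCATCGGATCCGTAATGCAA"   # same length, no repeat
print(period3_strength(coding_like), period3_strength(random_like))
# the 3-periodic sequence scores much higher than the aperiodic one
```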

  • Appendix C. Mutual Information

The mutual information function, introduced in Subsec. 2.3, for symbols at distance d apart is given in Eq. C.12, . . .
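Eq. C.12 itself is not reproduced in this post, but under the standard definition the computation is short: estimate the joint distribution of symbol pairs a distance d apart and compare it against the product of the marginals. A minimal sketch:

```python
import math
from collections import Counter

def mutual_information(seq, d):
    """I(d) = sum over (x, y) of p_xy(d) * log2( p_xy(d) / (p_x * p_y) ),
    estimated from the symbol pairs (seq[i], seq[i + d])."""
    pairs = [(seq[i], seq[i + d]) for i in range(len(seq) - d)]
    n = len(pairs)
    joint = Counter(pairs)
    left = Counter(x for x, _ in pairs)
    right = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((left[x] / n) * (right[y] / n)))
               for (x, y), c in joint.items())

print(mutual_information("ATATATATATATATAT", d=1))   # ~1 bit: A predicts T
print(mutual_information("ATATATATATATATAT", d=2))   # 1 bit: A predicts A
```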

  • Appendix D. Spline Curves

The use of spline curves was introduced in Subsec. 3.3. With a spline curve, one approximates a curve using a set of basis functions (often polynomials) that are fitted to the function at a set of points where the function used to approximate the curve can change, but must meet certain specifications (often ones designed to make the spline curve look smooth); conditions are also specified on the ends of the spline curve. . . .
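A minimal editorial example with synthetic data: fit a smoothing cubic spline to noisy time-course measurements and evaluate it between the sampled points, here using SciPy’s UnivariateSpline.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 20)                       # 20 sampling times
y = np.sin(t) + 0.2 * rng.standard_normal(20)    # noisy "expression" values

# k=3 gives cubic pieces; s trades closeness of fit against smoothness.
spline = UnivariateSpline(t, y, k=3, s=0.5)

t_fine = np.linspace(0, 10, 200)
print(spline(t_fine)[:5])    # the smooth curve evaluated between the samples
```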

See their signal processing review paper for all the equations, etc.

Comments
DLH: “You may be referring to Shannon channel capacity in which a random signal has the same ‘information’ as a highly prescribed signal.”

I’m referring to Dembski’s definition of specified complexity: χ = −log₂[10^120 · φ(T) · P(T|H)]. I had assumed that you were referring to Dembski’s work, since you recommended his book. Are you using a different definition of specified complexity? Thanks for the pointer to Marks’ presentation. I’ve read it before, and while I don’t buy into the common philosophical arguments based on incompleteness/non-computability, they make for interesting reading. And I fully agree with Marks’ position, “Theism or Atheism is not directly determined by intellect.”
R0b
September 13, 2009 at 8:44 PM PDT

ROb ". . .all else being equal, simple patterns have more specified complexity than complicated patterns.. . .Under a hypothesis of pure noise, the 1812 Overture has the same complexity as any other signal." You may be referring to Shanon channel capacity in which a random signal has the same "information" as a highly prescribed signal. Prescribed specified complexity is the OPPOSITE of the pure noise of a random signal. Take N signals from the respective processes and add them. A signal with specified complexity will be preserved. A random signal of 0 and 1 will average out to 1/2 for each bit. There is a huge difference - despite the lack of conventional theory. A pure signal is the simplest, with the smallest amount of specified information required. It is NOT the most complex. PS you may enjoy R.J. Marks II, “Gödel to Turing to Chaitin to the Edge of Naturalism: Some Things Computers Will Never Do” B.E.A.R.S. (September 28, 2007), (ppt).DLH
DLH
September 13, 2009 at 6:30 PM PDT

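A quick numeric illustration of DLH’s averaging argument above, added editorially; the bit pattern, noise level, and N are arbitrary choices. Averaging many noisy copies of one fixed pattern recovers the pattern, while averaging independent random bit strings flattens toward 1/2:

```python
import numpy as np

rng = np.random.default_rng(42)
N, L = 1000, 16                       # N repeated signals of L bits each

pattern = rng.integers(0, 2, L)       # one fixed "prescribed" bit pattern
noisy_copies = pattern + 0.3 * rng.standard_normal((N, L))
pure_noise = rng.integers(0, 2, (N, L)).astype(float)

print(pattern)
print(np.round(noisy_copies.mean(axis=0), 2))   # recovers the pattern
print(np.round(pure_noise.mean(axis=0), 2))     # flattens toward 0.5
```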
DLH, certainly the 1812 Overture smacks of design more than a simple periodic signal does. What I’m challenging is the notion that specified complexity captures this fact. On the contrary, all else being equal, simple patterns have more specified complexity than complicated patterns.

“The physics of pulsars has been reasonably well known for some time.”

Yes, thus my assumption that neither signal had a known explanation, i.e. the source of pulsar signals had not yet been discovered. If there’s a ready natural explanation for something, then a specified complexity analysis yields a negative with zero confidence. That is, it tells us nothing.

“Without a prior specification, I would tend to favor the second as more likely to be an intelligent signal as it has characteristics of specified complexity without chaotic features.”

The problem is that the pulsar signal has more specified complexity than the 1812 Overture, by virtue of its simplicity.

“From the level of analysis above, by applying the explanatory filter, I would infer the opposite, that the pulsar is natural not ID.”

The EF cannot tell us that something is natural, as there is nothing to prevent false negatives.

“Given evidence of the score for Tchaikovsky’s 1812 Overture as a complex specification,”

How is the 1812 Overture more complex than any other signal of unknown origin, given Dembski’s usage of the term complex?

“Any probability practitioners who would care to make a first cut for the degree of complexity of the 1812 Overture?”

Under a hypothesis of pure noise, the 1812 Overture has the same complexity as any other signal.
R0b
September 12, 2009 at 7:13 PM PDT

The remarkable precision of pulsars can be seen in the development of pulsars as a time base with the potential of 10^-16 uncertainty. See: R. N. Manchester, “The Parkes Pulsar Timing Array Project,” AIP Conf. Proc. 983:584-592, 2008; arXiv:0710.5026v2
DLH
September 12, 2009 at 6:39 PM PDT

As CJYman notes, the explanatory filter is used to distinguish natural from intelligent causes. Pulsars are primarily characterized by a principal repetition frequency. See the ATNF Pulsar Catalog. Note some frequencies are evaluated to 15 to 16 significant figures. A refinement would be a linear or exponential rate of decay. I.e., a pulsar can be modeled with two or three parameters. From just the number of parameters needed to model the signals, the first (pulsar) signal could be natural or intelligent, while the second signal (Tchaikovsky’s 1812 Overture) is highly unlikely to be natural.

Then natural law comes to bear. The physics of pulsars has been reasonably well known for some time, e.g. P. A. Sturrock, “A Model of Pulsars,” Astrophysical Journal, vol. 164, p. 529. Thus the first signal can readily be explained as a natural cause with no need to appeal to an intelligent cause.

R0b: “Without a prior specification, I would tend to favor the second as more likely to be an intelligent signal as it has characteristics of specified complexity without chaotic features.”

From the level of analysis above, by applying the explanatory filter, I would infer the opposite: that the pulsar is natural, not ID. Given evidence of the score for Tchaikovsky’s 1812 Overture as a complex specification, that provides a strong basis that the second signal is caused by an intelligent cause. Any probability practitioners who would care to make a first cut for the degree of complexity of the 1812 Overture?
DLH
September 12, 2009 at 6:20 PM PDT

CJYman, the EF is not a requirement in addition to the specified complexity criterion. Rather, it’s a method for determining whether something exhibits specified complexity. Under a random noise hypothesis, a simple periodic signal is no more probable than the 1812 Overture. Why, then, do you say that it’s low contingency?
R0b
September 12, 2009 at 3:43 PM PDT

Rob: "Assuming that the explanation for both signals is unknown, the design inference for the pulsar signal is stronger than the design inference for the 1812 Overture. Do you agree?" Yes, that is a possibility, which is why the Explanatory filter is also necessary. Since the pulsar can be described as a regularity (low contingency), and laws are mathematical descriptions of regularities, we would defer to a law-like process as an explanation for the pulsar.CJYman
CJYman
September 12, 2009 at 2:56 PM PDT

DLH: in your opinion, does it matter which two base sources you use to set your baseline? Does that affect the splitting point between natural and non-natural . . . if that’s the right word. How many known natural and known designed sources should be analyzed to establish a baseline? Is two enough?
ellazimm
September 12, 2009 at 2:20 PM PDT

“The number of frequencies and amplitudes could be used as a measure of the specified complexity of the two signals.”

Good idea. The “specified complexity” measure is inversely proportional to the descriptional complexity of the signal. Assuming that the explanation for both signals is unknown, the design inference for the pulsar signal is stronger than the design inference for the 1812 Overture. Do you agree?
R0b
September 12, 2009 at 8:43 AM PDT

PS: I chose the pulsar Vela for the first sound. Pulsar signals were originally labeled “LGM” for “Little Green Men” because they were considered potentially extraterrestrial intelligent signals. Cf. H. Paul Shuch, “Standards of Proof for the Detection of Extra-Terrestrial Intelligence” (1999), The SETI League. See Pulsar Sounds for other examples of pulsar sounds. The second signal was Tchaikovsky’s 1812 Overture. At a lower level, we can see that it is scored for at least two dozen instruments. At a higher level, the musical themes can be considered, etc. See Tchaikovsky’s musical score for another measure of prior specified complexity that could be compared with the audio frequency analysis.
DLH
September 12, 2009 at 7:20 AM PDT

ellazimm: Good question. Consider two signals. One signal exhibits a broad spectrum source with an extremely stable repetition rate. The second shows a complex structure. This reveals a number of acoustic sources distinguishable by their frequency distribution, which enter and leave with coordinated timing and collective variation in amplitude. From the frequency patterns, could you distinguish either as from natural or intelligent causes? Here are frequency analyses typical of the first category of source.

To give further perspective, the signals can be analyzed from an ontological point of view where “there are 122 concepts in the ontology and over 170 relations and attributes.” See: “Creativity in Music as a Measure of Distances on Ontologies,” Jacques Calmet, Anusch Daemi, Stefan Kink and Thomas A. Troge.

It is possible to synthesize the different sources as a sequence of pure sounds with controlled amplitudes, e.g. see The Fourier Philharmonic. Corresponding video signals could also be examined! The number of frequencies and amplitudes could be used as a measure of the specified complexity of the two signals. This could be used to quantitatively distinguish the two sources as natural vs. ID (see the sketch after this comment).

To apply these measures to distinguish between materialistic and intelligent causes, see William Dembski, No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence (2002), Rowman & Littlefield, ISBN 0-7425-1297-5. Does this get you started?
DLH
September 12, 2009 at 7:11 AM PDT

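A rough editorial sketch of DLH’s peak-counting proposal above, not DLH’s own method: count the FFT magnitude peaks above a threshold, so a single stable tone scores low while a many-source signal scores high. The synthetic signals and the 10% threshold are arbitrary assumptions.

```python
import numpy as np

def count_spectral_peaks(x, rel_threshold=0.1):
    """Count FFT magnitude peaks above rel_threshold * max, a crude proxy
    for the 'number of frequencies and amplitudes' in a signal."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    cutoff = rel_threshold * spectrum.max()
    inner = spectrum[1:-1]
    # A peak: above the cutoff and larger than both neighbouring bins.
    return int(np.sum((inner > cutoff) &
                      (inner > spectrum[:-2]) & (inner > spectrum[2:])))

t = np.linspace(0, 1, 4096, endpoint=False)
pulsar_like = np.sin(2 * np.pi * 11 * t)             # one stable tone
orchestra_like = sum(a * np.sin(2 * np.pi * f * t)   # several coordinated tones
                     for f, a in [(110, 1.0), (220, 0.8), (330, 0.6),
                                  (440, 0.9), (554, 0.5), (659, 0.7)])
print(count_spectral_peaks(pulsar_like))     # 1
print(count_spectral_peaks(orchestra_like))  # 6
```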
How would you use Fourier transforms to “detect ID”? What kind of process?
ellazimm
September 11, 2009 at 1:26 PM PDT
