Information – a fundamental entity for understanding life!
Werner Gitt
German Federal Institute of Physics and Technology. Former Director and Head of Information Technology.
Dr.
Gitt obtained an undergraduate degree in engineering from the Technical
University of Hannover in 1968 and completed his Ph.D. summa cum laude
in 1970 from the Technical University of Aachen, which also awarded him
its prestigious “Borchers Medal.” In 1971 Werner Gitt started his career
at the German Federal Institute of Physics and Technology in Brunswick,
being promoted to Director and Professor in 1978. He served as Head of
“Information Technology” from 1971 to 2002, when he retired. He is the
author of numerous research papers dealing with information science,
numerical mathematics, and control engineering. He is a widely sought-after author and speaker.
Abstract
Good
science and logical deductions require that critical terms in any field
be unambiguously defined. Although everyone has an intuitive idea of what "information" means, the term has so far lacked such a definition. After
working out a definition of information, I proceed to formulate
scientific laws for this nonmaterial entity, information, from which it
is possible to draw sound conclusions. These laws exclude the
possibility that information, including biological information, can
arise purely from matter and energy without reference to an intelligent
agent. As such these laws show that the neo-Darwinian theory of
evolution cannot in principle account for the most fundamental
biological phenomenon. In addition, the laws here presented give
positive ground for attributing the origin of biological information to
the conscious, willful action of a sender. The far-reaching implications
of these laws are discussed.
Abstract
It
is commonly argued that the spectacular increase in order which has
occurred on Earth does not violate the second law of thermodynamics
because the Earth is an open system, and anything can happen in an open
system as long as entropy increases outside the system compensate for the entropy decreases inside the system. However, if we define "X-entropy" to
be the entropy associated with any diffusing component X (for example, X
might be heat), and, since entropy measures disorder, “X-order” to be
the negative of X-entropy, a closer look at the equations for entropy
change shows that they not only say that the X-order cannot increase in a closed system, they also say that in an open system the X-order cannot increase faster than it is imported through the boundary. Thus the equations for entropy change do not support the "compensation" idea; they instead illustrate the tautology that "if an increase in order is
extremely improbable when a system is closed, it is still extremely
improbable when the system is open, unless something is entering which
makes it not extremely improbable.” Thus unless we are willing to argue
that the influx of solar energy into the Earth makes the appearance of
spaceships, computers and the internet not extremely improbable, we have
to conclude that the second law has in fact been violated here.
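For concreteness, the kind of inequality the abstract appeals to can be written out for the case X = heat. The following is a sketch of the standard thermal-entropy balance, assuming Fourier conduction with heat flux q = -k grad T in a region Omega with boundary; the notation here is illustrative rather than the author's own:

\[
\frac{dS}{dt} \;=\; \int_{\Omega} \kappa\,\frac{|\nabla T|^{2}}{T^{2}}\, dV \;-\; \oint_{\partial\Omega} \frac{\mathbf{q}\cdot\mathbf{n}}{T}\, dA \;\ge\; -\oint_{\partial\Omega} \frac{\mathbf{q}\cdot\mathbf{n}}{T}\, dA .
\]

Since the volume term is non-negative, the thermal entropy S can decrease no faster than entropy is exported across the boundary; equivalently, the "X-order" (the negative of S) cannot increase faster than it is imported, which is the statement the abstract uses.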
A second look at the second law
Granville Sewell
Mathematics Department, University of Texas, El Paso.
Granville Sewell is Professor
of Mathematics at the University of Texas at El Paso (UTEP). He
completed his Ph.D. in Mathematics at Purdue University, and has
subsequently been employed by (in chronological order) Universidad Simon
Bolivar (Caracas), Oak Ridge National Laboratory, Purdue University,
IMSL Inc. (Houston), UTEP, The University of Texas Center for High
Performance Computing (Austin), and Texas A&M University, and is
currently back at UTEP. He spent one semester (Fall 1999) teaching at
Universidad Nacional de Tucuman in Argentina, on a Fulbright grant, and
returned to Universidad Simon Bolivar to teach summer courses in 2005
and 2008. Sewell has written three books on numerical analysis, and is
the author of a widely-used finite element computer program (video at www.roguewave.com/pde2d).
Pragmatic Information
John W. Oller, Jr.
Hawthorne Regents Professor IV, Department of Communicative Disorders, University of Louisiana at Lafayette.
John
W. Oller, Jr., Ph.D. founded the Department of Linguistics at the
University of New Mexico in 1972 and the Applied Language and Speech
Sciences Ph.D. Program at UL Lafayette in 2001. Oller's research has
concentrated on the theory and experimental measurement of linguistic
processes in education, high stakes testing, the diagnosis of disorders,
the success of social interactions, and more recently on genetic
systems, biochemistry, repair and disease defenses, etc. Winner of the
Mildenberger Prize offered by the Modern Language Association, Oller is
the author of over 200 peer-reviewed papers and monographs along with 16
books largely in experimental measurement and research on theories of
linguistics and sign systems in general. His 2010 works include a book
on the causes of autism, an encyclopedic reclassification of
communication disorders and related disease
conditions,
and a monograph-sized contribution to the peer-reviewed multidisciplinary open-access journal Entropy. The latter deals with the process of pragmatic mapping (as in referring to an object, person, event, relation, or sequence of them) as found in genetics, the dynamics of immune systems, and the distinct neuroarchitecture of the human brain.
Abstract
Common true narrative representations (TNRs), such as "I had yogurt for breakfast" or any viable expression of a genome in an organism, are, relative to all other representations, "perfect" (complete). They
are deeply layered and pervasively interdependent as Sanford and others
show. Logicomathematical proofs show why mutating genetic TNRs with
toxins, viruses, and radiant energy must lead eventually to disorder,
mortality, and even extinction. Deeply interdependent TNRs are
prerequisite to the biosphere. The unique perfections of TNRs cannot be
found in fictions, errors, lies, or nonsense fragments. Those
perfections refute neo-Darwinism. Pragmatic information, absolutely
dependent on TNRs, is provably infinitely complex, abstract, eternal,
and immaterial. TNRs connect with each other and generalize to the whole
material universe. Proofs show that TNRs provide the only basis for
communication, valid measurements, mathematical representations, and
life itself. Also, each presupposes the rest, so they cannot arise
piecemeal.
Multiple overlapping genetic codes profoundly reduce the
probability of beneficial mutation
George D. Montañez
B.S. in Computer Science, University of California, Riverside (2004); M.S. in Computer Science, Baylor University (2011)
George
D. Montañez is a graduate student in the Machine Learning
department, School of Computer Science, at Carnegie Mellon
University. His research interests include predictive state model
reconstruction, information properties of genetic
algorithms, conservation of information in machine learning, and
machine learning methods for textual data mining. He served as a
research assistant to Dr. Robert J. Marks II at Baylor University.
Abstract
There
is growing evidence that much of the DNA in higher genomes is
poly-functional, with the same nucleotide contributing to more than one
type of code. Such poly-functional DNA should logically be multiply
constrained in terms of the probability of sequence improvement via
random mutation. We describe a model relating the degree of poly-functionality to the degree of constraint on mutational improvement. We show that: a) the probability of
beneficial mutation is inversely related to the degree that a sequence
is already optimized for a given code; b) the probability of beneficial
mutation drastically diminishes as the number of overlapping codes
increases. The growing evidence for a high degree of optimization in
biological systems, and the growing evidence for multiple levels of
poly-functionality within DNA, both suggest that mutations which are
unambiguously beneficial must be especially rare. The theoretical
scarcity of beneficial mutations is compounded by the fact that most of
the beneficial mutations that do arise should confer extremely small
increments of improvement in terms of total biological function. This
makes such mutations invisible to natural selection. Beneficial
mutations which are below a population’s selection threshold are
effectively neutral in terms of selection, and so should be largely
unproductive from an evolutionary perspective. We conclude that
beneficial mutations that are unambiguous (not deleterious at any level)
and useful (subject to natural selection) should be extremely rare.
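As a rough illustration of the second point, here is a toy calculation (my own sketch, not the authors' model): assume that a sequence already optimized to fraction f for a given code leaves at most a fraction (1 - f) of point mutations that could improve that code, and that a mutation must improve every overlapping code at once to count as unambiguously beneficial. Under those simplifying assumptions the bound shrinks multiplicatively with the number of codes:

    # Toy upper bound (illustration only; not the model described in the paper).
    # Assumption: for a sequence optimized to fraction f for one code, at most
    # (1 - f) of random point mutations can improve that code, and a mutation
    # must improve every overlapping code at once to count as beneficial.

    def p_beneficial_bound(optimization_levels):
        """optimization_levels: one f value in [0, 1] per overlapping code."""
        p = 1.0
        for f in optimization_levels:
            p *= (1.0 - f)  # room for improvement shrinks as f approaches 1
        return p

    # Example: three overlapping codes, each already 90% optimized.
    print(p_beneficial_bound([0.9, 0.9, 0.9]))  # ~0.001 (0.1 per code, cubed)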
Abstract
This
paper provides a general framework for understanding targeted search.
It begins by defining the search matrix, which makes explicit the
sources of information that can affect search progress. The search
matrix enables a search to be represented as a probability measure on
the original search space. This representation facilitates tracking the
information cost incurred by successful search (success being defined as
finding the target). To categorize such costs, various information and
efficiency measures are defined, notably active information.
Conservation of information characterizes these costs and is precisely
formulated via two theorems, one restricted (proved in previous work of
ours), the other general (proved for the first time here). The
restricted version assumes a uniform probability search baseline; the general version assumes an arbitrary probability search baseline. When a search with
probability q of success displaces a baseline search with probability p
of success where q > p, conservation of information states that
raising the probability of successful search by a factor of q/p (> 1)
incurs an information cost of at least log(q/p). Conservation of
information shows that information, like money, obeys strict accounting
principles.
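In symbols, the accounting claim at the end of the abstract can be written using the quantity the authors' earlier work calls active information (the logarithm base is a convention; base 2 gives bits):

\[
I_{+} \;=\; \log\frac{q}{p} \;=\; \bigl(-\log p\bigr) - \bigl(-\log q\bigr),
\]

so displacing a baseline search that succeeds with probability p by one that succeeds with probability q > p requires, by conservation of information, at least I_{+} of information to be supplied to the search from outside.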
A general theory of information cost incurred by successful search
William A. Dembski
Discovery Institute, 208 Columbia Street, Seattle, WA 98104.
William
A. Dembski received the B.A. degree in psychology, the M.S. degree in
statistics, the Ph.D. degree in philosophy, and the Ph.D. degree in
mathematics in 1988 from the University of Chicago, Chicago, IL, and the
M.Div. degree from Princeton Theological Seminary, Princeton, NJ, in
1996. He was an Associate Research Professor with the Conceptual
Foundations of Science, Baylor University, Waco, TX. He is currently
also a Senior Fellow with the Center for Science and Culture, Discovery
Institute, Seattle, WA. He has held National Science Foundation graduate
and postdoctoral fellowships. He has published articles in mathematics,
philosophy, and theology journals and is the author/editor of more than
a dozen books.
Abstract
Tierra
is a digital simulation of evolution for which the stated goal was the
development of open-ended complexity and a digital “Cambrian Explosion.”
However, Tierra fails to produce such a result. A closer inspection of
the programs produced by the process of evolution within Tierra shows
very few instances of adaptation through novelty. Instead, most changes
result from removing or rearranging the existing pieces within a Tierra
program. The open-ended development of complexity depends on the ability
to generate novelty, but Tierra fails on precisely that point.
Tierra: wasteland of novelty
Winston Ewert
Electrical & Computer Engineering, One Bear Place #97356, Baylor University, Waco, TX 76798-7356
Winston
Ewert received the B.Sc. in Computer Science from Trinity Western
University in Langley, B.C., and a Ph.D. from Baylor University, where he was a member of the Evolutionary Informatics Lab. Together with Dr. Robert
Marks, Dr. William Dembski, and George Montañez, he is an author on a
number of papers investigating the informational content of
evolution-inspired search algorithms. He now works as a Software
Engineer.
Abstract
In
the 1950s Francis Crick formulated the Central Dogma of molecular
biology, which states (in effect) that DNA makes RNA makes protein makes
us. By 1970, however, biologists knew that the vast majority of our
genome does not encode proteins, and the non-protein-coding fraction
became known as “junk DNA.” Yet data from recent genome projects show
that most nuclear DNA is transcribed into RNAs – many of which perform
sequence-dependent functions in cells and tissues – so the notion of
“junk DNA” is obsolete, and the amount of information in the genome far
exceeds the information in protein-coding regions. Various
sequence-independent functions of non-protein-coding DNA and RNA have
also been proposed or demonstrated. In addition to summarizing newly
discovered sequence-dependent functions, this paper describes some
sequence-independent functions (such as the three-dimensional
organization of chromatin and nuclei) and asks whether they require us
to expand our concept of biological information beyond that which
applies to the specified complexity of nucleotide sequences.
Sequence-dependent and sequence-independent functions of "junk" DNA: do we need an expanded concept of biological information?
Jonathan Wells
Discovery Institute, Seattle WA 98104.
Jonathan
Wells holds an A.B. in Physical Sciences from the University of
California at Berkeley. In 1985 he received a Ph.D. in Religious Studies
from Yale University, with a dissertation on Charles Hodge and the
nineteenth-century Darwinian controversies. In 1994 he received a second
Ph.D. in Molecular and Cell Biology from the University of California
at Berkeley, with a dissertation on frog embryology. From 1995 to 1998
he worked as a hospital laboratory supervisor and did postdoctoral
research at Berkeley. He then moved with his family to Seattle, where he
is now a Senior Research Fellow at the Discovery Institute. He has
authored scientific articles in BioSystems, The Scientist, The American Biology Teacher, and Rivista di Biologia / Biology Review, and he has co-authored articles in Development and Proceedings of the National Academy of Sciences USA. He is also the author of several books, including Charles Hodge's Critique of Darwinism, Icons of Evolution, and The Politically Incorrect Guide to Darwinism and Intelligent Design, and he is the co-author (with William Dembski) of The Design of Life. His most recent book, The Myth of Junk DNA, was published in 2011.
Abstract
Background. In
a companion paper we use careful numerical simulation to show that
there is a quantifiable selection threshold below which low impact
deleterious mutations escape purifying selection and therefore
accumulate without limit. In that study we developed the statistic STd, which is the mid-point of the transition zone between selectable and
unselectable deleterious mutations. We showed that under most natural
circumstances STd values should be surprisingly high, such that the
large majority of all deleterious mutations should be unselectable. Does
a similar selection threshold exist for beneficial mutations?
Methods.
As in our companion paper we here employ what we describe as genetic
accounting to quantify the selection threshold STb for beneficial
mutations, and we study how various biological factors combine to
determine its value.
Results.
In all experiments that employ biologically reasonable parameters, we
observe high STb values and a general failure of selection to
preferentially amplify the large majority of beneficial mutations. High
impact beneficial mutations strongly interfere with selection of all low
impact mutations.
Conclusions.
A selection threshold exists for beneficial mutations similar in
magnitude to the selection threshold for deleterious ones, but the
dynamics of that threshold are different. Our results suggest that for
higher eukaryotes, minimal values for STb are on the order of 10^-4 or
higher. It appears very likely that most functional nucleotides in a
large genome have fractional contributions to fitness which are much
smaller than this. This suggests that given our current understanding of
how natural selection operates, we cannot explain the origin of the
typical functional nucleotide.
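For readers who want a rough feel for why such a threshold exists at all, the classical population-genetics heuristic is that a mutation whose selection coefficient s is small relative to 1/(2 Ne), with Ne the effective population size, behaves as effectively neutral. This is a simplified illustration only, not the genetic-accounting statistic STb of the paper, which reflects the combined effect of several biological factors:

    # Drift-barrier heuristic (illustration only; not the paper's STb statistic).
    def effectively_neutral(s, effective_population_size):
        """True if selection coefficient s lies below the classical drift barrier."""
        return s < 1.0 / (2.0 * effective_population_size)

    print(effectively_neutral(1e-6, 10_000))  # True: too weak for selection to see
    print(effectively_neutral(1e-3, 10_000))  # False: selectable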
Selection threshold severely constrains capture of beneficial mutations
John Sanford
Department of Horticulture, NYSAES, Cornell University, Geneva NY 14456
John
Sanford has a Ph.D. in Plant Breeding/Genetics from the University of
Wisconsin. He has been a Cornell professor for 30 years, conducting
research in the areas of plant breeding, plant genetic engineering, and
theoretical genetics. John conducted plant genetic research that
resulted in many new crop varieties, more than 80 scientific
publications, and 30 patents. John was the primary inventor of the
biolistic “gene gun” process, which was used to produce a large fraction
of the transgenic crops grown in the world today. John was team leader
in the development of the program Mendel's Accountant, the world's first biologically realistic forward-time genetic accounting program. John is
the author of the book Genetic Entropy and the Mystery of the Genome.
John is now semi-retired from Cornell, and continues to hold the
position of Courtesy Associate Professor.
Abstract
There
is now abundant evidence that the continuous accumulation of
deleterious mutations within natural populations poses a major problem
for neo-Darwinian theory. It has been proposed that a viable
evolutionary mechanism for halting the accumulation of deleterious
mutations might arise if fitness depends primarily on an individual’s
“mutation-count.” This hypothetical “mutation-count mechanism”
(MCM) is tested using numerical simulation to determine the viability of
the hypothesis and to determine what biological factors affect the
relative efficacy of this mechanism. MCM is shown to be operational only
when all of the following circumstances prevail: 1) a very narrow range
of mutational effects; 2) truncation selection; 3) zero environmental
variance; and 4) sexual recombination. Therefore, MCM does not appear to
occur under biologically realistic conditions. MCM is thus not a viable
evolutionary hypothesis and is not capable of stopping deleterious
mutation accumulation in natural populations.
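To make the hypothesis concrete, the following is a minimal toy sketch of an MCM-style scenario (my own illustration, far simpler than Mendel's Accountant: asexual inheritance, zero environmental variance, and truncation selection acting directly on mutation count). Running it with different mutation rates and truncation fractions shows whether the mean mutation count keeps climbing under these idealized settings:

    import math, random

    def poisson(lam):
        """Knuth's algorithm; adequate for small lambda."""
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            k += 1
            p *= random.random()
            if p <= threshold:
                return k - 1

    def simulate(pop_size=1000, mutation_rate=1.0, generations=200,
                 keep_fraction=0.5):
        counts = [0] * pop_size                    # mutation count per individual
        for _ in range(generations):
            # truncation selection: only the least-mutated fraction reproduces
            survivors = sorted(counts)[: int(pop_size * keep_fraction)]
            # each offspring inherits a survivor's count plus new mutations
            counts = [random.choice(survivors) + poisson(mutation_rate)
                      for _ in range(pop_size)]
        return sum(counts) / pop_size              # mean mutation count

    print(simulate())  # compare runs with different rates and fractions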
Using
numerical simulation to test the "mutation-count mechanism" for halting
deleterious mutation accumulation in natural populations.
Wesley Brewer
Fluid Physics International.
Wesley
Brewer is the sole proprietor of Fluid Physics International, a
small consultancy specializing in developing numerical simulation
software for modeling complex scientific phenomena. His primary research area is computational hydrodynamics, but he has also been working in computational genetics and numerical weather simulations. Since 2005, he
has been part of the Mendel’s Accountant development team. Dr. Brewer
holds a B.S. in engineering science and mechanics from the University of
Tennessee, an M.S. in ocean engineering from the Massachusetts
Institute of Technology, and a Ph.D. in computational engineering from
Mississippi State University. Since 2007, Dr. Brewer has spent much of his time teaching computer science in Korea.
Abstract
The
process of deleterious mutation accumulation is influenced by numerous
biological factors, including the way in which the accumulating
mutations interact with one another. The phenomenon of negative
mutation-to-mutation interactions is known as synergistic epistasis
(SE). It is widely believed that SE should enhance selective elimination
of mutations and thereby diminish the problem of genetic degeneration.
We apply numerical simulation to test this commonly expressed assertion.
We find that under biologically realistic conditions, synergistic
epistasis exerts little to no discernible influence on mutation
accumulation and genetic degeneration. When the synergistic effect is
greatly exaggerated, mutation accumulation is not significantly
affected, but genetic degeneration accelerates markedly. As the
synergistic effect is exaggerated still more, degeneration becomes
catastrophic and leads to rapid extinction. Even when conditions are
optimized to enhance the SE effect, selection efficiency against
deleterious mutation accumulation is not appreciably influenced. We also
evaluated SE using parameters that result in extreme and artificially
high selection efficiency (truncation selection and perfect genotypic
fitness heritability). Even under these optimized conditions,
synergistic epistasis causes accelerated degeneration and only minor
reductions in the rate of mutation accumulation. When we included the
effect of linkage within chromosomal segments in our SE analyses, it
made degeneration still worse and even interfered with mutation
elimination. Our results therefore strongly suggest that commonly held
perceptions concerning the role of synergistic epistasis in halting
mutation accumulation are not correct.
Can synergistic epistasis halt mutation accumulation? Results from numerical simulation
John Baumgardner
Department of Earth and Environmental Sciences, Ludwig Maximilians University, Theresienstrasse 41, 80333 Munich, Germany.
Dr.
Baumgardner has a B.S. in electrical engineering from Texas Tech
University, an M.S. in electrical engineering from Princeton University,
and a Ph.D. in geophysics and space physics from UCLA. From 1984 to 2004
he served as a staff scientist in the Theoretical Division of Los
Alamos National Laboratory engaged in a variety of research projects in
computational physics. Since 2004 he has been part of the team that developed Mendel's Accountant, a computer model for investigating
research topics in population genetics. He is currently an adjunct staff
scientist in the Department of Earth and Environmental Sciences at
Ludwig Maximilians University in Munich, Germany.
Computational evolution experiments predict a net loss of genetic information despite selection in biological organisms
Chase W. Nelson
Research Scientist,
Chase
W. Nelson is a biologist and musician currently pursuing a PhD in
bioinformatics and molecular evolution. He graduated from Oberlin
College in 2010, where he performed honors research on mutation
accumulation in Arabidopsis. While at Oberlin, he became an NSF STEM
Scholar in Computation and Modeling, and also took part in several
research experiences, including an NIH IDeA Networks of Biomedical
Research Excellence Fellowship at the University of Wyoming. He
subsequently worked under Dr. John C. Sanford at Rainbow Technologies,
Inc., where he examined the power of natural selection in digital
organisms. His current studies under Dr. Austin L. Hughes focus on
developing computational methods to detect natural selection at the
nucleotide level. His design of novel tools for next-generation
sequence
analysis and geographic information systems earned him an NSF GRFP
Award in 2013. During the summer of 2013, he also undertook an NSF EAPSI
Fellowship to study rice genetics under Dr. Wen-Hsiung Li at Academia
Sinica (中央研究院) in Taipei, Taiwan.
Abstract
Computational
evolution experiments using the simulation Mendel’s Accountant have
suggested that deleterious mutation accumulation may pose a threat to
the long-term survival of many biological species. By contrast, experiments using the program Avida have suggested that purifying
selection is extremely effective and that novel genetic information can
arise via selection for high-impact beneficial mutations. The present
study shows that these approaches yield seemingly contradictory results
only because of disparate parameter settings. Both agree when similar
settings are used, and both reveal a net loss of genetic information
under biologically relevant conditions. Further, both approaches
establish the existence of three potentially prohibitive barriers to the
evolution of novel genetic information: (1) the selection threshold and
resultant genetic entropy; (2) irreducible complexity, or the waiting
time to beneficial mutation; and (3) the pressure of reductive
evolution, i.e., the selective pressure to shrink the functional genome
and disable unused functions. The adequacy of mutation and natural
selection for producing and sustaining novel genetic information cannot
be assessed without a careful study of these issues.