Darwin's Meltdown

[COMMENT: This is a fascinating look backward from the future -- 2025 AD, to be exact -- into which four scientists project themselves. I would take it to be a good estimate of the state of affairs as things move into the 21st century. We Christians should be hopeful about the recovery of Christian civilization. Too many good things are happening -- which the media has no intention of telling us about. E. Fox]

 -- cover story: http://www.worldmag.com/world/issue/04-03-04/home.asp

The view from 2025: How Design beat Darwin

By The Editors

WORLD ASKED FOUR leaders of the Intelligent Design Movement to have some
fun: Imagine writing in 2025, on the 100th anniversary of the famous Scopes
"monkey" trial, and explain how Darwinism has bit the dust, unable to rebut
the evidence that what we see around us could not have arisen merely by
time plus chance. Our fanciful historians are:

Phillip Johnson, WORLD's Daniel of the Year for 2003, is a law professor at
the University of California at Berkeley and the author of Darwin on Trial
(1991) and many other books, including Defeating Darwinism by Opening
Minds, Reason in the Balance, The Wedge of Truth, and The Right Questions.

Jonathan Wells, a senior fellow at the Discovery Institute and the author
of Icons of Evolution (2000), received both a Ph.D. in biology from the
University of California at Berkeley and a Ph.D. in theology from Yale
University.

Dr. Jeffrey M. Schwartz, research professor of psychiatry at the UCLA
School of Medicine, is the author of more than 100 scientific publications
in the fields of neuroscience and psychiatry. His latest book is The Mind
and the Brain (released in paperback last year).

William Dembski, associate research professor at Baylor and a senior fellow
of the Discovery Institute, received a Ph.D. in mathematics from the
University of Chicago and is the author of, among other books, The Design
Inference (1998) and The Design Revolution (2004).

=-=-=-=-

The demise of naturalism
INTELLIGENT DESIGN: Methodological naturalism used to be a regulative
principle for science and for all serious academic thought. Not any longer.
It is now (in 2025) an outdated dogma, and the Scopes trial stereotype, as depicted in the movie Inherit the Wind, is effectively dead

By Phillip Johnson

IN 1980, ASTRONOMER CARL SAGAN commenced the influential national public
television series Cosmos by announcing its theme: "The Cosmos is all that is or ever was or ever will be." Sagan's mantra was spoken more than 20
years before the landmark Santorum Amendment to the Federal Education Act
of 2001 encouraged science educators to teach students to distinguish
between testable scientific theories and philosophical claims that are made
in the name of science.

In those unsophisticated pre-Santorum years, celebrity scientists like
Sagan freely promoted a dogmatic naturalistic philosophy as if it were a
fact that had been discovered by scientific investigation--just as previous
generations of celebrities had promoted racism, class warfare, and Freudian
fantasy in the name of science. The celebrities felt themselves free to
ignore both evidence and logic, because the approval of the rulers of
science, who had a vested interest in persuading the public to adopt a
philosophy that maximized their own influence, was all that was needed to
induce the media to report an ideological dogma as a scientific conclusion.

Millions of schoolchildren and credulous adults were led to accept the
voice of Sagan as the voice of science and thus to believe that scientists
had proved that God does not exist, or at least is irrelevant to our lives.
In brief, the message of this government-promoted television series was
that philosophical naturalism and science are one and the same. The series
did contain scientific information, much of it inaccurate or misleading,
but primarily it was an appeal to the imagination, promoting the worship of
science and the adventurous vision of exploring the universe.

The perennially popular Star Trek television series further conditioned the
youth of America to dream of a technological utopia in which disease and
distance were conquered and the great adventure of mankind was to explore
the many inhabited planets supposedly existing throughout the universe.
Throughout the second half of the 20th century, which we now know as the
"century of scientism," the popular media relentlessly pursued the theme
that liberation and fulfillment are to be found through technology, with
the attendant implication that the supernatural creator revealed in the
Bible is a superfluous and obsolete entity doomed to expire from terminal
irrelevance.

Social scientists further affirmed this myth with their secularization
thesis, which predicted that supernatural religion would steadily lose
adherents throughout the world as public education enlightened the
multitudes, and as people came to see scientific technology as the only
route to health, happiness, and longevity. Problems such as pollution and
warfare were acknowledged, but these too could be mastered if we insisted
that our politicians heed the advice of the ruling scientists.

The cultural path that led to this apotheosis of scientific naturalism
began just after the middle of the 20th century, with the triumphalist
Darwin Centennial Celebration in 1959 and the 1960 film Inherit the Wind, a
stunning but thoroughly fictionalized dramatization of the Scopes trial of
1925. The real Scopes trial was a publicity stunt staged by the ACLU, but
Broadway and Hollywood converted it to a morality play about religious
persecution in which the crafty criminal defense lawyer Clarence Darrow
made a monkey of the creationist politician William Jennings Bryan, and in
the process taught the moviegoing public to see Christian ministers as
ignorant oppressors and Darwinist science teachers as heroic truth-seekers.
As the 20th century came to an end, science and history teachers were still
showing Inherit the Wind to their classes as if it were a fair portrayal of
what had happened in Dayton, Tenn., in 1925.

Superficially, it seemed that scientific naturalism was everywhere
triumphant at the start of the 21st century. Scientific rationalists were
nonetheless uneasy, for two reasons.

First, literary intellectuals had pushed naturalism to the limits of its
logic and drawn the conclusion that, since an uncreated nature is
indifferent to good and evil, all values are merely subjective, including
even the value of science. It seemed to follow that nothing is forbidden,
and pleasure can be pursued without limit. Both highbrow literature and
popular entertainment became strongly nihilistic, scorning all universal
standards of truth, morality, or reason.

Second, public opinion polls showed that a clear majority of the American
public still believed that God is our creator despite the heavy-handed
indoctrination in evolutionary naturalism to which they had been subjected
for several decades in textbooks, television documentaries, and museum
exhibits. The seemingly solid wall of Darwinian orthodoxy was crumbling
under the pressures described in the accompanying article by Jonathan Wells.

Naturalism was losing its essential scientific backing, and then it also
suddenly lost its hold on the popular and literary imagination, as the
American public tired of nihilism and began to count the cost of all that
had been destroyed during the century of scientism. New historical scholarship, reflected in a stunning PBS television documentary, exposed the Inherit the Wind portrayal of the Scopes trial as a hoax, kicking off an
era of historical revisionism in which book after scholarly book exposed
how propaganda techniques had been employed to create a mythology of
inevitable progress toward naturalism, similar to the governing mythology
of the Soviet Union, which had proclaimed the inevitable replacement of
capitalism by communism.

The collapse of the Soviet Union put an end to the Soviet myth, just as the
scientific collapse of Darwinism, preceded as it was by the discrediting of
Marxism and Freudianism, prepared the way for the culture to turn aside
from the mythology of naturalism to rediscover the buried treasure that the
mythology had been concealing. A hilarious Broadway comedy titled Inherit
the Baloney enacted a sort of Scopes trial in reverse, with the hero a
courageous Christian college professor badgered incessantly by dim-witted
colleagues and deans who kept telling him that the only way to preserve his
faith in a postmodern world is to jettison all the exclusivist
truth-claims. They wanted him to admit that Jesus was sorely in need of
sensitivity training from some wise counselor like Pontius Pilate, because
"nobody can surf the web every day and still believe that there is such a
thing as 'truth' or goodness." Overnight, the tendency of naturalistic
rationalism to decay into postmodern irrationalism became a national joke.

Then the rise of Islamic extremism at the start of the new century came
just as scholars and journalists were finally taking notice of the rapid
spread of active, vibrant Christian faith in Africa, South America, and
Asia, especially China. The secularization thesis was consistent with the
facts only in a few parts of the world where long-established Christian
churches had succumbed to complacency and the slow poison of naturalism.
Where life was hardest and persecution frequent, the flame of faith burned
brighter than ever. For those with a global outlook, the question was not
whether God was still important in our lives, but rather, "What does God
want us to do?" Once Darwinism had joined Marxism and Freudianism in the
dustbin of history, the entire world seemed new and full of exciting
possibilities.

The crucial turning point in America came in the year 2004. In that year
the "same-sex marriage" explosion, abetted by public officials, brought to
public attention the extent to which long-settled understandings of law and
morality had been undermined as judges, mayors, and citizens internalized
the nihilistic moral implications of naturalistic philosophy. That same
year, with the spectacular success of two great movies, The Return of the
King and The Passion of the Christ, it became clear that the public was
hungering for art and entertainment that affirmed traditional values rather
than flouted them. Surprise: The Bible still provided, as it had for many
centuries, the indispensable starting point for the artistic imagination.

Artists and humanities scholars recognized that the human imagination had
been stunted by blind adherence to a philosophy that denied the artist or
poet any sense of the divine power that gives meaning to the realm of
nature. As sanity reasserted itself, even the secular intellectuals saw
that the fact of creation provides the essential foundation not only for
the artistic imagination, but even for the scientific imagination, because
science itself makes no sense if the scientific mind is itself no more than
the product of irrational material forces.

As that insight spread, naturalism became yesteryear's fashion in thought,
and the world moved forward to the more realistic understanding of the
human condition that we in 2025 now take for granted. Only the fool says
that there is no God, or that God has forgotten us. Folly like that is as
dead today as the discredited Inherit the Wind stereotype, which fit the
facts of history no better than the secularization thesis. We no longer
expect to meet intelligent beings on other planets, for we have learned how
uniquely fitted to shelter life our own planet has been created to be. Now
we have a much more exciting adventure. We can dedicate our minds and our
courage to sharing the truth that makes us free.

=-=-=-=-

Whatever happened to evolutionary theory?
INTELLIGENT DESIGN: Intelligent design has now (in 2025) become a thriving
scientific research program and replaced materialistic accounts of
biological evolution (in particular, Darwinism). ID theory led to new
understanding of embryo development and the importance of "junk DNA"

By Jonathan Wells

IN 1973, GENETICIST THEODOSIUS Dobzhansky wrote: "Nothing in biology makes
sense except in the light of evolution." By "evolution," he meant the
synthesis of Charles Darwin's 19th-century theory that all living things
have descended from a common ancestor through natural selection and random
variations, and the 20th-century theory that new variations are produced by
mutations in DNA. By 2000, the biological sciences had become almost
totally dominated by this view. Millions of students were taught that
Darwinian evolution was a simple fact, like gravity. Oxford professor
Richard Dawkins even proclaimed that anyone who doubted it must be
ignorant, stupid, insane, or wicked.

Now, a mere quarter of a century later, Darwinian evolution is little more
than a historical footnote in biology textbooks. Just as students learn
that scientists used to believe that the Sun moves around the Earth and
maggots are spontaneously generated in rotting meat, so students also learn
that scientists used to believe that human beings evolved through random
mutations and natural selection. How could a belief that was so influential
in 2000 become so obsolete by 2025? Whatever happened to evolutionary theory?

Surprising though it may seem, Darwinism did not collapse because it was
disproved by new evidence. (As we shall see, the evidence never really fit
it anyway.) Instead, evolutionary theory was knocked off its pedestal by
three developments in the first decade of this century--developments
centered in the United States, but worldwide in scope. Those developments
were: (1) the widespread adoption of a "teach the controversy" approach in
education, (2) a growing public awareness of the scientific weaknesses of
evolutionary theory, and (3) the rise of the more fruitful "theory of
intelligent design."

The first development was a reaction to late 20th-century efforts by
dogmatic Darwinists to make evolutionary theory the exclusive framework for
biology curricula in American public schools. Biology classrooms became
platforms for indoctrinating students in Darwinism and its underlying
philosophy of naturalism-the anti-religious view that nature is all there
is and God is an illusion. In the ensuing public backlash, some people
demanded that evolution be removed from the curriculum entirely. A larger
number of people, however, favored a "teach the controversy" approach that
presented students with the evidence against evolutionary theory as well as
the evidence for it.

The U.S. Congress implicitly endorsed this approach in its No Child Left
Behind Act of 2001. A report accompanying the legislation stated that
students should learn "to distinguish the data and testable theories of
science from religious or philosophical claims that are made in the name of
science," and that students should "understand the full range of scientific
views that exist" with regard to biological evolution. Despite loud
protests and threats of lawsuits from the Darwinists, hundreds of state and
local school boards across America had adopted a "teach the controversy"
approach by 2005.

In the second major development, students who were free to examine the
evidence for and against evolution quickly realized that the former was
surprisingly thin. Although Darwinists had long boasted about having
"overwhelming evidence" for their view, it turned out that they had no good
evidence for the theory's principal claim: that species originate through
random mutation and natural selection. Bacteria were the best place to look
for such evidence, because they reproduce quickly, their DNA can be easily
mutated, and they can be subjected to strong selection in the laboratory.
Yet bacteria had been intensively studied throughout the 20th century, and
bacteriologists had never observed the formation of a new species.

If there was no good evidence that a Darwinian mechanism could produce new
species, still less was there any evidence that a Darwinian mechanism could
produce complex organs or new anatomical features. Darwinists discounted
the problem by arguing that evolution was too slow to observe, but this
didn't change the fact that they lacked empirical confirmation for their
theory.

Of course, there was plenty of evidence for minor changes in existing
species-but nobody had ever doubted that existing species can change over
time. Domestic breeders had been observing such changes--and even producing them--for centuries. Unfortunately, this was not the sort of evidence that
evolution needed. After all, the main point of evolutionary theory was not
how selection and mutation could change existing species, but how that
mechanism could produce new species--indeed, all species after the first--as
well as new organs and new body plans. That's why Darwin titled his magnum
opus The Origin of Species, not How Existing Species Change over Time.

A growing number of people realized that the "overwhelming evidence" for
evolutionary theory was a myth. It didn't help the Darwinists when it
became public knowledge that they had faked some of their most widely
advertised evidence. For example, they had distorted drawings of early
embryos to make them look more similar than they really are (in order to
convince students that they had descended from a common ancestor), and they
had staged photos showing peppered moths on tree trunks where they don't
normally rest (in order to persuade students of the power of natural
selection).

In the first few years of this century, the cultural dominance of Darwinism
was so strong, especially in academia, that critics were slow to speak up.
By 2009, however, when Darwin's followers had hoped to stage a triumphal
celebration of their leader's 200th birthday, millions of people were
laughing at the emperor with no clothes.

The third and perhaps most decisive development was a series of
breakthroughs in biology and medicine inspired by the new theory of
intelligent design. Everyone, even the Darwinists, agreed that living
things look as though they were designed. Darwinists insisted that this was
merely an illusion, produced by the combined action of random mutation and
natural selection; but design theorists argued that the design was real.
For years the controversy remained largely philosophical; then, in the
first decade of this century, a few researchers began applying
intelligent-design theory to solving specific biological problems.

One of these was the function of so-called "junk DNA." From a Darwinian
perspective, "genes" were thought to determine all the important
characteristics of an organism, and gene mutations were thought to provide
the raw materials for evolution. When molecular biologists in the third
quarter of the 20th century discovered that certain regions of DNA encode
proteins that determine some of the characteristics of living cells, and
equated these with "genes," Darwinists assumed that their theory was
complete. They even proclaimed DNA to be "the secret of life."

Yet molecular biologists learned in the 1970s that less than 5 percent of
human DNA encodes proteins. Darwinists immediately declared the other 95
percent "junk"-molecular accidents that had accumulated in the course of
evolution. Since few researchers were motivated (or funded) to investigate
garbage, most human DNA was neglected for decades. Although biologists
occasionally stumbled on functions for isolated pieces of "junk," they
began to make real progress only after realizing that the DNA in an
intelligently designed organism is unlikely to be 95 percent useless. The
intensive research on non-coding regions of human DNA that followed soon
led to several medically important discoveries.

Another insight from intelligent-design theory advanced our understanding
of embryo development. From a Darwinian perspective, all the information
needed for new features acquired in the course of evolution came from
genetic mutations. This implied that all essential biological information
was encoded in DNA. In contrast, intelligent-design theory implied that
organisms are irreducibly complex systems in which DNA contains only part
of the essential information. Although a few biologists had been arguing
against DNA reductionism for decades, biologists guided by
intelligent-design theory in 2010 discovered the true nature of the
information that guides embryo development.

All three of these developments-teaching the controversy, educating people
about the lack of evidence for evolutionary theory, and using
intelligent-design theory to make progress in biomedical research-were
bitterly resisted by Darwinists in the first decade of this century.
Defenders of the Darwinian faith engaged in a vicious campaign of character
assassination against their critics in the scientific community. Meanwhile,
their allies in the news media conducted a massive disinformation campaign,
aimed primarily at convincing the public that all critics of Darwinism were
religious zealots.

More and more people saw through the lies, however, and within a few short
years Darwinism had lost its scientific credibility and public funding. By
2015 it was well on its way to joining its intellectual cousins, Marxism
and Freudianism, in the dustbin of discarded ideologies. By 2020, Darwinism
was effectively dead.

=-=-=-=-

Mind transcending matter
INTELLIGENT DESIGN: For a time, cognitive neurophysiology attempted to
reduce the mind to brain function. Such a materialist reduction of mind to
brain can no longer (in 2025) be reasonably maintained. We've learned that
we can control our minds and act responsibly

By Jeffrey Schwartz

LOOKING BACK, IT SEEMS INEVITABLE that advances in brain science during the
20th century led almost all people esteemed as "scientifically literate" to
believe that eventually all aspects of the human mind would be explained in
material terms. After all, in an era when the unquestioned cultural
assumption was "for science all causes are material causes," how could one
be expected to think differently? What's more, tremendous advances in
brain-imaging technologies during the last two decades of that most
materialist of centuries enabled scientists to investigate the inner
workings of the living human brain. This certainly seemed to further
buttress the generally unexamined and often smugly held belief that the
deep mysteries of the brain, and the "laws" through which it created and
ruled all aspects of the human mind, would someday be revealed.

Thus arose the then virtually hegemonic belief that human beings and
everything they do are, like all other aspects of the world of nature, the
results of material causes--by which the elites of the time simply meant
results of material forces interacting with each other. While primitive,
uneducated, and painfully unsophisticated people might be beguiled into
believing that they had minds and wills capable of exerting effort and
rising above the realm of the merely material, this was just--as Daniel Dennett, a widely respected philosopher of the day, delighted in putting it--an example of a "user illusion": that is, the quaint fantasy of those
who failed to realize, due to educational deficiencies or plain
thick-headedness, that "a brain was always going to do what it was caused
to do by local mechanical disturbances." Were you one of the rubes who
believed that people are capable of making free and genuinely moral
decisions? Then of course haughty contempt, or at best pity, was the only
appropriate demeanor a member of the intellectual elite could possibly
direct your way.

On a societal and cultural level the damage such spurious and unwarranted
elite opinions wreaked on the world at large was immense. For if everything
people do results solely from their brains, and everything the brain does
results solely from material causes, then people are no different than any
other complicated machine and the brain is no different in principle than
any very complex computer. If matter determines all, everything is passive
and no one ever really does anything, or to be more precise, no one is
really responsible for anything they think, say, or do.

What's more, if anything they think, say, or do causes problems for them or
society at large, then, the sophisticates of that thankfully bygone era
believed, the ultimate way to solve the problem would be to make the
required changes in the brain that would make it work the way a properly
functioning machine is supposed to. This naturally led to the widespread
use of drugs as a primary means of treating what generally came to be
called "behavioral problems."

After all, if the brain is the final cause of everything a person thinks,
says, and does, why bother with old-fashioned and outdated notions like
"self-control" or even "making your best effort" to solve a problem? If the
brain is the ultimate cause underlying all the problems, then the
sophisticated thing to do to rectify things is to give a chemical (or even
place an electrode!) that gets right in there and fixes things. "God helps
those who help themselves?" Not in the real world, where science knows all
the answers, sneered the elites of the time.

Happily for the future of humanity, in the early years of the 21st century this all started to change. The scientific reasons grew out of the convergence of changes in perspective that had occurred in physics and neuroscience during the last decades of the previous century. Specifically, the theory of physics called quantum mechanics was seen to be closely related to the discovery in brain science called neuroplasticity: the fact that the brain is capable of being rewired throughout the lifespan, and that in humans at least, this rewiring could be caused directly by the action of the mind.

Work using new brain-imaging technologies of that era to study people with
a condition called obsessive-compulsive disorder (OCD) played a key role in
this development. OCD is a medical condition in which people suffer from
very bothersome and intrusive thoughts and feelings that give them the
sense that "something is wrong" in their immediate surroundings-usually the
thought or feeling that something is dirty or contaminated or needs to be
checked because it isn't quite right.

This is what is called an obsession. The problem the medical condition
causes is that although the sufferers generally know this feeling that
"something is wrong" is false and doesn't really make sense, the feeling
keeps bothering them and doesn't go away, due to a brain glitch that was
discovered using brain imaging. Sufferers often respond to these
gut-wrenching thoughts and feelings by washing, checking, straightening
things, etc., over and over again, in a desperate but futile attempt to
make things seem right. These futile repetitive acts are called compulsions.

In the 1990s it was discovered that OCD sufferers were very capable of
learning how to resist capitulating to these brain-related symptoms by
using a mental action called "mindful awareness" when confronting them. In
a nutshell, mindful awareness means using your "mind's eye" to view your
own inner life and experiences the way you would if you were standing, as
it were, outside yourself--most simply put, it means learning to use a
rational perspective when viewing your own inner experience.

When OCD patients did this, and as a result came to view the bothersome
intrusive thoughts and feelings just as medical symptoms that they had the
mental power to resist, they found they were empowered to direct their
attention in much more useful and wholesome ways by focusing on healthy
and/or work-related activities. Over several weeks, and with much mental
effort and faith in their ability to overcome the suffering, many OCD
patients were found to be capable of regularly resisting the symptoms.

This greatly strengthened their mental capacity to focus attention on
useful wholesome activities and overcome compulsive urges. The major
scientific finding that was discovered using brain imaging was that when
OCD sufferers used the power of their minds to regularly redirect their
focus of attention in wholesome ways, they literally rewired their own
brains in precisely the brain circuit that had been discovered to cause the
problem.

In the early years of the current century brain imaging was used to reveal
many similar and related findings. For instance, people with spider phobia,
or people viewing stressful or sexually arousing films, were found to be
entirely capable of using mental effort to apply mindful awareness and
"re-frame" their perspective on their experience. By so doing it was
clearly demonstrated that they could systematically change the response of
the brain to these situations and so cease being frightened, stressed, or
sexually aroused, whatever the case may be.

This latter finding was realized by some at the time to be potentially
relevant to teaching sexual abstinence strategies to adolescents--for if you
have the power to control your brain's response to sexual urges, then
practicing sexual abstinence in arousing situations will not only
strengthen your moral character; it will also increase your mental and
physical capacity to control the workings of your own brain--an extremely
wholesome and empowering act!

All this work came together when physicist Henry Stapp realized that a basic principle of quantum mechanics--which, because of the nature of the brain at the atomic level, must be used for a proper understanding of the brain's inner workings--explains how the action of the mind changes how the brain works. A well-established mechanism called the quantum Zeno effect (QZE) readily explains how mindfully directed attention can alter brain circuitry adaptively. Briefly, we can understand the QZE like this: The mental act of focusing attention tends to hold in place the brain circuits associated with whatever is focused on. In other words, focusing attention on your mental experience maintains the brain state arising in association with that experience.
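
For readers who want the physics behind the name, here is a minimal sketch of the textbook quantum Zeno result (frequent measurement inhibits change of state); its extension to attention and brain circuits is Stapp's interpretation, not part of the standard derivation. For a system prepared in state |psi> and evolving under a Hamiltonian H, the short-time survival probability is

$$ P(t) \approx 1 - \frac{(\Delta H)^2}{\hbar^2}\, t^2, \qquad (\Delta H)^2 = \langle \psi | H^2 | \psi \rangle - \langle \psi | H | \psi \rangle^2, $$

so measuring the state N times at equal intervals over a total time T gives

$$ P_N(T) \approx \left[ 1 - \frac{(\Delta H)^2}{\hbar^2} \left( \frac{T}{N} \right)^2 \right]^N \longrightarrow 1 \quad \text{as } N \to \infty. $$

Sufficiently frequent observation thus "holds the state in place," which is the sense in which sustained attention is said above to stabilize a brain state.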

If, using mindful awareness, a brain state arises associated with a
wholesome perspective, the sustained application of that mindful
perspective will literally, because of the QZE mechanism, hold in place the
brain circuitry associated with the wholesome process. Of course, the QZE
mechanism would be expected to work the same way to hold in place the
brain's response to meditation or prayer, and brain-imaging research in the
early years of this century demonstrated that to be the case.

The rest, as they say, is history. Once a solid scientific theory was in place to explain how the mind's power to focus attention could systematically rewire the brain, and to show that the language of our mental and spiritual life is necessary to empower the mind to do so, the materialist
dogma was toppled. We may not have all lived happily ever after in any
simplistic sense, but at least science is no longer on the side of those
who claim human beings are no different in principle than a machine.

=-=-=-=-

The new age of information
INTELLIGENT DESIGN: A mechanistic view of science has now (in 2025) given
way to an information-theory view in which information rather than blind
material forces is primary and basic. This change has affected not only
science but worldviews

By William Dembski

AT THE TIME OF THE SCOPES TRIAL, and for the remainder of the 20th century,
science was wedded to a materialistic conception of nature. The architects
of modern science, from René Descartes to Isaac Newton, had proposed a
world of unthinking material objects ruled by natural laws. Because these
scientists were theists, the rule of natural law was for them not
inviolable--God could, and from time to time did, invade the natural order,
rearrange material objects, and even produce miracles of religious
significance. But such divine acts were gratuitous insertions into a
material world that was capable of carrying on quite nicely by itself.

In the end, the world bequeathed to us by modern science became a world of
unthinking material objects ruled by unbroken natural laws. With such a
world, God did not, and indeed could not, interact coherently, much less
intervene. Darwinian evolution, with its rejection of design and its
unwavering commitment to purely material forces (such as natural
selection), came to epitomize this materialist conception of science. If
God played any role in the natural world, human inquiry could reveal
nothing about it.

This materialist conception of the world came under pressure in the 1990s.
Scientists started asking whether information might not be the fundamental
entity underlying physical reality. For instance, mathematician Keith
Devlin mused whether information could perhaps be regarded as "a basic
property of the universe, alongside matter and energy (and being ultimately
interconvertible with them)." Origin-of-life researchers like Manfred Eigen
increasingly saw the problem of the origin of life as the problem of
generating biologically significant information. And physicist Paul Davies
speculated about information replacing matter as the "primary stuff,"
therewith envisioning the resolution of age-old problems, such as the
mind-body problem. Thus he remarked, "If matter turns out to be a form of
organized information, then consciousness may not be so mysterious after all."

Such speculations became serious scientific proposals in the first decade
of this century as proponents of intelligent design increasingly clashed
with Darwinian evolutionists. The irony here is that the very sorts of
arguments that Darwinists had been using to try to discredit intelligent
design and relegate it to the sphere of religion rather than science ended
up discrediting Darwinian evolution itself and exposing its unscientific
presuppositions.

To see how this happened, recall how exchanges between Darwinists and the
early design theorists used to go. The design theorists would go to great
lengths to analyze a given biological structure, show why it constituted an
obstacle to Darwinian and other materialistic forms of evolution, and lay
out how the structure in question exhibited clear marks of intelligence. To
such carefully drawn lines of scientific argument and evidence, the
Darwinist invariably offered stock responses, such as, "There you go with
your religion again" "You're just substituting supernatural causes for
natural causes" "You just haven't figured out how evolution did it" "You're
arguing from ignorance" "You're lazy; get back in the lab and figure out
how evolution did it."

These responses were effective at cowing critics of Darwinism so long as
the scientific community agreed with the Darwinists that science was about
understanding the natural world solely in terms of unguided material
processes or mechanisms. But in the first decade of this century it became
clear that this definition of science no longer worked. Science is, to be
sure, about understanding the natural world. But science is not about
understanding the natural world solely in terms of material processes.

The problem is that material processes, as understood by the Darwinists and
most of the scientific community at the time, could not adequately explain
the origin of biologically significant information. Darwinist Michael Ruse
saw the problem clearly, though without appreciating its significance.
Describing the state of origin-of-life research at the turn of the century,
he remarked: "At the moment, the hand of human design and intention hangs
heavily over everything, but work is going forward rapidly to create
conditions in which molecules can make the right and needed steps without
constant outside help. When that happens, ... the dreaming stops and the
fun begins."

Sadly for the Darwinists, the dreaming never stopped and the fun never
began. Instead, the work of theoretical and applied intelligent-design
theorists went forward and showed why scientific explanations of
biologically significant information could never remove the hand of design
and intentionality. The watchword for science became "information requires intelligence." This came to be known as the No Free Lunch Principle, which
states that apart from intelligent guidance, material processes cannot
bring about the information required for biological complexity.
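
For reference, the optimization theorem from which the name is borrowed (Wolpert and Macready's "No Free Lunch" theorems of 1997) can be stated in one line; the biological principle above is the design theorists' far stronger extension of it, not the theorem itself. For a finite search space X, a finite set of values Y, and any two deterministic, non-repeating black-box search algorithms a_1 and a_2,

$$ \sum_{f:\, X \to Y} P(d_m^y \mid f, m, a_1) = \sum_{f:\, X \to Y} P(d_m^y \mid f, m, a_2), $$

where d_m^y is the sequence of m objective-function values sampled so far. In words: averaged uniformly over all possible fitness functions, every search algorithm performs exactly as well as every other, including blind random sampling.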

The No Free Lunch Principle led to a massive change in scientific
perspective. One notable consequence for biology was a thoroughgoing
reevaluation of experimental work on prebiotic and biotic evolution.
Invariably, where evolutionary biologists reported interesting experimental
results, it was because "intelligent investigators" had "intervened" and
performed "experimental manipulations" that nature, left to its own
devices, was utterly incapable of reproducing.

This led to an interesting twist. Whereas Darwinists had been relentless in
disparaging intelligent design as a pseudoscience, Darwinism itself now
came to be viewed as a pseudoscience. Intelligent design had been viewed as
a pseudoscience because it refused to limit nature to the operation of
blind material processes. Once it became clear, however, that material
processes were inherently inadequate for producing biologically significant
information, the Darwinian reliance, and indeed insistence, on such
processes came to be viewed as itself pseudoscientific.

What would you think of a chemist who thought that all explosives were like
TNT in that their explosive properties had to be explained in terms of
electrostatic chemical reactions? How would such a chemist explain the
explosion of a nuclear bomb? Would this chemist be acting as a scientist in
requiring that nuclear explosions be explained in terms of electrostatic
chemical reactions rather than in terms of fission and fusion of atomic
nuclei? Obviously not.

Scientific explanations need to invoke causal powers that are adequate to
account for the effects in question. By refusing to employ intelligence in
understanding biologically significant information, the Darwinian
biologists were essentially like this chemist, limiting themselves to
causal powers that were inherently inadequate for explaining the things
they were trying to explain. No wonder Darwinism is nowadays considered a
pseudoscience. It does not possess, and indeed self-consciously rejects,
the conceptual resources needed to explain the origin of biological
information. Some historians of science are now even going so far as to
call Darwinism the greatest swindle in the history of ideas. But this is
perhaps too extreme.

The information-theoretic perspective did not just come to govern biology
but took hold throughout the natural sciences. Physics from the time of
Newton had sought to understand the physical world by positing certain
fundamental entities (particles, fields, strings), specifying the general
form of the equations to characterize those entities, prescribing initial
and boundary conditions for those equations, and then solving them. Often,
these were equations of motion that, on the basis of past states, predicted
future states. Within this classical conception of physics, the holy grail
was to formulate a "theory of everything"-a set of equations that could
characterize the constitution and dynamics of the universe at all levels of
analysis.

But with information as the fundamental entity of science, this conception
of physics gave way. No longer was the physical world to be understood by
identifying an underlying structure that has to obey certain equations no
matter what. Instead, the world came to be seen as a nested hierarchy of
systems that convey information, and the job of physical theory was to
extract as much information from these systems as possible. Thus, rather
than see the scientist as Procrustes, forcing nature to conform to
preconceived theories, this informational approach turned the scientist
into an inquirer who asks nature questions, obtains answers, but must
always remain open to the possibility that nature has deeper levels of
information to divulge.

Nothing of substance from the previous "mechanistic science" was lost with
this informational approach. As Roy Frieden had shown, the full range of
physics could be recovered within this informational approach (Physics from
Fisher Information: A Unification, Cambridge University Press, 1998). The
one thing that did give way, however, was the idea that physics is a
bottom-up affair in which knowledge of a system's parts determines
knowledge of the system as a whole. Within the informational approach, the
whole was always truly greater than the sum of its parts, for the whole
could communicate information that none of the parts could individually.
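
For reference, the quantity at the center of Frieden's program is classical Fisher information, a standard statistical measure of how much an observation reveals about a parameter. For a family of probability densities p(x; theta),

$$ I(\theta) = \int p(x;\theta) \left( \frac{\partial \ln p(x;\theta)}{\partial \theta} \right)^{2} dx. $$

In the cited book, Frieden recovers the standard Lagrangians of physics by extremizing the difference K = I - J between the Fisher information I acquired in a measurement and what he calls the "bound information" J of the phenomenon being measured.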

The primacy of information throughout the sciences has had profound
consequences for religion and faith. A world in which information is not
primary is a world seriously limited in what it can reveal about God. This
became evident with the rise of modern science--the world it gave us
revealed nothing about God except that God, if God exists at all, is a
lawgiver. But with information as the primary stuff, there are no limits on
what the world can in principle reveal about God. Theists of all stripes
have therefore found this newfound focus of science on information refreshing.

 
