Chapter 4: Biology, Randomness, Chance and Purpose (Part 1)
We are reviewing the book: Is There Purpose in Biology? The Cost of Existence and the God of Love. By Denis Alexander. Chapter 4: Biology, Randomness, Chance and Purpose is up for today. Right out of the gate, Denis opens with a quote from Richard Dawkins in the preface to his book, The Blind Watchmaker:
Take, for instance, the issue of “chance”, often dramatized as blind chance. The great majority of people that attack Darwinism leap with almost unseemly eagerness to the mistaken idea that there is nothing other than random chance in it. Since living complexity embodies the very antithesis of chance, if you think that Darwinism is tantamount to chance you’ll obviously find it easy to refute Darwinism! One of my tasks will be to destroy this eagerly believed myth that Darwinism is a theory of “chance”.
It’s been my experience, in discussions about evolution with Christians, that they are shocked the arch-atheist himself would say such a thing. Why? Because Christians regularly misrepresent chance and probability in critiquing evolution. From Michael Denton (Denton, Evolution: A Theory in Crisis, p. 342.):
“Is it really credible that random processes could have constructed a reality, the smallest element of which—a functional protein or gene—is complex beyond our own creative capacities, a reality which is the very antithesis of chance, which excels in every sense anything produced by the intelligence of man?”
From Answers in Genesis:
“When there is more than one possible outcome and the outcome is not predetermined, probability can become a factor. In the case of evolution there is no pre-assigned chemical arrangement of amino acids to form a protein. Therefore, the formation of a biological protein is based on random chance.”
So the first thing Alexander, to his credit, does in this chapter is define his terms. In daily, common speech, the word “random” is often used to mean:
- Without order
- Without cause
- Without purpose
So naturally, especially using the last definition, if someone claims “evolution is a random process”, then it becomes purposeless by circular definition. But in this chapter Denis focuses on the mathematical and scientific understanding of “random”. In mathematics, randomness has a fairly clear meaning, although with some nuances and conditions. Mathematicians typically use the word “random” to describe processes in which multiple outcomes can occur and each is associated with a probability that gives the likelihood of that outcome. A coin toss, for example, is random in the sense that either heads or tails is a possible outcome and the probability of each is 50% (or, on a scale from 0 to 1, 0.5).
So if a string of numbers, let’s say 1-100, is random, then any single number has an equal probability of being selected. A traditional statistical approach, such as a Runs Test or Geary Test, examines the series to see whether it displays the property of randomness. A numeric sequence is said to be statistically random when it contains no recognizable patterns or regularities; sequences such as the results of an ideal dice roll or the digits of π exhibit statistical randomness.
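To make the idea of testing for statistical randomness concrete, here is a minimal sketch of the Wald–Wolfowitz runs test (one form of the “Runs Test” mentioned above) on a binary sequence. The function name and the example sequences are my own illustration, not from the book:

```python
import math

def runs_test_z(bits):
    """Wald-Wolfowitz runs test: z-score for the number of runs
    (maximal blocks of identical values) in a 0/1 sequence."""
    n1 = sum(bits)           # count of ones
    n0 = len(bits) - n1      # count of zeros
    runs = 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)
    expected = 1 + 2 * n0 * n1 / (n0 + n1)
    variance = (2 * n0 * n1 * (2 * n0 * n1 - n0 - n1)) / (
        (n0 + n1) ** 2 * (n0 + n1 - 1))
    return (runs - expected) / math.sqrt(variance)

# A perfectly alternating sequence has far too many runs to be random,
# so its z-score is huge; a genuinely random sequence scores near 0.
alternating = [0, 1] * 50
print(round(runs_test_z(alternating), 2))  # -> 9.85
```

A sequence can fail this test in either direction: too many runs (over-alternating, like the example) or too few (long streaks), and either way it betrays a pattern that a statistically random sequence should not have.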
Random processes are all around us, although when averaged together they lead to physical properties that are highly predictable and can be described by laws. “Brownian motion”, described by Robert Brown in 1827, referred to the jiggling of pollen grains and chalk dust suspended in water, caused by the random motion of the water molecules. The same kind of random motion applies to the atoms and molecules that comprise all liquids and gases. Boyle’s gas law – the pressure of a gas increases as its volume decreases at constant temperature, which some may remember from high school science class – depends on averaging out the random movements of trillions of gas molecules. There is no need to calculate the movement of each molecule separately – it is the average properties of very large numbers that count.
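The point that individually unpredictable motions average out into lawlike behavior can be sketched with a toy model: treat each “molecule” as a one-dimensional random walk. The numbers here (2,000 walkers, 400 steps) are arbitrary choices of mine for illustration:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

walkers = 2000   # number of simulated "molecules"
steps = 400      # each takes 400 random +/-1 steps

finals = [sum(random.choice((-1, 1)) for _ in range(steps))
          for _ in range(walkers)]

mean_disp = sum(finals) / walkers                     # individual jiggling...
mean_sq_disp = sum(p * p for p in finals) / walkers   # ...averages into law

print(mean_disp)      # close to 0: no preferred direction
print(mean_sq_disp)   # close to 400 (the step count): lawlike diffusion
```

No single walker’s endpoint can be predicted, yet the mean squared displacement reliably lands near the number of steps, which is exactly the kind of statistical regularity that gas laws and diffusion equations rest on.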
Alexander notes the various meanings of the word “chance” are even more slippery than the word “random”. He notes the following examples:
- “Is there any chance you can come for dinner tomorrow?” (possible availability)
- “There is a chance it might rain this afternoon and interrupt the match.” (event that depends on chaos theory)
- “I met William down at the shops today by chance.” (unexpected encounter)
- “I’m buying a ticket for the lottery even though I know my chances of winning are low.” (statistical improbability)
- “My chances of getting a first in Finals are really low.” (insufficient preparation)
Alexander assigns three main meanings of “chance” relevant to the present topic, broadly speaking:
- The first is sometimes called epistemological chance because it refers to all those events that are perfectly lawlike in how they happen, but about which we have insufficient knowledge of their antecedents to make predictions. If we knew all the antecedents involved – an incredibly large amount of complex information that we will never possess – it would be possible, in principle, to predict the outcome.
The lawlike behavior in epistemological chance is also useful because it allows precise predictions to be made about the properties of large numbers of chance events. Insurance actuarial data is the prime example of this. I might not know when I’m going to die, but life insurance companies can calculate the aggregate of the data that allows a profitable premium to be charged despite the ignorance of any particular premium-holder’s date of expiration.
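The actuarial point can be sketched numerically. The mortality rate, benefit, and premium below are hypothetical figures of my own, not from the book; the point is only how pool size tames individual unpredictability:

```python
import random

random.seed(1)  # reproducible sketch

# Hypothetical figures: 1% annual mortality, a $100,000 benefit,
# and a premium set 20% above the expected cost per policy
# (expected cost = 0.01 * 100_000 = 1_000).
p_death = 0.01
payout = 100_000
premium = 1_200

def underwriting_profit(n_policies):
    """Profit after one simulated year: premiums in, death claims out."""
    deaths = sum(random.random() < p_death for _ in range(n_policies))
    return n_policies * premium - deaths * payout

print(underwriting_profit(100))      # small pools swing wildly
print(underwriting_profit(100_000))  # large pools: the 20% margin holds
```

With 100 policies a single extra death wipes out the margin; with 100,000 policies the law of large numbers pins the death count so close to 1% that the premium loading is dependably profitable, despite total ignorance about any individual policyholder.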
- The second main type of chance we can call ontological chance, because there are no antecedents that could possibly be known that could enable a prediction, even in principle.
The classic example of ontological chance is radioactive decay. Nobody knows, even in principle, when or why any one particular radioisotope atom will emit a particle of radiation at one moment rather than another. But again, the law of large numbers allows us to calculate an accurate half-life for any radioisotope, so we can determine the time since that radioisotope was incorporated into a mineral. Ontological chance stems from quantum mechanics and the principle of quantum indeterminacy. A notable consequence of quantum indeterminacy is the Heisenberg uncertainty principle, which prevents the simultaneous accurate measurement of all of a particle’s properties.
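The half-life idea can be sketched with a simulation. The per-step decay probability and atom count below are arbitrary values of my own for illustration; the point is that an unpredictable per-atom event yields a precisely predictable aggregate:

```python
import math
import random

random.seed(0)  # reproducible sketch

decay_prob = 0.05   # hypothetical per-atom decay chance per time step
atoms = 50_000

# No one can say *which* atom decays in a given step, but the
# aggregate count follows a lawlike exponential decline.
remaining = atoms
steps_to_half = 0
while remaining > atoms // 2:
    remaining = sum(random.random() >= decay_prob for _ in range(remaining))
    steps_to_half += 1

# The half-life predicted analytically from the decay probability:
predicted = math.log(2) / -math.log(1 - decay_prob)

print(steps_to_half)        # simulated half-life, in steps
print(round(predicted, 2))  # -> 13.51 steps
```

The simulated count of steps lands right on the analytic prediction, which is why half-lives can be used as reliable geological clocks even though each individual decay is, on this view, ontologically chancy.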
Heisenberg’s uncertainty principle is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables – such as position and momentum – can be known simultaneously. It is now known that the uncertainty principle is inherent in the properties of all wave-like systems, and that it arises in quantum mechanics simply due to the matter-wave nature of all quantum objects. The uncertainty principle states a fundamental property of quantum systems; it is not a statement about the observational success of current technology, as was once thought. Einstein hated the idea, famously claiming that “God doesn’t play dice” – which prompted Niels Bohr to respond, “Stop telling God what to do”, a wry nod to the empirical data that was confirming the principle.
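For position and momentum, the limit takes the familiar form:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where Δx and Δp are the uncertainties in position and momentum and ħ is the reduced Planck constant. The product of the two uncertainties has a hard floor: squeezing one down forces the other up, no matter how good the measuring apparatus becomes.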
- The third type of chance we might call metaphysical chance. This is the idea that chance somehow rules over everything, almost as if it were an agency or metaphysical principle.
This aspect of chance seems to me to almost resemble the Fates, the three sister goddesses that appeared in Greek and Roman mythology and were believed to have “spun out” a child’s destiny at birth. They determined when life began, when it ended, and everything in between. Consider this quote, with respect to genetic mutations, from Monod’s book, Chance and Necessity, 1997, p. 110:
We say that these events are accidental, due to chance. And since they constitute the only possible source of modifications in the genetic text, itself the sole repository of the organism’s hereditary structures, it necessarily follows that chance alone is at the source of every innovation, of all creation in the biosphere. Pure chance, absolutely free but blind, at the very root of the stupendous edifice of evolution: this central concept of modern biology is no longer one among other possible or even conceivable hypotheses. It is, today, the sole conceivable hypothesis, the only one compatible with observed and tested fact. And nothing warrants the supposition (or the hope) that conceptions about this should, or ever could, be revised.
Except, of course, the scientific conclusions on the role of chance have indeed been massively revised by more recent scientific advances. And Monod wasn’t the first, and won’t be the last, scientist to wildly extrapolate from currently understood properties to conclusions that lie well beyond science, and are really metaphysical conclusions. As Alexander says:
“Suffice it to say that Chance is not an agency and doesn’t “do” anything. Chance is simply our way of describing our own position as observers in relation to various properties of matter, no more and no less. Despite this obvious fact, it is remarkable how often the language of “Chance as agent” creeps into otherwise sober scientific and philosophical texts.”
The last term Alexander defines, before he discusses these terms and their relation to evolution, is “chaos”. In common parlance, “chaos” is used to mean “without order”, e.g. “traffic today was absolute chaos because of the president’s visit”. The technical meaning of the term “chaos” is quite different. Chaos theory is particularly associated with the American mathematician and meteorologist Edward Lorenz (1917-2008).
In 1960 Lorenz created a weather model on his computer at the Massachusetts Institute of Technology. The model consisted of an extensive array of complex formulas, and some even hoped that Lorenz had built the ultimate weather predictor: if the input parameters were chosen identical to those of the real weather, it could mimic earth’s atmosphere and be turned into a precise prediction model. But Lorenz discovered that changing even the sixth decimal place in one of the variables could dramatically change the outcome. In other words, tiny differences in the starting conditions could make a major difference in the weather patterns several weeks later. This became known as “the butterfly effect”: if a butterfly flaps its wings in Brazil, it can affect the weather pattern in Texas some weeks later. So basically, the “chaos” in chaos theory means:
- A tiny difference in initial parameters will result in a completely different behavior of a complex system.
- The uncertainty principle prohibits perfectly accurate measurement. Therefore the initial state of a complex system cannot be determined exactly, and the evolution of the system can therefore not be accurately predicted.
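Lorenz’s sixth-decimal-place discovery is easy to reproduce with his famous three-variable system (a later, simpler model than the 1960 weather program). Below is a minimal sketch using a naive Euler integrator; the step size and step count are my own arbitrary choices for illustration:

```python
def lorenz_trajectory(x, y, z, steps=3000, dt=0.01,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with naive Euler steps
    (a sketch, not a production integrator)."""
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    return x, y, z

a = lorenz_trajectory(1.0, 1.0, 1.0)
b = lorenz_trajectory(1.000001, 1.0, 1.0)  # change in the sixth decimal place

print(a)
print(b)  # ends up in a completely different state from `a`
```

Both runs are perfectly deterministic and lawlike, yet the one-millionth change in the starting point is amplified exponentially until the two trajectories bear no resemblance to each other: deterministic, but unpredictable in practice.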
Next week, we will delve into how Alexander uses these terms when he is speaking about evolutionary biology and why he does not believe Darwinian evolution is a theory of chance. I will be interested to see if our astute commentators this week can predict what Denis will say based on these definitions of terms.