Radiometric dating methods have long been a target of young-earth creationists, and for good reason.
Rock ages obtained by these dating methods, usually ranging from millions to billions of years, directly contradict belief in a 6,000-year-old earth. For years, the young-earth community attempted to discredit radiometric dating by essentially claiming that very little nuclear decay has occurred since the formation of the earth.
However, this strategy began to change in 1997, when Dr. Steve Austin, Dr. John Baumgardner, Dr. Eugene Chaffin, Dr. Don DeYoung, Dr. Russell Humphreys, and Dr. Andrew Snelling, all prominent young-earth scientists, met to discuss the “problem” with radiometric dating. The ensuing eight-year research program, called RATE (Radioisotopes and the Age of The Earth), acknowledged that far more nuclear decay has occurred in most geological processes than could be explained by an earth only a few thousand years old. In effect, young-earth creationists of the 21st century finally accepted what mainstream science had known since the early 20th century, namely that nuclear decay was the best and perhaps only viable explanation for the isotopic patterns observed in rocks and minerals today.
Conceding the occurrence of billions of years’ worth of nuclear decay created a major dilemma for people believing in a 6,000-year-old earth. The only possible solution, apart from abandoning a young-earth position altogether, was to postulate that nuclear decay rates were accelerated in the recent past. The goal of the RATE project was to find evidence to this end.
To test the hypothesis, researchers sought cases in which nuclear decay could be compared against some other natural phenomenon. Think of radioactive nuclei as a clock that ticks (i.e. decays) at a known rate. The only way to demonstrate that nuclear processes “ticked” faster in the past was to compare their decay rates to another, more accurate clock.
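As an aside, the “ticking” of the nuclear decay clock can be sketched with the standard exponential decay law. The short Python sketch below is purely illustrative (the starting uranium inventory is made up, and this is not part of the RATE analysis); it shows how a known half-life converts elapsed time into an amount of accumulated helium.

```python
import math

# Illustrative sketch of the "nuclear decay clock": U-238 decays with a
# known half-life, and each complete decay chain to Pb-206 deposits
# 8 alpha particles (helium nuclei) in the host mineral.
HALF_LIFE_U238 = 4.468e9                 # years
LAMBDA = math.log(2.0) / HALF_LIFE_U238  # decay constant, 1/years

def helium_produced(n_uranium_initial, t_years):
    """Helium atoms generated by n_uranium_initial U-238 atoms after t_years."""
    decayed = n_uranium_initial * (1.0 - math.exp(-LAMBDA * t_years))
    return 8.0 * decayed  # 8 He atoms per completed decay chain

# With a made-up starting inventory, after 1.5 billion years roughly a
# fifth of the uranium has decayed:
n0 = 1.0e6
frac_decayed = helium_produced(n0, 1.5e9) / (8.0 * n0)
```

Reading this clock is straightforward because the decay constant is known to high precision; the dispute is entirely over how to read the second clock.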
Most of the cases documented by the RATE team proved to be weak tests for their hypothesis. The notable exception was a helium diffusion experiment using zircon mineral samples from deep geothermal wells in Fenton Hill, New Mexico. The RATE team claimed that when they compared the nuclear decay clock with their helium diffusion clock, they found a large discrepancy. Apparently, the nuclear decay clock recorded an elapsed time of over a billion years, whereas their helium diffusion clock recorded an elapsed time of only a few thousand years. Taking the helium diffusion time as the more reliable measurement, the researchers claimed that they had found convincing evidence for accelerated nuclear decay.
However, this apparent result is not as simple as merely reading time from a stopwatch. The helium diffusion clock used by the RATE team was actually a complex mathematical model describing the process of helium diffusion from zircon crystals. One may legitimately ask, “How well did they read their diffusion clock?” After following their research for many years, I conclude that they read this clock poorly. The RATE study contained at least five specific flaws in the data analysis and modeling that were serious enough to invalidate their conclusions. Let’s focus on the two biggest errors.
First, the RATE model used a constant temperature over time. Several lines of geologic evidence indicate that the thermal history of Fenton Hill has been anything but uniform. Recent (geologically speaking) volcanic activity has raised the ground temperature at the site to over twice the typical value across the continent. These elevated temperatures have been sustained for a relatively short period of time on a geologic timescale. Therefore, the use of a constant temperature by the RATE team demonstrates a misunderstanding of the thermal history of the site. The following figure contrasts their constant temperature profile to a realistic time-dependent one.
As shown, the temperature over the last 500 million years was well below the current temperature.
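To see why the assumed temperature history matters so much, note that helium diffusion follows an Arrhenius law and is therefore exponentially sensitive to temperature. The sketch below uses illustrative parameter values (not measured values for Fenton Hill zircon) to show how strongly a doubled ambient temperature inflates the diffusion rate.

```python
import math

R = 8.314     # molar gas constant, J/(mol K)
EA = 140.0e3  # activation energy, J/mol (illustrative, not Fenton Hill's)
D0 = 1.0e-4   # frequency factor, m^2/s (illustrative)

def diffusivity(temp_c):
    """Arrhenius law: D(T) = D0 * exp(-Ea / (R * T))."""
    temp_k = temp_c + 273.15
    return D0 * math.exp(-EA / (R * temp_k))

# Holding the site at today's elevated temperature for all of history,
# rather than at a cooler long-term value, changes the diffusivity by
# several orders of magnitude:
ratio = diffusivity(200.0) / diffusivity(100.0)
```

Assuming today’s volcanically elevated temperature over the entire history of the site therefore grossly overestimates how quickly helium should have escaped, which in turn makes the diffusion “clock” appear to read far too young.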
The second error committed by the RATE research team was more subtle. The modeling of the helium diffusion clock required an underlying model for the helium diffusion kinetics (i.e. the manner in which temperature affects the motion of atoms). Using data from a laboratory experiment in which gas released from a zircon sample was measured at different temperatures, they extracted the parameters for a simple kinetic model. The problem with this model is that it treated all helium atoms the same, regardless of whether they were in the bulk crystal or near a defect. Most helium atoms will lie in portions of the undisturbed crystal, whereas only a small fraction will lie in the vicinity of a defect. At low temperatures, the small fraction of atoms near a defect will be mobile, whereas the vast majority of atoms will only begin to move at higher temperatures.
Essentially, there are distinct populations of helium atoms in the solid, each with different diffusion properties. Many leading scientists in the noble gas thermochronology field use more complex diffusion models that take this effect into account. Not only did the RATE researchers choose a simplistic model, but their silence on the subject also suggests that they were unaware that alternate kinetic models exist.
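A minimal sketch of this multi-domain idea, with invented parameter values for the two populations (a small, weakly bound fraction near defects and the tightly bound bulk crystal), shows how the dominant source of released helium switches with temperature.

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def arrhenius(d0, ea, temp_k):
    """Arrhenius diffusivity D(T) = D0 * exp(-Ea / (R * T))."""
    return d0 * math.exp(-ea / (R * temp_k))

# Two illustrative helium populations (parameters invented, not RATE's):
DEFECT_FRAC = 0.02          # small fraction of atoms near crystal defects
DEFECT = (1.0e-6, 80.0e3)   # (D0 in m^2/s, Ea in J/mol): mobile at low T
BULK = (1.0, 170.0e3)       # bulk crystal: mobile only at high T

def flux_shares(temp_k):
    """Fraction of the total helium release coming from each population."""
    f_defect = DEFECT_FRAC * arrhenius(*DEFECT, temp_k)
    f_bulk = (1.0 - DEFECT_FRAC) * arrhenius(*BULK, temp_k)
    total = f_defect + f_bulk
    return f_defect / total, f_bulk / total

# At low temperature the defect-hosted atoms dominate the release;
# at high temperature the bulk crystal takes over.
low_t_defect_share, _ = flux_shares(400.0)   # ~127 C
_, high_t_bulk_share = flux_shares(900.0)    # ~627 C
```

A single-domain fit to laboratory release data therefore blends two very different behaviors into one set of parameters, which is the core of the criticism above.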
Taken together, these two errors alone are serious enough to invalidate the helium diffusion argument for accelerated nuclear decay. However, the question still remains as to whether the existing data can be reconciled with an old earth. This topic will be the subject of next week’s article.
See the RATE project reports here:
Dr. Gary H. Loechelt
Dr. Loechelt received his doctorate in the science and engineering of materials from Arizona State University in 1995, and currently works as an electrical engineer at ON Semiconductor in Phoenix, Arizona. This work was conducted on his personal time and does not reflect the views or business interests of his employer.