Radiometric dating has been used to determine the ages of the Earth, the Moon, and meteorites; the ages of fossils, including early humans; the timing of glaciations; the ages of mineral deposits; the recurrence rates of earthquakes and volcanic eruptions; the history of reversals of Earth's magnetic field; and many other geological events and processes.
K-Ar dating is based on measuring argon (Ar), the product of the radioactive decay of an isotope of potassium (K), a common element found in many materials, such as micas, clay minerals, tephra, and evaporites.
If any excess argon becomes trapped in the mineral during crystallization, the mineral will appear older than it actually is; if any argon is lost after the mineral crystallizes, the mineral will appear younger.
As you can imagine, the chaos of Earth systems can produce both scenarios, so geochronologists have developed techniques to verify (test) each assumption. Most people (geologists included) think of radiometric dating methods as a means to assign absolute ages to rocks/minerals.
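To make the excess-argon and argon-loss scenarios concrete, here is a minimal sketch (my own illustration, not from the post) of how the measured daughter/parent ratio translates into an apparent age. It uses the simplified generic parent-daughter form t = (1/λ)·ln(1 + daughter/parent); real K-Ar dating must additionally account for the branching decay of potassium-40, which produces both argon-40 and calcium-40.

```python
import math

# Simplified illustration of how excess or lost argon skews a K-Ar age.
# Assumes the generic parent-daughter relation t = (1/lambda) * ln(1 + D/P);
# the branching decay of 40K is deliberately ignored here.

HALF_LIFE_40K = 1.25e9                     # years (half-life of potassium-40)
DECAY_CONST = math.log(2) / HALF_LIFE_40K  # decay constant, per year

def apparent_age(daughter_parent_ratio):
    """Age in years implied by a measured daughter/parent ratio."""
    return math.log(1 + daughter_parent_ratio) / DECAY_CONST

true_ratio = 0.1
print(apparent_age(true_ratio))         # the age the true ratio implies
print(apparent_age(true_ratio * 1.2))   # excess argon -> apparently older
print(apparent_age(true_ratio * 0.8))   # argon loss   -> apparently younger
```

The asymmetry is exactly the one described above: any argon trapped at crystallization inflates the ratio and the apparent age, while any argon lost afterward deflates both.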
When living things die, they stop taking in carbon-14, and the radioactive clock is "set"!
If you're reading this now, however, you might be curious to reopen that box in an effort to follow my argument as I answer the title of this post (or, if nothing else, to avoid admitting that chemistry was "not really your thing").
Radioactive elements are unstable; they break down spontaneously into more stable atoms over time, a process known as radioactive decay.
Radioactive decay occurs at a constant rate, specific to each radioactive isotope.
Radioactive elements were incorporated into the Earth when the Solar System formed.
All rocks and minerals contain tiny amounts of these radioactive elements.
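The constant decay rate mentioned above is captured by the exponential decay law, N(t) = N₀·e^(−λt), where the decay constant λ relates to the half-life by λ = ln(2)/t½. A short sketch (my illustration, using the roughly 1.25-billion-year half-life of potassium-40):

```python
import math

def remaining_fraction(t, half_life):
    """Fraction of a radioactive parent isotope remaining after time t.

    Implements N(t)/N0 = exp(-lambda * t) with lambda = ln(2) / half_life.
    """
    decay_constant = math.log(2) / half_life
    return math.exp(-decay_constant * t)

# After one half-life, half the parent remains; after two, a quarter.
print(remaining_fraction(1.25e9, 1.25e9))  # ~0.5  (one 40K half-life)
print(remaining_fraction(2.50e9, 1.25e9))  # ~0.25 (two half-lives)
```

Because λ is specific to each isotope and does not change with temperature, pressure, or chemistry, measuring how much parent remains (or daughter has accumulated) lets you solve for t.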
For now, I wanted to consider an older article, only a page long, entitled "How do you date a New Zealand volcano?"