Chronometry

Darwin's first sketch of an evolutionary tree – from his First Notebook on the Transmutation of Species
The notes say ‘I think … case must be that one generation should have as many living as now. To do this and to have as many species in the same genus (as is) requires extinction. Thus between A + B an immense gap in relation. C + B the finest gradation. B + D rather greater distinction. Thus genera would be formed. Bearing relation (next page) to ancient types with several extinct forms.’
We are now able to date with increasing accuracy the major branches of evolutionary history.
Introduction – History of Chronometry
Chronometry, the science of measuring time and dating events, plays a crucial role in various disciplines such as archaeology, geology, astronomy, and history. The ability to accurately determine the age of events is essential for understanding the progression of Earth’s history and the timeline of human civilization. In recent years, there has been a significant increase in the accuracy of dating methods, leading to groundbreaking discoveries and a more comprehensive understanding of the past.
One of the most important advancements in chronometry has been the development of radiometric dating techniques. Radiometric dating relies on the principle of radioactive decay, where unstable isotopes of elements decay into more stable forms at a known rate. By measuring the ratio of parent to daughter isotopes in a sample, scientists can calculate the age of the sample with a high degree of accuracy. Radiometric dating has revolutionized the field of geology, providing precise dates for the formation of rocks and minerals, as well as the age of the Earth itself.
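To make this concrete: if the parent isotope decays with decay constant λ (equal to ln 2 divided by the half-life), and the sample started with no daughter isotope and has remained a closed system, its age is t = ln(1 + D/P) / λ, where D/P is the measured daughter-to-parent ratio. The short Python sketch below (hypothetical function name, simplified closed-system assumptions, not any particular laboratory's procedure) simply illustrates this arithmetic.

import math

def age_from_isotope_ratio(daughter_to_parent, half_life_years):
    # Estimate a sample's age from its measured daughter/parent isotope ratio.
    # Assumes a closed system with no initial daughter isotope; real geochronology
    # corrects for initial daughter content (e.g. with isochron methods).
    decay_constant = math.log(2) / half_life_years            # lambda = ln 2 / half-life
    return math.log(1 + daughter_to_parent) / decay_constant  # t = ln(1 + D/P) / lambda

# Example with equal parts parent and daughter: the age equals one half-life.
# (Illustrative half-life of about 1.25 billion years, as for potassium-40.)
print(age_from_isotope_ratio(1.0, 1.25e9))  # roughly 1.25 billion years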
Another key development in chronometry has been the refinement of dating methods in archaeology. In the past, archaeologists relied on relative dating techniques such as stratigraphy and typology to estimate the age of artifacts and archaeological sites. While these methods were useful for placing events in a relative timeline, they lacked precision in terms of absolute dates. However, with the advent of radiocarbon dating, archaeologists were able to obtain precise dates for organic materials such as bone, charcoal, and plant remains, revolutionizing the study of ancient civilizations.
In recent years, advances in technology have further improved the accuracy of dating methods across various disciplines. For example, the development of high-precision mass spectrometry techniques has allowed scientists to analyze samples with unprecedented sensitivity and accuracy. This has led to a more precise determination of isotopic ratios in samples, resulting in more accurate age estimates. Additionally, improvements in data analysis algorithms and statistical methods have enhanced the reliability of dating results, reducing uncertainties and errors in age determinations.
The increased accuracy of dating methods has had profound implications for our understanding of Earth’s history and the timeline of human civilization. In the field of geology, precise dating of rocks and minerals has provided valuable insights into the timing of major geological events such as volcanic eruptions, earthquakes, and mountain formation. This has helped geologists reconstruct the geologic history of specific regions and understand the processes that have shaped Earth’s surface over millions of years.
In archaeology, accurate dating has allowed researchers to establish chronological sequences of ancient civilizations and track changes in material culture over time. Radiocarbon dating, in particular, has proven to be a powerful tool for determining the age of artifacts and archaeological sites, shedding light on the development of human societies and the interactions between different cultural groups. By combining radiocarbon dating with other methods such as dendrochronology (tree-ring dating) and luminescence dating, archaeologists can construct detailed timelines of past events with a high degree of precision.
In astronomy, chronometry plays a crucial role in dating celestial events such as the formation of stars, galaxies, and planets. By analyzing the light emitted by distant objects in the universe, astronomers can determine the age of these objects and the timing of key cosmic events. Advances in observational techniques and data analysis have enabled astronomers to probe the origins of the universe and unravel the mysteries of its evolution with unprecedented accuracy.
Overall, the recent increase in accuracy of dating methods has opened up new avenues for research and discovery across a wide range of disciplines. By providing precise dates for events in Earth’s history and the timeline of human civilization, chronometry has deepened our understanding of the past and shed light on the forces that have shaped our world. As technology continues to advance and new dating techniques are developed, we can expect even greater insights into the mysteries of time and the events that have defined our planet.
The Chronometric Revolution
Three centuries ago our collective human knowledge of ancient history – meaning the period to about 1000 BCE – was restricted to four written sources: the Indian Vedas, the Chinese Five Classics, the Hebrew Bible and the Greek poet Homer.
The Bible, it was assumed, presented literal historical truth, and with little science to prove otherwise, the claim by James Ussher (1581-1656), Archbishop of Armagh, that the world was created in 4004 BCE was widely believed.
Archaeology changed all this when French linguists in the 1820s unlocked the secrets of Egyptian hieroglyphics, taking (Western) history back to about 3000 BCE. British scholars of the 1840s then mastered the Old Persian, Assyrian, and Babylonian languages of Mesopotamia. The geological evidence of the rocks, long dismissed on religious grounds, gained scientific acceptance with the stratigraphy of the 1920s. The 19th century unearthed pre-classical civilizations and opened the investigation of human evolution. Since WWII there has been a little-acknowledged Chronometric Revolution, as scientific technology has expanded our biologically given senses at both the macro- and micro-scales.
‘. . . at the end of the nineteenth century it was still impossible to assign reliable absolute dates to any events before the appearance of the first written records’ but ‘There now exist no serious intellectual or scientific or philosophical barriers to a broad unification of historical scholarship’.
The Chronometric Revolution over the last few decades has allowed us to date the age of the universe, individual rocks and fossils, archaeological remains, and the divergence of lineages in biological evolution.
In 1905 Ernest Rutherford pioneered the study of radioactive decay over time, work that would develop after 1945 into what we now call radiometric dating. Geochronology, the dating of rocks, relies on the constant rate of decay of radioactive impurities. The half-life of the radioactive isotope carbon-14 (C14) is relatively short at 5,730 years, which makes it reliable for dating organic remains up to about 50,000 years old, a span that conveniently covers a large portion of the time during which anatomically modern humans migrated across the globe.
Willard Libby pioneered radiocarbon dating in the 1940s and received the Nobel Prize for this work in 1960. This has transformed the study of archaeology. Accelerator mass spectrometry has extended this range to about 80,000 years, and the datable time span was expanded again with the advent of thermoluminescence, which can date objects up to several hundred thousand years old.
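To see where the roughly 50,000-year limit of conventional radiocarbon dating comes from, here is a minimal back-of-the-envelope sketch in Python (hypothetical function name, uncalibrated ages, using the 5,730-year half-life quoted above): after about nine half-lives only around 0.2 per cent of the original carbon-14 remains, which is why the method fades out near that age.

import math

C14_HALF_LIFE_YEARS = 5730  # half-life of carbon-14 quoted above

def radiocarbon_age(fraction_modern):
    # Uncalibrated radiocarbon age from the fraction of carbon-14 remaining,
    # i.e. the measured C14 level relative to the modern standard. Real dates
    # are then calibrated against tree-ring and other records; that step is
    # omitted here.
    return -C14_HALF_LIFE_YEARS / math.log(2) * math.log(fraction_modern)

print(radiocarbon_age(0.25))    # two half-lives: about 11,460 years
print(radiocarbon_age(0.0024))  # ~0.2% remaining: close to 50,000 years, the practical limit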
Potassium–argon dating can measure the ages of rocks across most of the Earth’s history and is therefore an invaluable tool for geology.
Since the elucidation of the structure of DNA by Watson and Crick in 1953, genetic analysis has improved so much that we can now date with increasing accuracy the times of divergence of branches on the evolutionary tree.
The discovery of the cosmic microwave background radiation in 1964, and its subsequent precise measurement, has enabled us to date the age of the universe at 13.8 billion years.
The psychological and intellectual leap that has occurred since the 19th century is vast, moving from a ‘biblical framework’ to one in which:
. . . the universe began 13.72 billion years ago, life 4 billion years ago, hominins 6 million years ago, Homo 2 million years ago, anatomically modern humans c. 300 kya, behaviourally modern humans (about) 100 kya, settled agriculture 10 kya, cities 5 kya – none of these formerly known or even guessed.[7]
Timeline of Chronometry
1650 – Archbishop of Armagh James Ussher (1581-1656) dates the Creation to around 6 pm on 22 October 4004 BCE according to the proleptic Julian calendar
1800 – collective human knowledge of ancient history is still restricted to, essentially, four written sources: the Indian Vedas, the Chinese Five Classics, the Hebrew Bible, and the Greek poet Homer
1820s – Egyptian hieroglyphics deciphered, taking known history back to about 3000 BCE
1840s – British scholars master Old Persian, Assyrian, and Babylonian languages of Mesopotamia
1900 – still impossible to assign reliable dates to any events prior to the written record
1905 – Ernest Rutherford pioneers the study of radioactive decay as a means of dating
1939 – Americans L.W. Alvarez and Robert Cornog first use an accelerator mass spectrometer (AMS) in a cyclotron to demonstrate that helium-3 (3He) was stable and that the other mass-3 isotope, tritium (3H), was radioactive. AMS uses smaller sample sizes (c. 50 mg) and has become an advance on radiocarbon dating, covering samples ranging from around 100 to 80,000 years old
1940s – radiocarbon dating pioneered by Willard Libby (who receives the Nobel Prize for this work in 1960). This transforms archaeology, allowing key transitions in prehistory to be dated, e.g. the end of the last ice age and the commencement of the Neolithic and Bronze Age in different regions; the same principle of constant radioactive decay underlies geochronology, the dating of rocks. The half-life of the radioactive isotope carbon-14 (C14) is relatively short at 5,730 years, making it reliable for dating organic remains up to 50,000 years old (conveniently covering much of the period when anatomically modern humans migrated across the globe)
1962 – Émile Zuckerkandl and Linus Pauling notice that the number of amino acid differences in hemoglobin between different lineages changes roughly linearly with time, as estimated from fossil evidence. They generalized this observation to assert that the rate of evolutionary change of any specified protein was approximately constant over time and over different lineages (known as the molecular clock hypothesis, gene clock, or evolutionary clock). It is an important tool in molecular systematics used to determine scientific classifications and study variation in selective forces. Knowledge of rates of molecular evolution in particular lineages also helps estimate the dates of phylogenetic events, including those not documented by fossils. However, as yet it is limited and estimates may vary by 50% or more.
1964 – discovery of the cosmic microwave background radiation; its subsequent precise measurement enables the age of the universe to be dated at 13.8 billion years
1967 – A.C. Wilson promotes the idea of a ‘molecular clock’ – a technique that uses the mutation rate of biomolecules (usually nucleotide sequences for DNA or RNA, or amino acid sequences for proteins) to deduce the time in prehistory when two or more life forms (lineages) diverged. Essentially a way of dating the branching points of the evolutionary tree; a minimal numerical sketch follows this timeline
1969 – molecular clocking applied to anthropoid evolution. V. Sarich and A.C. Wilson find that albumin and hemoglobin have comparable rates of evolution, indicating that chimps and humans split about 4 to 5 million years ago; molecular anthropology has since been extremely useful in establishing the evolutionary tree of humans and other primates
late 1970s – mitochondrial DNA becomes an area of research in phylogenetics; unlike nuclear (genomic) DNA, it offers the advantage of not undergoing recombination
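The numerical sketch promised in the 1967 entry: a rough Python illustration (invented numbers for illustration only, not the actual albumin or hemoglobin data of Sarich and Wilson) of how a molecular clock estimate works. A split whose date is known independently, for example from fossils, calibrates the substitution rate; that rate is then applied to the genetic distance between another pair of lineages. As the 1962 entry notes, real estimates can vary by 50% or more.

def calibrated_rate(distance, known_divergence_time_years):
    # Substitution rate per site, per lineage, per year, from a split whose date
    # is known independently (e.g. from fossils). The factor of 2 appears because
    # both lineages accumulate changes after they diverge.
    return distance / (2 * known_divergence_time_years)

def divergence_time(distance, rate):
    # Estimated time since two lineages split, assuming a constant, clock-like rate.
    return distance / (2 * rate)

# Hypothetical numbers for illustration only: calibrate on a split dated by fossils
# to about 25 million years ago that shows 5% sequence divergence ...
rate = calibrated_rate(0.05, 25e6)
# ... then date another pair of lineages showing 1% divergence.
print(divergence_time(0.01, rate))  # about 5 million years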