Coming of Age in the Milky Way
Timothy Ferris

At McGill University in Montreal, the energetic experimentalist Ernest Rutherford, a great bear of a man whose roaring voice sent his assistants and their laboratory glassware trembling, found that radioactive materials can produce surprisingly large amounts of energy. A lump of radium, Rutherford established, generates enough heat to melt its weight in ice every hour, and can continue to do so for a thousand years or more. Other radioactive elements last even longer; some keep ticking away at an almost undiminished rate for billions of years.

This, then, was the answer to Kelvin, and one that spelled deliverance for the late Charles Darwin: The earth stays warm because it is heated by radioactive elements in the rocks and molten core of the globe. As Rutherford wrote:

The discovery of the radioactive elements, which in their disintegration liberate enormous amounts of energy, thus increases the possible limit of the duration of life on this planet, and allows the time claimed by the geologist and biologist for the process of evolution.40

 

Understandably pleased with this conclusion, the young Rutherford rose to address a meeting of the Royal Institution, only to find himself confronted by the one scientist in the world his paper could most deeply offend:

I came into the room, which was half dark, and presently spotted Lord Kelvin in the audience and realized that I was in for trouble at the last part of my speech dealing with the age of the earth, where my views conflicted with his. To my relief, Kelvin fell fast asleep, but as I came to the important point, I saw the old bird sit up, open an eye and cock a baleful glance at me! Then a sudden inspiration came, and I said Lord Kelvin had limited the age of the earth, provided no new source [of energy] was discovered. That prophetic utterance refers to what we are now considering tonight, radium! Behold! the old boy beamed upon me.41

 

Radioactive materials not only testified to the antiquity of the earth, but provided a way of measuring it as well. Rutherford’s biographer A. S. Eve recounts an exchange that signaled this new insight:

About this time Rutherford, walking in the Campus with a small black rock in his hand, met the Professor of Geology. “Adams,” he said, “how old is the earth supposed to be?” The answer was that various methods lead to an estimate of one hundred million years. “I know,” said Rutherford quietly, “that this piece of pitchblende is seven hundred million years old.”42

 

What Rutherford had done was to determine the rate at which the radioactive radium and uranium in the rock gave off what he called alpha particles, which are the nuclei of helium atoms, and then to measure the amount of helium in the rock. The result, seven hundred million years, constituted a reasonably reliable estimate of how long the radioactive materials had been in there, emitting helium.
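
In modern terms, Rutherford’s estimate is a simple accumulation calculation: if none of the helium has leaked out of the rock, its age is roughly the amount of trapped helium divided by the rate at which the radium and uranium produce it. The short Python sketch below illustrates that arithmetic; the numbers are invented for illustration and are not Rutherford’s actual measurements.

```python
# Helium-accumulation dating, sketched with invented numbers.
# Key assumption: all helium produced by alpha decay has stayed trapped in the rock.

helium_trapped = 7.0e-3      # total helium in the sample (arbitrary units)
production_rate = 1.0e-11    # helium generated per year by the radium and uranium present

age_years = helium_trapped / production_rate
print(f"Estimated age: {age_years:.2e} years")  # 7.00e+08, i.e. roughly 700 million years
```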

Rutherford had taken a first step toward the science of radiometric dating. Every radioactive substance has a characteristic half-life, during which time half of the atoms in any given sample of that element will decay into another element. By comparing the abundance of the original (or “parent”) isotope with that of the decay product (or “daughter”), it is possible to age-date the stone or arrowhead or bone that contains the parent and daughter isotopes.
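
Stated as a formula (an addition of mine, not Ferris’s wording, and one that assumes the sample started out with parent atoms only, no daughter), the decay law and the age it yields are:

$$N_{\mathrm{parent}}(t) = N_0 \left(\tfrac{1}{2}\right)^{t/t_{1/2}}, \qquad N_{\mathrm{daughter}}(t) = N_0 - N_{\mathrm{parent}}(t),$$

$$t = t_{1/2}\,\log_2\!\left(\frac{N_{\mathrm{parent}} + N_{\mathrm{daughter}}}{N_{\mathrm{parent}}}\right),$$

where $t_{1/2}$ is the half-life and $N_0$ the number of parent atoms originally present; measuring today’s parent-to-daughter ratio fixes $t$.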

Carbon-14 is especially useful in this regard, since every living thing on Earth contains carbon. The half-life of carbon-14 is 5,570 years, meaning that after 5,570 years half of the carbon-14 atoms in any given sample will have decayed into atoms of nitrogen-14. If we examine, say, the remains of a Navaho campfire and find that half the carbon-14 in the charred remains of the burnt logs has decayed into nitrogen-14, we can conclude that the fire was built 5,570 years ago. If three quarters of the carbon-14 has turned to nitrogen, then the logs are twice as old—11,140 years—and so forth. After about five half-lives the amount of remaining parent isotope generally has become too scanty to be measured reliably, but geologists have recourse to other, more long-lived radioactive elements. Uranium-238, for one, has a half-life of over 4 billion years, while the half-life of rubidium-87 is a methuselian 47 billion years.
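
A minimal Python sketch of that arithmetic, using the 5,570-year half-life quoted in the text (modern references put the figure nearer 5,730 years); the function name is mine, chosen for illustration:

```python
import math

# Half-life of carbon-14 as quoted in the text; modern references give about 5,730 years.
C14_HALF_LIFE_YEARS = 5570

def age_from_remaining_fraction(fraction_remaining: float) -> float:
    """Years elapsed, given the fraction of the original carbon-14 still present."""
    return C14_HALF_LIFE_YEARS * math.log2(1.0 / fraction_remaining)

print(age_from_remaining_fraction(0.5))   # 5570.0  -> half decayed: one half-life
print(age_from_remaining_fraction(0.25))  # 11140.0 -> three quarters decayed: two half-lives
```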

In practice, radiometric dating is a subtle process, fraught with potential error. First one has to ascertain when the clock started. In the case of carbon-14, this is usually when the living tissue that contained it died. Carbon-14 is constantly being produced by the collision of high-energy subatomic particles from space with atoms in the earth’s upper atmosphere. Living plants and animals ingest carbon-14, along with other forms of carbon, only so long as they live. The scientist who comes along years later to age-date their remains is, therefore, reading a clock that started when the host died. The reliability of the process depends upon the assumption that the amount of ambient carbon-14 in the environment at the time was roughly the same as it is today. If not—if, for instance, a storm of subatomic particles from space happened to increase the amount of carbon-14 around thousands of years ago—then the radiometric date will be less accurate. In the case of inorganic materials, one may be dealing with radioactive atoms older than the earth itself; their clocks may have started with the explosion of a star that died when the sun was but a gleam in a nebular eye. But if such intricacies complicate the process of radiometric age-dating, they also hint at the extraordinary range of its potential applications, in fields ranging from geology and geophysics to astrophysics and cosmology.

The process of radiometrically age-dating geological strata got under way only ten years after the discovery of radioactivity itself, when the young British geologist Arthur Holmes, in his book The Age of the Earth, correlated the ages of uranium-bearing igneous rocks with those of adjacent fossil-bearing sedimentary strata. By the 1920s it was becoming generally accepted by geologists, physicists, and astronomers that the earth is billions of years old and that radiometric dating presents a reliable way of measuring its age. Since then, ancient rocks in southwestern Greenland have been radiometrically age-dated at 3.7 billion years, meaning that the crust of the earth can be no younger than that. Presumably the planet is older still, having taken time to cool from a molten ball and form a crust. Moon rocks collected by the Apollo astronauts are found to be nearly 4.6 billion years old, about the same age as meteorites—chunks of rock that once were adrift in space and since have been swept up by the earth in its orbit around the sun. It is upon this basis that scientists generally declare the solar system to be some 5 billion years old, a finding that fits well with the conclusions of astrophysicists that the sun is a normal star about halfway through a 10-billion-year lifetime.

When nuclear fission, the production of energy by splitting nuclei, was detailed by the German chemists Otto Hahn and Fritz Strassmann in 1938, and nuclear fusion, which releases energy by combining nuclei, was identified by the American physicist Hans Bethe the following year, humankind could at last behold the mechanism that powers the sun and the other stars. In the general flush of triumph, few paid attention to the dismaying possibility that such overwhelming power might be set loose with violent intent on the little earth. Einstein, for one, assumed that it would be impossible to make a fission bomb; he compared the problem of inducing a chain reaction to trying to shoot birds at night in a place where there are very few birds. He lived to learn that he was wrong. The first fission (or “atomic”) bomb was detonated in New Mexico on July 16, 1945, and two more were dropped on the cities of Hiroshima and Nagasaki a few weeks later. The first fusion (or “hydrogen”) bomb, so powerful that it employed a fission weapon as but its detonator, was exploded in the Marshall Islands on November 1, 1952.

A few pessimists had been able to peer ahead into the gloom of the nuclear future, though their words went largely unheeded at the time. Pierre Curie had warned of the potential hazards of nuclear weapons as early as 1903. “It is conceivable that radium in criminal hands may become very dangerous,” said Curie, accepting the Nobel Prize.* “… Explosives of great power have allowed men to do some admirable works. They are also a terrible means of destruction in the hands of the great criminals who lead nations to war.”43
Arthur Stanley Eddington, guessing that the release of nuclear energy was what powered the stars, wrote in 1919 that “it seems to bring a little nearer to fulfillment our dream of controlling this latent power for the well-being of the human race—or for its suicide.”44
These and many later admonitions notwithstanding, the industrialized nations set about building bombs just as rapidly as they could, and by the late 1980s there were over fifty thousand nuclear weapons in a world that had grown older if little wiser. Studies indicated that the detonation of as few as 1 percent of these warheads would reduce the combatant societies to “medieval” levels, and that climatic effects following a not much larger exchange could lead to global famine and the potential extinction of the human species. The studies were widely publicized, but years passed and the strategic arsenals were not reduced.

It was through the efforts of the bomb builders that Darwin’s century-old theory of the origin of coral atolls was at last confirmed. Soon after World War II, geologists using tough new drilling bits bored nearly a mile down into the coral of Eniwetok Atoll and came up with volcanic rock, just as Darwin had predicted. The geologists’ mission, however, had nothing to do with evolution. Their purpose was to determine the structure and strength of the atoll before destroying it, in a test of the first hydrogen bomb. When the bomb was detonated, its fireball vaporized the island on which it had been placed, tore a crater more than a mile wide in the ocean floor, and sent a cloud of freshly minted radioactive atoms wafting across the paradisiacal islands downwind. President Truman in his final State of the Union message declared that “the war of the future would be one in which Man could extinguish millions of lives at one blow, wipe out the cultural achievements of the past, and destroy the very structure of civilization.

“Such a war is not a possible policy for rational men,” Truman added.45 Nonetheless, each of the next five presidents who succeeded him in office found it advisable to threaten the Soviets with the use of nuclear weapons. As the British physicist P. M. S. Blackett observed, “Once a nation pledges its safety to an absolute weapon, it becomes emotionally essential to believe in an absolute enemy.”46

Einstein, sad-eyed student of human tragedy, closed the circle of evolution, thermodynamics, and nuclear fusion in a single sentence. “Man,” he said, “grows cold faster than the planet he inhabits.”47

*
Though Darwin, echoing Newton, characterized much of his research as purely inductive—“I worked on true Baconian principles,” he said of his account of evolution, “and without any theory collected facts on a wholesale scale”—this has always been a difficult claim to justify scrupulously, and Darwin formulated his theory of coral atoll formation while still in South America, before he ever laid eyes on a real atoll.

*
The rise in animal breeding was spurred on by the growing industrialization of England, which brought working people in from the country, where they could keep a few barnyard animals of their own, to the cities, where they were fed from ever larger herds bred to maximize profits. More generally, the advent of Darwinism itself might be said to have been fostered by a certain distancing of human beings from the creatures they studied; it was only once people stopped cohabiting with animals that they began to entertain the idea that they were the animals’ relations.

*
Malthus, incidentally, appears to have been inspired in part by reading Darwin’s grandfather Erasmus. It’s a small world, or was so in Victorian England.

*
A striking example of adaptive color change occurred among British peppered moths in the vicinity of Manchester. In the eighteenth century, all such moths collected were pallid in color; in 1849 a single black moth was caught in the vicinity, and by the 1880s the black moths were in the majority. Why? Because industrial pollution had blackened tree trunks in the vicinity, robbing the original moths of their camouflage while bestowing its benefits upon the few black moths there. Once pollution-control ordinances came into effect, the soot slowly washed from the tree trunks and the pale peppered moth population rebounded.

*
It had been Wallace’s misfortune, however, to lose his specimens in a fire at sea. Watching from an open lifeboat as the blazing ship sank beneath the waves, Wallace recalled, “I began to feel the greatness of my loss. … I had not one specimen to illustrate the unknown lands I had trod, or to call back the recollection of the wild scenes I had beheld! But such regrets were vain … and I tried to occupy myself with the state of things which actually existed.”24

*
Readers who tire of the details they encounter in the Origin may take comfort in considering that until he was interrupted by Wallace’s letter, Darwin had intended to include a great many more of them. “To treat this subject properly, a long catalogue of dry facts ought to be given,” he wrote, in Chapter Two of the Origin, “but these I shall reserve for a future work.”28 He kept this promise in his exhaustive, not to say exhausting, book The Variation of Animals and Plants Under Domestication.

*
Not long before Röntgen’s discovery, Frederick Smith at Oxford was informed by an assistant that photographic plates stored near a cathode-ray tube were being fogged; but Smith, rather than pondering the matter, simply ordered that the plates be kept somewhere else.

*
Curie’s wife Marie, winner of two Nobel Prizes, died of the effects of radiation contracted in years of experimental research into radioactive isotopes. Her laboratory apparatus and even her cookbooks at home, inspected fifty years later, were found to be contaminated by lethal radiation.
