When the skies are clear and the Moon is not too bright, the Reverend Robert Evans, a quiet and cheerful man, lugs a bulky telescope onto the back sun-deck of his home in the Blue Mountains of Australia, about 80 kilometres west of Sydney, and does an extraordinary thing. He looks deep into the past and finds dying stars.
Looking into the past is, of course, the easy part. Glance at the night sky and what you see is history and lots of it—not the stars as they are now but as they were when their light left them. For all we know, the North Star, our faithful companion, might actually have burned out last January or in 1854 or at any time since the early fourteenth century and news of it just hasn’t reached us yet. The best we can say—can ever say—is that it was still burning on this date 680 years ago. Stars die all the time. What Bob Evans does better than anyone else who has ever tried is spot these moments of celestial farewell.
By day, Evans is a kindly and now semi-retired minister in the Uniting Church in Australia, who does a bit of locum work and researches the history of nineteenth-century religious movements. But by night he is, in his unassuming way, a titan of the skies. He hunts supernovae.
A supernova occurs when a giant star, one much bigger than our own Sun, collapses and then spectacularly explodes, releasing in an instant the energy of a hundred billion suns, burning for a time more brightly than all the stars in its galaxy. “It’s like a trillion hydrogen bombs going off at once,” says Evans. If a supernova explosion happened in our corner of the cosmos, we would be goners, according to Evans—“it would wreck the show,” as he cheerfully puts it. But the universe is vast and supernovae are normally much too far away to harm us. In fact, most are so unimaginably distant that their light reaches us as no more than the faintest twinkle. For the month or so that they are visible, all that distinguishes them from the other stars in the sky is that they occupy a point of space that wasn’t filled before. It is these anomalous, very occasional pricks in the crowded dome of the night sky that the Reverend Evans finds.
To understand what a feat this is, imagine a standard dining-room table covered in a black tablecloth, and someone throwing a handful of salt across it. The scattered grains can be thought of as a galaxy. Now imagine fifteen hundred more tables like the first one—enough to make a single line two miles long—each with a random array of salt across it. Now add one grain of salt to any table and let Bob Evans walk among them. At a glance he will spot it. That grain of salt is the supernova.
The staggering distance between us and the stars means that what we see when we look at the night sky is the stars not as they are now, but as they were dozens or hundreds or even thousands of years ago when the light now reaching us left them. (credit 3.2)
Evans’s is a talent so exceptional that Oliver Sacks, in An Anthropologist on Mars, devotes a passage to him in a chapter on autistic savants—quickly adding that “there is no suggestion that he is autistic.” Evans, who has not met Sacks, laughs at the suggestion that he might be either autistic or a savant, but he is powerless to explain quite where his talent comes from.
“I just seem to have a knack for memorizing star fields,” he told me, with a frankly apologetic look, when I visited him and his wife, Elaine, in their picture-book bungalow on a tranquil edge of the village of Hazelbrook, out where Sydney finally ends and the boundless Australian bush begins. “I’m not particularly good at other things,” he added. “I don’t remember names well.”
“Or where he’s put things,” called Elaine from the kitchen.
He nodded frankly again and grinned, then asked me if I’d like to see his telescope. I had imagined that Evans would have a proper observatory in his back yard—a scaled-down version of a Mount Wilson or Palomar, with a sliding domed roof and a mechanized chair that would be a pleasure to manoeuvre. In fact, he led me not outside but to a crowded storeroom off the kitchen where he keeps his books and papers and where his telescope—a white cylinder that is about the size and shape of a household hot-water tank—rests in a home-made, swivelling plywood mount. When he wishes to observe, he carries them, in two trips, to a small sun-deck off the kitchen. Between the overhang of the roof and the feathery tops of eucalyptus trees growing up from the slope below, he has only a letterbox view of the sky, but he says it is more than good enough for his purposes. And there, when the skies are clear and the Moon is not too bright, he finds his supernovae.
The term supernova was coined in the 1930s by a memorably odd astrophysicist named Fritz Zwicky. Born in Bulgaria and raised in Switzerland, Zwicky came to the California Institute of Technology in the 1920s and there at once distinguished himself by his abrasive personality and erratic talents. He didn’t seem to be outstandingly bright, and many of his colleagues considered him little more than “an irritating buffoon.” A fitness fanatic, he would often drop to the floor of the Caltech dining hall or some other public area and do one-armed push-ups to demonstrate his virility to anyone who seemed inclined to doubt it. He was notoriously aggressive, his manner eventually becoming so intimidating that his closest collaborator, a gentle man named Walter Baade, refused to be left alone with him. Among other things, Zwicky accused Baade, who was German, of being a Nazi, which he was not. On at least one occasion Zwicky threatened to kill Baade, who worked up the hill at the Mount Wilson Observatory, if he saw him on the Caltech campus.
If a nearby star were to explode in a supernova, such as the one photographed here in 1987, the passing blast could easily wipe out life on Earth. Fortunately, supernovae are fairly rare and—so far—safely distant. (credit 3.3)
But Zwicky was also capable of insights of the most startling brilliance. In the early 1930s he turned his attention to a question that had long troubled astronomers: the appearance in the sky of occasional unexplained points of light, new stars. Improbably, he wondered if the neutron—the subatomic particle that had just been discovered in England by James Chadwick, and was thus both novel and rather fashionable—might be at the heart of things. It occurred to him that if a star collapsed to the sort of densities found in the core of atoms, the result would be an unimaginably compacted core. Atoms would literally be crushed together, their electrons forced into the nucleus, forming neutrons. You would have a neutron star. Imagine a million really weighty cannonballs squeezed down to the size of a marble and—well, you’re still not even close. The core of a neutron star is so dense that a single spoonful of matter from it would weigh 90 billion kilograms. A spoonful! But there was more. Zwicky realized that after the collapse of such a star there would be a huge amount of energy left over—enough to make the biggest bang in the universe. He called these resultant explosions supernovae. They would be—they are—the biggest events in creation.
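A rough back-of-envelope check (not in the original text, and assuming a teaspoon of about five millilitres) shows that the 90-billion-kilogram spoonful implies a density squarely in neutron-star territory:

# Rough check of the spoonful figure; the ~5 ml teaspoon is an assumption, not from the text.
spoon_volume_m3 = 5e-6          # five millilitres expressed in cubic metres
spoon_mass_kg = 90e9            # the text's "90 billion kilograms"

implied_density = spoon_mass_kg / spoon_volume_m3
print(f"Implied density: {implied_density:.1e} kg per cubic metre")   # about 1.8e16 kg/m^3
# Neutron-star matter spans very roughly 1e9 kg/m^3 in the outer crust up to about
# 1e18 kg/m^3 in the core, so the figure sits comfortably within that range.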
On 15 January 1934 the journal Physical Review published a very concise abstract of a presentation that Zwicky and Baade had given the previous month at Stanford University. Despite its extreme brevity—one paragraph of twenty-four lines—the abstract contained an enormous amount of new science: it provided the first reference to supernovae and to neutron stars; convincingly explained their method of formation; correctly calculated the scale of their explosiveness; and, as a kind of concluding bonus, connected supernova explosions to the production of a mysterious new phenomenon called cosmic rays, which had recently been found swarming through the universe. These ideas were revolutionary, to say the least. The existence of neutron stars wouldn’t be confirmed for thirty-four years. The cosmic rays notion, though considered plausible, hasn’t been verified yet. Altogether, the abstract was, in the words of Caltech astrophysicist Kip S. Thorne, “one of the most prescient documents in the history of physics and astronomy.”
The brilliant but volatile astrophysicist Fritz Zwicky, whose paper on supernovae revealed revolutionary scientific ideas, yet whose later insights, including his case for dark matter, would for decades be dismissed by colleagues who considered him an “irritating buffoon.” (credit 3.4)
Interestingly, Zwicky had almost no understanding of why any of this would happen. According to Thorne, “he did not understand the laws of physics well enough to be able to substantiate his ideas.” Zwicky’s talent was for big ideas. Others—Baade mostly—were left to do the mathematical sweeping up.
Zwicky was also the first to recognize that there wasn’t nearly enough visible mass in the universe to hold galaxies together, and that there must be some other gravitational influence—what we now call dark matter. One thing he failed to see was that if a neutron star shrank enough it would become so dense that even light couldn’t escape its immense gravitational pull. You would have a black hole. Unfortunately, Zwicky was held in such disdain by most of his colleagues that his ideas attracted almost no notice. When, five years later, the great Robert Oppenheimer turned his attention to neutron stars in a landmark paper, he made not a single reference to any of Zwicky’s work, even though Zwicky had been working for years on the same problem in an office just down the corridor. Zwicky’s deductions concerning dark matter wouldn’t attract serious attention for nearly four decades. We can only assume that he did a lot of push-ups in this period.
Surprisingly little of the universe is visible to us when we incline our heads to the sky. Only about six thousand stars are visible to the naked eye from Earth, and only about two thousand can be seen from any one spot. With binoculars the number of stars you can see from a single location rises to about fifty thousand, and with a small 2-inch telescope it leaps to three hundred thousand. With a 16-inch telescope, such as Evans uses, you begin to count not in stars but in galaxies. From his deck, Evans supposes he can see between fifty thousand and one hundred thousand galaxies, each containing tens of billions of stars. These are of course respectable numbers, but even with so much to take in, supernovae are extremely rare. A star can burn for billions of years, but it dies just once and quickly, and only a few dying stars explode. Most expire quietly, like a camp fire at dawn. In a typical galaxy, consisting of a hundred billion stars, a supernova will occur on average once every two or three hundred years. Looking for a supernova, therefore, was a little like standing on the observation platform of the Empire State Building with a telescope and searching windows around Manhattan in the hope of finding, let us say, someone lighting a twenty-first birthday cake.
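To put the "once every two or three hundred years" figure another way (a quick calculation, not in the original text), the chance that any particular galaxy produces a supernova in a given year is well under one per cent:

# Rarity of a supernova in any one galaxy, using the text's figure of roughly
# one per galaxy every two to three hundred years.
years_between_supernovae = 250                     # mid-range of "two or three hundred years"
chance_per_year = 1 / years_between_supernovae
print(f"Chance a given galaxy hosts a supernova in a given year: {chance_per_year:.2%}")
# About 0.4 per cent: almost every "window" you check shows nothing at all.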
So when a hopeful and softly spoken minister got in touch to ask if they had any usable field charts for hunting supernovae, the astronomical community thought he was out of his mind. At the time Evans had a 10-inch telescope—a very respectable size for amateur star-gazing, but hardly the sort of thing with which to do serious cosmology—and he was proposing to find one of the universe’s rarer phenomena. In the whole of astronomical history before Evans started looking in 1980, fewer than sixty supernovae had been found. (At the time I visited him, in August 2001, he had just recorded his thirty-fourth visual discovery; a thirty-fifth followed three months later, and a thirty-sixth in early 2003.)
The Reverend Robert Evans with the sixteen-inch telescope he uses to spot supernovae from the sun-deck of his home in New South Wales, Australia. The world’s most successful individual hunter of supernovae, Evans has recorded three dozen sightings. (credit 3.5)
Evans, however, had certain advantages. Most observers, like most people generally, are in the northern hemisphere, so he had a lot of sky largely to himself, especially at first. He also had speed and his uncanny memory. Large telescopes are cumbersome things, and much of their operational time is consumed in being manoeuvred into position. Evans could swing his little 16-inch telescope around like a tail-gunner in a dogfight, spending no more than a couple of seconds on any particular point in the sky. In consequence, he could observe perhaps four hundred galaxies in an evening while a large professional telescope would be lucky to do fifty or sixty.
(credit 3.6)
Looking for supernovae is mostly a matter of not finding them. From 1980 to 1996 he averaged two discoveries a year—not a huge payoff for hundreds of nights of peering and peering. Once he found three in fifteen days, but another time he went three years without finding any at all.
“There is actually a certain value in not finding anything,” he said. “It helps cosmologists to work out the rate at which galaxies are evolving. It’s one of those rare areas where the absence of evidence is evidence.”
On a table beside the telescope were stacks of photos and papers relevant to his pursuits, and he showed me some of them now. If you have ever looked through popular astronomical publications, and at some time you must have, you will know that they are generally full of richly luminous colour photos of distant nebulae and the like—fairy-lit clouds of celestial light of the most delicate and moving splendour. Evans’s working images are nothing like that. They are just blurry black-and-white photos with little points of haloed brightness. One he showed me depicted a swarm of stars in which lurked a trifling flare that I had to put close to my face to discern. This, Evans told me, was a star in a constellation called Fornax from a galaxy known to astronomy as NGC1365. (NGC stands for New General Catalogue, where these things are recorded. Once it was a heavy book on someone’s desk in Dublin; today, needless to say, it’s a database.) For sixty million years, the light from this star’s spectacular demise travelled unceasingly through space until one night in August 2001 it arrived at Earth in the form of a puff of radiance, the tiniest brightening, in the night sky. It was, of course, Robert Evans on his eucalypt-scented hillside who spotted it.
“There’s something satisfying, I think,” Evans said, “about the idea of light travelling for millions of years through space and just at the right moment as it reaches Earth someone looks at the right bit of sky and sees it. It just seems right that an event of that magnitude should be witnessed.”
Supernovae do much more than simply impart a sense of wonder. They come in several types (one of them discovered by Evans), and of these, one in particular, known as the Ia supernova, is important to astronomy because these supernovae always explode in the same way, with the same critical mass. For this reason they can be used as “standard candles”—benchmarks by which to measure the brightness (and hence relative distance) of other stars, and thus to measure the expansion rate of the universe.
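The text does not spell out the arithmetic, but the standard way astronomers turn a standard candle into a distance is the inverse-square law in its magnitude form: if every Type Ia peaks at roughly the same intrinsic brightness, how faint it appears tells you how far away it is. A minimal sketch, assuming the commonly quoted peak absolute magnitude of about -19.3 (an assumed standard value, not a figure from the text):

# Distance from a standard candle via the distance modulus: m - M = 5*log10(d / 10 parsecs).
M_PEAK = -19.3        # assumed peak absolute magnitude of a Type Ia supernova

def distance_in_light_years(apparent_magnitude):
    """Turn an observed peak magnitude into a distance, assuming a standard candle."""
    distance_parsecs = 10 ** ((apparent_magnitude - M_PEAK + 5) / 5)
    return distance_parsecs * 3.26            # one parsec is about 3.26 light years

# Example: a Type Ia supernova that peaks at apparent magnitude 14 lies roughly
# 150 million light years away (ignoring dust and other corrections).
print(f"{distance_in_light_years(14.0):.2e} light years")

The dimmer such a supernova appears, the farther away its host galaxy must be, which is what lets these explosions serve as milestones for measuring the expansion of the universe.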
In 1987 Saul Perlmutter at the Lawrence Berkeley Laboratory in California, needing more Ia supernovae than visual sightings were providing, set out to find a more systematic method of searching for them. Perlmutter devised a nifty system using sophisticated computers and charge-coupled devices—in essence, really good digital cameras. It automated supernova hunting. Telescopes could now take thousands of pictures and let a computer detect the tell-tale bright spots that marked a supernova explosion. In five years, with the new technique, Perlmutter and his colleagues at Berkeley found forty-two supernovae. Now even amateurs are finding supernovae with charge-coupled devices. “With CCDs you can aim a telescope at the sky and go watch television,” Evans said with a touch of dismay. “It took all the romance out of it.”
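The text does not describe Perlmutter's actual software, but the core idea is easy to sketch: subtract an earlier reference image of a galaxy from a new one and flag any point that has brightened. The toy version below, with made-up images and a made-up threshold, is only an illustration of the principle, not the real pipeline:

import numpy as np

def find_new_bright_spots(reference, new_image, threshold=5.0):
    """Flag pixels that are much brighter in the new image than in the reference.

    A toy version of difference imaging: real pipelines first align the images,
    match their blurring and calibration, and reject cosmic-ray hits.
    """
    difference = new_image.astype(float) - reference.astype(float)
    noise = difference.std()
    bright = difference > threshold * noise        # pixels standing well above the noise
    rows, cols = np.nonzero(bright)
    return list(zip(rows.tolist(), cols.tolist()))

# Tiny demonstration with synthetic 100 x 100 "images" of the same patch of sky.
rng = np.random.default_rng(0)
reference = rng.normal(100, 5, size=(100, 100))
new_image = reference + rng.normal(0, 5, size=(100, 100))
new_image[42, 17] += 500                           # one pixel suddenly much brighter: the "supernova"
print(find_new_bright_spots(reference, new_image)) # reports the injected spot at (42, 17)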
I asked him if he was tempted to adopt the new technology. “Oh, no,” he said, “I enjoy my way too much. Besides”—he gave a nod at the photo of his latest supernova and smiled—“I can still beat them sometimes.”
The question that naturally occurs is: what would it be like if a star exploded nearby? Our nearest stellar neighbour, as we have seen, is Alpha Centauri, 4.3 light years away. I had imagined that if there were an explosion there we would have 4.3 years to watch the light of this magnificent event spreading across the sky, as if tipped from a giant can. What would it be like if we had four years and four months to watch an inescapable doom advancing towards us, knowing that when it finally arrived it would blow the skin right off our bones? Would people still go to work? Would farmers plant crops? Would anyone deliver them to the shops?
North American rock art, dating from about a thousand years ago, is thought probably to record one of the great astronomical events of historical times—the supernova explosion that created the Crab Nebula in July 1054. It was one of only a handful of supernova events near enough to Earth to be seen without a telescope. (credit 3.7)
Weeks later, back in the town in New Hampshire where I then lived, I put these questions to John Thorstensen, an astronomer at Dartmouth College. “Oh no,” he said, laughing. “The news of such an event travels out at the speed of light, but so does the destructiveness, so you’d learn about it and die from it in the same instant. But don’t worry, because it’s not going to happen.”
For the blast of a supernova explosion to kill you, he explained, you would have to be “ridiculously close”—probably within ten light years or so. “The danger would be various types of radiation—cosmic rays and so on.” These would produce fabulous auroras, shimmering curtains of spooky light that would fill the whole sky. This would not be a good thing. Anything potent enough to put on such a show could well blow away the magnetosphere, the magnetic zone high above the Earth that normally deflects charged particles, and strip the ozone layer that screens us from ultraviolet light. Without those shields anyone unfortunate enough to step into sunlight would pretty quickly take on the appearance of, let us say, an overcooked pizza.
The gaseous swirl of the Crab Nebula, all that is left of a giant star, about ten times the mass of our own Sun, that blew apart in 1054. Though 6,500 light years from Earth, the explosion was clearly visible in daylight for over three weeks and at night for almost two years. It was dubbed the Crab Nebula in the nineteenth century because of the shape it appeared to present when viewed through early telescopes. (credit 3.8)
The reason we can be reasonably confident that such an event won’t happen in our corner of the galaxy, Thorstensen said, is that it takes a particular kind of star to make a supernova in the first place. A candidate star must be ten to twenty times as massive as our own Sun, and “we don’t have anything of the requisite size that’s that close. The universe is a mercifully big place.” The nearest likely candidate, he added, is Betelgeuse, whose various sputterings have for years suggested that something interestingly unstable is going on there. But Betelgeuse is five hundred light years away—a safe distance.
Only half a dozen times in recorded history have supernovae been close enough to be visible to the naked eye. One was a blast in 1054 that created the Crab Nebula. Another, in 1604, made a star bright enough to be seen during the day for over three weeks. The most recent was in 1987, when a supernova flared in a zone of the cosmos known as the Large Magellanic Cloud, but that was only barely visible and only in the southern hemisphere—and it was a comfortably safe 169,000 light years away.
Supernovae are significant to us in one other decidedly central way. Without them we wouldn’t be here. You will recall the cosmological conundrum with which we ended the first chapter—that the Big Bang created lots of light gases but no heavy elements. Those came later, but for a very long time nobody could figure out how they came later. The problem was that you needed something really hot—hotter even than the middle of the hottest stars—to forge carbon and iron and the other elements without which we would be distressingly immaterial. Supernovae provided the explanation, and it was an English cosmologist almost as singular in manner as Fritz Zwicky who worked it out.
He was a Yorkshireman named Fred Hoyle. Hoyle, who died in 2001, was described in an obituary in Nature as a “cosmologist and controversialist,” and both of those he most certainly was. He was, according to Nature’s obituary, “embroiled in controversy for most of his life” and “put his name to much rubbish.” He claimed, for instance, and without evidence, that the Natural History Museum’s treasured fossil of an archaeopteryx was a forgery along the lines of the Piltdown hoax, causing much exasperation to the museum’s palaeontologists, who had to spend days fielding phone calls from journalists all over the world. He also believed that the Earth was seeded from space not only by life but also by many of its diseases, such as influenza and bubonic plague, and suggested at one point that humans evolved projecting noses with the nostrils underneath as a way of keeping cosmic pathogens from falling into them.
A view of the night sky and the constellation Orion, recognizable by the row of three stars of roughly equal brightness at centre-right, which collectively are known as Orion’s belt. Directly above the belt is Betelgeuse, a red supergiant with an actual luminosity 13,000 times that of our own Sun. Such massive stars are comparatively short-lived and Betelgeuse is destined to become a supernova one day, but at a distance of over 500 light years it is probably no threat to life on Earth. (credit 3.9)
It was he who coined the term Big Bang, in a moment of facetiousness, for a radio broadcast in 1949. He pointed out that nothing in our understanding of physics could account for why everything, gathered to a point, would suddenly and dramatically begin to expand. Hoyle favoured a steady-state theory in which the universe was constantly expanding and continually creating new matter as it went. Hoyle also realized that if stars imploded they would liberate huge amounts of heat—temperatures of 100 million degrees or more, enough to begin to generate the heavier elements in a process known as nucleosynthesis. In 1957, working with others, Hoyle showed how the heavier elements were formed in supernova explosions. For this work, W. A. Fowler, one of his collaborators, received a Nobel Prize. Hoyle, shamefully, did not.
The English cosmologist Fred Hoyle, who coined the term “Big Bang” and showed how supernova explosions could have generated the necessary heat to create the heavy elements that led to the formation of rocky planets and, eventually, us. (credit 3.10)
According to Hoyle’s theory, an exploding star would generate enough heat to create all the new elements and spray them into the cosmos where they would form gaseous clouds—the interstellar medium, as it is known—that could eventually coalesce into new solar systems. With the new theories it became possible at last to construct plausible scenarios for how we got here. What we now think we know is this:
About 4.6 billion years ago, a great swirl of gas and dust some 24 billion kilometres across accumulated in space where we are now and began to aggregate. Virtually all of it—99.9 per cent of the mass of the solar system—went to make the Sun. Out of the floating material that was left over, two microscopic grains floated close enough together to be joined by electrostatic forces. This was the moment of conception for our planet. All over the inchoate solar system, the same was happening. Colliding dust grains formed larger and larger clumps. Eventually the clumps grew large enough to be called planetesimals. As these endlessly bumped and collided, they fractured or split or recombined in endless random permutations, but in every encounter there was a winner, and some of the winners grew big enough to dominate the orbit around which they travelled.
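The "99.9 per cent" figure is easy to check against the standard textbook masses of the Sun and planets (the values below, in kilograms, are standard figures, not from the text):

# How much of the solar system's mass is in the Sun, using standard values in kilograms.
sun = 1.989e30
planets = {
    "Mercury": 3.30e23, "Venus": 4.87e24, "Earth": 5.97e24, "Mars": 6.42e23,
    "Jupiter": 1.898e27, "Saturn": 5.68e26, "Uranus": 8.68e25, "Neptune": 1.02e26,
}
total = sun + sum(planets.values())
print(f"Sun's share of the solar system's mass: {sun / total:.2%}")   # about 99.87%
# Moons, asteroids and comets add only a sliver more, so the text's rounded
# figure of 99.9 per cent is about right.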
It all happened remarkably quickly. To grow from a tiny cluster of grains to a baby planet some hundreds of kilometres across is thought to have taken only a few tens of thousands of years. In just 200 million years, possibly less, the Earth was essentially formed, though still molten and subject to constant bombardment from all the debris that remained floating about.
At this point, about 4.4 billion years ago, an object the size of Mars crashed into the Earth, blowing out enough material to form a companion sphere, the Moon. Within weeks, it is thought, the flung material had reassembled itself into a single clump, and within a year it had formed into the spherical rock that companions us yet. Most of the lunar material, it is thought, came from the Earth’s crust, not its core, which is why the Moon has so little iron while we have a lot. The theory, incidentally, is almost always presented as a recent one, but in fact it was first proposed in the 1940s by Reginald Daly of Harvard. The only recent thing about it is people paying any attention to it.
When the Earth was only about a third of its eventual size, it was probably already beginning to form an atmosphere, mostly of carbon dioxide, nitrogen, methane and sulphur. Hardly the sort of stuff that we would associate with life, and yet from this noxious stew life formed. Carbon dioxide is a powerful greenhouse gas. This was a good thing, because the Sun was significantly dimmer back then. Had we not had the benefit of a greenhouse effect, the Earth might well have frozen over permanently, and life might never have got a toehold. But somehow life did.
For the next 500 million years the young Earth continued to be pelted relentlessly by comets, meteorites and other galactic debris, which brought water to fill the oceans and the components necessary for the successful formation of life. It was a singularly hostile environment, and yet somehow life got going. Some tiny bag of chemicals twitched and became animate. We were on our way.
Four billion years later, people began to wonder how it had all happened. And it is there that our story next takes us.
William Blake’s fanciful and flattering nineteenth-century depiction of Isaac Newton, a man whose rigid scientific theories clashed with many of Blake’s own beliefs. (credit p2.1)