Acknowledgments

Many people have supported me, encouraged me, advised me, and managed not to get fed up with me (I hope!) during the writing of this book. In particular, James has been a joy and a blessing to work with, and he has kept me on the straight and narrow throughout. Danny Ratcliffe’s illustrations are excellent. Martin Steel, Danny Byrne, and Angie Edwards have worked through more drafts than it is worth mentioning; their wise words have been a great help. Tom McLeish inspired me to keep going when there were some early setbacks with the project. My parents, Ian and Lynne, have read every paragraph, maintaining a loving faithfulness that they have shown to me my whole life. My precious daughters, Bethany and Chloe, have been a constant reminder of why the truth matters—may they grow up around it and treasure it. And my wife, Emma, is the perfect teammate through it all—thank you, for everything.

D. Hutchings

I’m grateful to David for taking up this project. Scholars have a tendency to speak only to each other. But David has shown us that we do this at our own peril. As a secondary school physics teacher, he takes the work of scholars and translates it for readers outside of the academy. Working with him on this book has served as a reminder of how important that task really is. The “conflict thesis” remains alive and well not only because historians of science and religion have misunderstood its origins, development, and popularization, but because we have failed to speak to that larger audience. So I’m grateful not only for the energy and enthusiasm of his writing but, more importantly, for the empathy and concern he has shown toward our readers.

J. C. Ungureanu

1

Fooling the World


Kaysing’s Case • The Truth About Lies • The Last of the Polymaths • Conflict • A Brave Man, Wronged • Warfare • The Ultimate Double Act • Making Myths • A Legacy of Lies • Harmful in Itself • A Twist in the Tale • Of Popes and Unicorns

Kaysing’s Case

Bill Kaysing, like 500 million others, was glued to his TV screen. Unlike them, however, he was not being transported to a state of otherworldly wonder by what he was seeing. Instead, he was positively fuming. As far as he was concerned, what was happening right now in front of a billion trusting eyes was nothing other than an outright betrayal of humanity. NASA was taking everyone for fools. Unabashed, on live television worldwide, they were faking the moon landings.

And he wasn’t going to let them get away with it.

***

Seven years later, Kaysing was ready. His grand project—We Never Went to the Moon—was finally finished.1 After copious amounts of research, he had produced what the world really needed: a definitive and watertight account of what had truly transpired in and around July 20, 1969.

Long before that fateful date, he explained, NASA—under pressure from the government, the Russians, and even its own overconfident spin—had realized it was in big, big trouble. Its promise to land astronauts on the lunar surface by the turn of the decade had proved far more difficult to achieve than anyone had originally envisaged. Worried, embarrassed, and trapped in a corner, the big bosses had realized they had no other choice—they were simply going to have to pretend to do it.

It is tempting to assume that We Never Went is just paranoid nonsense, but that is far too simplistic a response. Kaysing was no small-town, establishment-hating, tin hat–wearing nutcase ranting at authority from the outside in—no, he had worked for Rocketdyne, the company that built the engines for NASA’s Saturn V rockets. He had enjoyed regular firsthand access to sensitive documents about the Apollo missions. He knew many of the engineers personally. And, what’s more, he knew how to build a good case.

Firstly, he pointed out, pretty much all of the internal experiments, trials, and simulations had been disastrous: it was not uncommon for the errors to run into the tens of thousands per test. Indeed, he happened to know that the odds of a successful trip had been calculated in-house as just 1 in 60,000—and yet here was NASA, the new darling of the American populace, now claiming to have completed two of them, entirely untroubled, in a row.

Secondly, the photographic evidence on offer was highly suspect. Stars were missing; shadows pointed in the wrong direction; video was unjustifiably grainy; moondust was mysteriously undisturbed. If this was supposed to be proof, it left much to be desired.

Thirdly, Kaysing noted that the astronauts involved with Apollo only ever seemed to end up in one of two states: rich or dead. In his own words, some arrived back on Earth and were suddenly appointed as “executives in large corporations”; others died before their missions in suspicious “accidents.” Could it be that those who played along were rewarded, while those who refused were unceremoniously bumped off?

His arguments went on: diplomatically, the United States needed to overtake the Soviets in the space race; the previously unclassified Apollo documentation had been suddenly rendered “unavailable to the public”; NASA had covered up multiple rocketeering disasters in the past; the moon footage bore more than a passing resemblance to 2001: A Space Odyssey, which had been filmed during the same period.

Busily quoting panicking engineers, skeptical statisticians, and wary historians, Kaysing painted a picture of failure, desperation, and conspiracy which was hard to ignore. Others soon joined him, citing evidence of their own. Some physicists believed the high levels of space radiation would have finished the crew off long before they had completed their mission. Innovative sleuths unearthed photos of the astronauts—supposedly on the moon at the time—not even wearing their spacesuits. The doubts grew. Would NASA eventually cave? Was it time to admit the game was up?

The rumors continued to rumble and, in 2001, Fox TV decided to press NASA hard on the issue. The documentary Conspiracy Theory: Did We Land on the Moon? featured interviews with NASA spokespeople, investigative journalists, astronauts, scientists, and even Kaysing himself. Its conclusion? The question, it decided, remained genuinely open.2

The Truth About Lies

Fox’s show and its non-verdict are now two decades old—so what has happened since? Has the explosion of internet use, better home technology, more public access to government files, and the overwhelming force of social media led to the exposure of a corrupt space administration? Should Neil Armstrong go down in history as a gifted actor rather than a swashbuckling pioneer? Had Kaysing spotted the truth before anyone else managed to?

Well, the simple fact is this: the Apollo 11 mission really did succeed in putting men on the moon. Kaysing was wrong. Since his death in 2005, new and far clearer photographs have been taken of the landing area. Sure enough, the telltale signs are there: astronaut footpaths, abandoned vehicles, assorted debris—and a quick online search can confirm it.3

None of this, by the way, would come as much of a surprise to David Grimes. During a period of research at Oxford University, the imaginative physicist applied his mathematical wherewithal to the analysis of multiple historical conspiracies. His exhaustive study confirmed what we might already suspect: the higher the number of people asked to keep a secret, the smaller the chance of it actually being kept. Putting it more bluntly, large-scale attempted cover-ups don’t work.

The implications for Kaysing’s theory are not great: NASA employed no fewer than 400,000 staff during the Apollo program. Grimes’s model, applied on this scale, tells us that the likelihood of keeping them all quiet for any longer than five years isn’t just low—it is zero.4

Interestingly, formal photographs and expert research haven’t proved enough to convince everyone—an estimated 20 million Americans still believe that NASA really did make the whole thing up.5 The new images, they maintain, are also fake. And why, they might ask, should we believe an Oxford-based finding? After all, the researchers are probably in on it anyway.

Before we despair too much, though, we should remember that most people are not “hoaxers”—there are, thankfully, more than 300 million US citizens who readily accept the reality of NASA’s feat. Most people, even in this high-profile example, have not been duped. Even when one has a semi-convincing and famous case, it seems, the majority of the public won’t buy it. The man and woman on the street are a lot harder to fool than many would have us believe.

Of course, if tricking average Janes and Joes is tough, then getting one over on a group of well-informed professionals is nigh-on impossible. No scientist of any repute, for example, thinks the moon landings are fake (Kaysing, as it happens, was a jobbing technical writer at Rocketdyne, and had no scientific qualifications). Similarly, there are no ranking engineers who think the World Trade Center attacks were an inside job; there are no elite historians who think the pyramids were built by aliens.

In short, the people who are paid to know their stuff usually do—or, at least, they know it well enough not to be fooled by conspiracy theorists. And, sooner or later, their expert views trickle down to the public—which kills the story (for the vast majority, anyway) stone cold dead.

So here’s the thing: it is a very difficult thing to con the world. Getting even a handful of people to believe in a grand, all-encompassing untruth is hard. Winning over greater numbers than that is extremely challenging. Deceiving just one expert, even for a short while, is highly improbable. The idea, then, that someone might get a significant proportion of the academic community to believe and then promote a lie for a generation or more is utterly beyond the pale.

And yet—staggeringly, shockingly, astonishingly—it has already been done. Nearly 150 years ago, two men of whom pretty much no one has now heard set out to convince both the general and the highly educated public of a falsehood—and pulled it off. They have fooled the minority, and the majority. They have fooled a lot of the experts. Their alternative version of history—one which is quite easy to show is untrue—remains the most common view to this day. Somehow, against all of the odds, they have successfully fooled the world.

So, who were they? What did they say? And how, exactly, did they get away with it?

The Last of the Polymaths

Occasionally, as the years roll ever onward, certain types of people simply vanish. Blown aside by the winds of change, they are suddenly no longer needed—their roles become obsolete, and they are quietly transformed into distant memories of a past age. Think highwaymen, milkmaids, alchemists, or knights in shining armor—once commonplace, they have been swallowed up by time. Each has gone the same way, lost to all but stories.

In our own day, the phenomenon of subject specialism has added yet another victim to this list: the polymath, or master of all trades. In the university of today, the expectation—or even the idea—that someone might make a major contribution in more than one discipline is simply not there anymore. Polymaths are dead and gone.

When modern scientists win a Nobel Prize, for example, their achievement is usually so focused and intricate that even highly informed correspondents struggle to explain it. In a climate like this—one where advances tend to come in minute technical detail, and are understood by so few—it is easy to see why we might have run out of polymaths. After all, who could possibly have the time or ability to excel in more than one field? Wide-ranging, multifaceted thinking has been consigned, it would seem, to the past.

There is something remarkably sad about this. Modernity is missing out by constraining genius with such tight one-subject-only cords. We will have no Zhang Heng; we will have no Avicenna. Zhang, in the second century AD, wrote entirely new forms of poetry, compared histories of the world to find errors, calculated pi, and invented a machine which could measure both the size and location of earthquakes. Avicenna opened up the second millennium by writing proofs of God’s existence, devising thought experiments about consciousness, showing that light must have a finite speed, perfecting steam distillation, and arguing—against Aristotle, no less—that the stars produced their own light.

Specialism has weeded out the polymath from the gardens of today—so, to find the last of them (or, at least, to find them in any significant numbers), we must go back in time. Let us head, then, to AD 1811—and to a small town in the north of England.

John William Draper was born in St. Helens that year. The son of an itinerant Wesleyan minister, he spent his youth being dragged from church to church as his father’s postings changed. Worried about the unsettling nature of all this, the clerical body asked for a school to be founded in the north, and eventually Woodhouse Grove was established. Young Draper was promptly packed off there aged 11, and lasted a few years before coming home again. He then found his way to university in London, where he studied chemistry. It was at that point that everything changed for him: his father died.

This shook the whole family, and resulted in a dramatic move: Draper, along with his mother and sister, uprooted themselves and headed to Virginia in the United States of America. It was to be here, in the land of promise, that Draper would truly find himself. A man of inquisitive mind and determined character, he threw himself at the fresh opportunities that came his way, always with great success. By the time of his death in 1882, he would be considered by many as worthy of standing alongside the likes of Zhang and Avicenna—for he had become a genuine polymath.

Draper began his path to legendary status with a degree in medicine from the University of Pennsylvania. His work there on the movement of gases and liquids through membranes proved so impressive that he was awarded a professorship in Virginia (1836) and, swiftly, another in New York in 1839.

The very next year, he made technological history: he took the first ever clear photographs of a human face, and then of the moon. Changing tack once more, he went on to demonstrate that metals of different types would all begin to glow at the same temperature (the Draper point); he matched flame colors to the substance being burned; he unearthed a fundamental link between a radiating material’s multi-colored “spectral lines” and its composition.

He didn’t stop there, either. Soon, Draper discovered that certain combinations of objects could be “electrified” if pulled apart, and used this finding to improve the design of batteries. He helped with the development of the telegraph, making it economically much more viable. He made huge steps in the way we think about physiology by attributing the processes in our body to the laws of chemistry and physics, and rejecting the popular (and vague) idea of “living force.”

Such was Draper’s brilliance that even the vast and burgeoning realm of science was not enough to contain him. Branching out yet again, he joyfully immersed himself in a hotchpotch of history, philosophy, politics, sociology, and religion. Re-energized, he began to write down some of his own thoughts on all these—and, to his great pleasure, his writings proved popular. Witty, insightful, and ridiculously wide in scope, his prose had a rhythm and drive that raised it above the usual esoteric ramblings of other enthusiastic academics—he was quite the hit.

In 1863, he published A History of the Intellectual Development of Europe. Like a master storyteller, Draper drew together the tales of myriad countries, centuries, governments, wars, cultures, religions, and scientific theories to form a single overarching narrative of relentless, positive, human progress—progress which, he believed, was driven by some deeply mysterious and all-encompassing natural law.

It was remarkably well received. The Westminster Review, a cutting-edge and radical publication, called it “a noble and even magnificent attempt to frame the induction from all the recorded phenomena of European, Asiatic, and North African History”6 and went on to say that “Dr Draper soars to a height of eloquence not commonly met with in, yet by no means impairing the cogency of, a strictly philosophic treatise.”7

It sold well, too. This encouraged Draper to keep writing, and soon came his History of the American Civil War, which also became a classic.

By the 1870s, then, Draper’s reputation was sky-high. It seemed he could do no wrong—everything he touched turned into intellectual gold. Not only was he considered one of the greatest living scientists in a legendary era of science, but he had become an acclaimed historian, philosopher, and bestselling author to boot. Whenever Draper spoke—no matter the topic—the world, in awe of him, listened.

And, before the end of the decade, he would have much of this listening world hopelessly fooled.

Conflict

Buoyed by his various successes, Draper now turned his mind to an entirely new topic. What, he wondered, was the real relationship between religious faith and scientific knowledge? Unafraid of controversy, and believing himself to be best placed to comment, Draper decided to write what he hoped would soon be the go-to text on the matter. The title of his study betrayed its conclusion: A History of the Conflict Between Religion and Science.

Essentially, Draper’s 1874 manifesto ended up describing a war. In its opening salvo, he sets the tone for the rest of the work:

The antagonism we thus witness between Religion and Science is the continuation of a struggle that commenced when Christianity began to attain political power.

And, rather than remaining a neutral observer, Draper appears, at least, to favor one side of this struggle over the other:

The history of Science is not a mere record of isolated discoveries; it is a narrative of the conflict of two contending powers, the expansive force of the human intellect on one side, and the compression arising from traditionary faith and human interests on the other.

Science, here, is coming across in a much better light than “traditionary faith.” That trend continues:

As to Science . . . she has never subjected any one to mental torment, physical torture, least of all to death, for the purpose of upholding or promoting her ideas. She presents herself unstained by cruelties and crimes. But in the Vatican—we have only to recall the Inquisition—the hands that are now raised in appeals to the Most Merciful are crimsoned. They have been steeped in blood!

Conflict, then, is uncompromising in its critique of organized religion, and gushing in its praise of freethinking science. Bishops are its baddies; geologists its goodies. Interestingly, and despite his seemingly firm decision to opt for one team over the other, Draper maintains that he has always “endeavoured to stand aloof, and relate with impartiality their actions.”

The chief finding of his “impartiality” is this: religion will always fight against and try to hold back science, to the great loss of everyone everywhere. Draper—the polymath son of a minister—sums up his case with a startlingly stark ultimatum:

Then has it in truth come to this, that Roman Christianity and Science are recognized by their respective adherents as being absolutely incompatible; they cannot exist together; one must yield to the other; mankind must make its choice—it cannot have both.8

Draper, then, had produced a decisive account for the masses. Conflict spanned more than two millennia of people, and ideas, and events. It was a wise and learned tome, one eminently worthy of ending any debate once and for all. In his own words, “No one has hitherto treated the subject from this point of view.” And that, he felt, was that.

Yet despite his popularity, and experience, and breadth of knowledge, and clout, Draper’s final conclusion was still a controversial one. Perhaps, on its own, Conflict might not have been quite enough to settle the matter.

The thing is, though, it wasn’t on its own—not for long.

A Brave Man, Wronged

Andrew Dickson White (1832–1918) was born in New York with a family background in both money and education. Initially, his parents expected him to go into the ministry, but he found the first steps on that journey both boring and unpleasant, and ran away from his clerical schooling. Eventually, his father relented and let him study history and English literature at Yale instead.

Once there, White quickly showed his brilliance. He won a number of prizes for his essays and public speaking, including one valued at $100—the highest award available at any university in the world at the time. After graduating, he traveled around Europe for three years and, upon his return, took up a professorship in history and English literature in Michigan.

Lecturing away to his heart’s content, White was free to cover an incredible array of subjects over the next few years. He taught on the Roman Empire; the rise of cities; the Crusades; the growth of papal power; medieval Christianity; Islam; parliament in England and France; the revival of learning and art; Luther; the Jesuits; the Thirty Years War; Louis XIV, XV, and XVI; the French Revolution, and even the history of philosophy.

During his European excursions and his subsequent time in Michigan, a dream had begun to well up in White’s heart. With each day that passed he longed, ever more deeply, for his very own university. His, he had decided, would be an institution free from any kind of dogmatic oversight; a setting in which students could be genuine and unshackled freethinkers. White believed that, in an environment like this, their uncontrolled minds would settle happily upon such virtues as goodness, purity, selflessness, and peace. This vision moved him profoundly—but how, he wondered, could it ever actually come about?

Then, in the 1860s, White’s rich lecturing career was brutally interrupted by the American Civil War. He was promptly nominated for and elected to the New York State Senate (which was a bit of a surprise to him), forcing him to resign from his beloved teaching and research. It was not all bad news, though—for, in this new and unwanted role, White finally stumbled across the chance to fulfill his personal destiny.

Hope had appeared in the form of White’s new fellow senator, Ezra Cornell (1807–1874). Cornell was seriously rich—but, as a committed Quaker, he was not all that interested in hoarding cash, or buying luxuries, or collecting trinkets. Instead, he wanted to do something worthwhile. White, of course, was ready with a suggestion: together, the two of them should found an innovative, modern university, one which he promised his potential partner would help contribute to humanity’s next big step forward.

Before too much longer, Cornell University became a serious proposal: a bold establishment where religion and dogma would not hold back study and where students would be wholly unfettered. White, the dreamer, was hugely excited. What happened next, therefore, not only shocked him—no, it came close to breaking his heart.

White, for what it’s worth, had been extremely careful when explaining Cornell’s guiding principles to its trustees in 1866—he clearly felt that he was guiding the world toward the light:

We have under our charter no right to favour any sect or promote any creed. No one can be accepted or rejected as a trustee, professor or student, because of any opinions or theories which he may or may not hold. . . . Development under this principle—moral, intellectual and physical—can only be normal and healthful in an atmosphere of love of truth, beauty and goodness.9

Imagine his surprise and dismay, then, when others began to speak against his project. For, rather than being praised from the rafters, White’s enterprising brainchild faced stern opposition from the off.

Much of the criticism took the same basic form: that Cornell would be irreligious. It would open up young minds to dangerous ideas, he was warned—it would lead them astray from Christian truth. If this new place of “learning” was allowed to flourish, his detractors said, it would merely be the thin end of a wedge. Students would be misled, and alienated from God—with disastrous consequences.

White was horrified. He wrote to Cornell, grieving that “the papers, addresses, and sermons on our unchristian character are venomous.”10 Wobbled, but not unseated, the duo pressed on regardless. As a result, the university did indeed open in New York in 1868—and it took in the biggest entering class of any American school in history. It quickly became a very successful operation, and it still is today. Its current motto is taken from Ezra Cornell’s opening address that very year:

I would found an institution where any person can find instruction in any study.

White’s anger, however, continued to simmer. For him, the vociferous attacks on his plans had confirmed what he had suspected for quite some time—that dogmatic religion was only ever a dreadful thing. Doctrinal inflexibility of the type displayed by the supposedly holy opposers of Cornell was, he eventually concluded, the mortal enemy of “truth, beauty, and goodness.”

And perhaps, he thought, it was time to fight back.

Warfare

White became a man with a plan: he would bring dogmatism down, and he would use his two biggest guns—public speaking and persuasive writing—to do so. It wasn’t long until the first shooting opportunity came along: in 1869, White was invited to give a talk of his choice at Cooper Hall in New York. He accepted—and his resulting speech would forever change our history.

Many in the audience were still reeling from the devastating effects of the American Civil War—families bereft, memories scarred, relationships strained. Despite the likely sensitivities in the room, however, White quite deliberately picked war as his dominant theme—he called his lecture “The Battle-Fields of Science.”

Already close to the bone with his choice of motif, White pressed the knife in still further. In his introduction, he told his hearers that he would be describing hostilities “with battles fiercer, with sieges more persistent, with strategy more vigorous than in any of the comparatively petty warfares of Alexander, or Caesar, or Napoleon.”11

With fire in his belly and Cornell in his head, White began to regale his listeners with story after story of conflict between two violently opposed forces—one heroic, one villainous. The lesson to be learned from history, he said, was a simple one: religion was the enemy of science. He powered on, naming case after case in which dogma had damaged cosmology, or astronomy, or chemistry, or anatomy. It was nothing less than a broadside. And it was nothing more than the beginning.

The very next day, the New York Daily Tribune—whose editor, rather conveniently, was a trustee of Cornell—printed the lecture in full. Tongues were set wagging, both in academia and on the street. White was asked to repeat the lecture again and again, all over the country. He did so, adding new explosive material each time. Eventually, in 1876—less than two years after Draper’s Conflict—White’s expanded and amended lecture was published as a pamphlet, for anyone to read at their leisure.

White, however, was not done. He continued to write on the subject, and had his work serialized, for more than a decade, in the well-known and daringly modern Popular Science Monthly magazine. Finally, in 1896, his vast survey of history, philosophy, physics, theology, biblical criticism, biology, sociology, and more was pulled together into one totemic volume—a lengthy, exhaustive, and ruthless attack on dogmatic religiosity everywhere. He called it A History of the Warfare of Science with Theology in Christendom.

In its introduction, White linked the development of his book directly to his experience with Cornell:

As honored clergymen solemnly warned their flocks first against the “atheism,” then against the “infidelity,” and finally against the “indifferentism” of the university, as devoted pastors endeavored to dissuade young men from matriculation, I took the defensive.

He had initially reacted with “sweet reasonableness,” he said—but that had done “nothing to ward off the attack.” Then, finding himself frustrated, upset, and at a loss as to what to do about it all, he admits that something just suddenly snapped deep within:

Then it was that there was borne in upon me a sense of the real difficulty—the antagonism between the theological and scientific view of the universe and of education in relation to it.12

This book, then, was his solution: White would change the way people thought about science and religion forever. It was time for the truth to be told, and for the world to move forward. Science—and the freethinking associated with it—must be unleashed. Religious dogma, its age-old enemy, must be finally and decisively killed off. Nothing less than outright victory would suffice.

The Ultimate Double Act

White’s titanic Warfare dwarfed Draper’s popular-length Conflict. It read differently, too. While Draper’s book was pithy, witty, and even conversational at times, White’s was more like a colossal encyclopedia entry—albeit with the odd outbreak of fury and despair at the sheer ridiculousness of theologians. Each covered roughly the same ground, but White did so in far more detail, and added extra topics on top.

The highly complementary nature of the two works turned out to be remarkably important. Perhaps each book on its own would have been forgotten—but instead, as a double act, they more than reinforced each other. That Conflict and Warfare appeared in the same era, had the same broad scope, and agreed on so much added a great deal of weight to their claims. Even their dissimilarities were beneficial—they appealed to different readers, and so their combined message spread both far and wide.

Consider the following, for example: Draper took aim at the Catholic Church; White held that Protestants were just as bad, if not worse. Draper never revised his book once; White’s was the constantly tweaked accumulation of more than twenty years of essays. Draper wrote as a world-renowned scientist; White was a famously gifted politician-historian. Conflict included not a single footnote, making for a smoother read; Warfare had literally thousands of them, suggesting airtight reliability.

In short, they formed an astonishingly effective pincer movement. When both books were considered together, their case appeared to be fully made. Such was the combined influence of Conflict and Warfare that they genuinely managed to achieve precisely what we earlier stated was impossible. They sat on bestseller lists for decades, sold all across the globe, were translated into all sorts of languages, and became the twin final words on their topic.

Between them, then, John William Draper and Andrew Dickson White did more than launch a small-scale conspiracy theory, or gather a handful of limited but loyal followers for a couple of years. Instead, they fooled the world.

Making Myths

A major point in both texts was this: it is a pitiful and embarrassing thing for grown and intelligent people to believe in myths. The Bible’s stories of miracles, they claimed, were little better than fairy tales—science, if only people would listen to it, could free them from such imaginary nonsense. And yet, in their desperation to dismiss dogma, our two authors managed to fall foul of their own accusations—for, during their two projects, they uncritically accepted, wholeheartedly believed, and then willfully propagated an entire collection of myths of their own.

For example, Conflict and Warfare portrayed a Christianity that killed off Greek and Roman science and plunged Europe into the so-called Dark Ages—a thousand years of fecklessness, stupor, and backward thinking. Christendom, said Draper and White, destroyed ancient libraries; it forbade the study of philosophy; it tormented or tortured or killed anyone remotely scientific lest its hopelessly inaccurate scriptures be exposed.

There was more: the ever-dogmatic Church, they explained, maintained that the Earth was flat (until it was proved wrong by Columbus); it tried to strangle science at the first signs of the Enlightenment; it resolutely rejected any concept of natural law. It hated Nicolaus Copernicus for dethroning the Earth; it decried all claims that the Earth moved; it hurled the heroic Galileo Galilei into prison; it burned Giordano Bruno for his astronomical foresight.

Worse still, institutional Christianity also blamed all illness on demons; it prohibited medical intervention of any kind; it banned dissection; it banned autopsies. It refused anesthetic to women in labor. Interpreters of Holy Writ had unanimously denied heliocentrism, evolution, an old Earth, and a vast universe until their puny sacred arguments could no longer bear the weight that had been placed on them. In short, the Church had stood, with moronic steadfastness, against all scientific endeavor for more than 1500 years—until, as a long-overdue reward for their painful perseverance, the brave and logical scientists finally broke free.

All these grand and saddening assertions have three things in common. Firstly, they form the backbone of the Draper–White narrative. Secondly, they have since become common knowledge, and are repeated in casual conversation, newspaper articles, popular science and history, plays, documentaries, and even academic treatises. Thirdly, we now know that none of them—not a single one—is actually true.

A Legacy of Lies

The series of myths that Draper and White spread about science and religion are known today in the literature as the conflict thesis. Thanks to the dedicated and committed research of a band of specialists operating since the 1980s at least, the conflict thesis has now been thoroughly debunked. One by one, the tales spun out in Conflict and Warfare have been shown to be either entirely false, horribly misunderstood, or deliberately misrepresented.

Ronald L. Numbers, for example, is perhaps the foremost historian of science and religion alive today. Here is his damning assessment of the conflict thesis:

The greatest myth in the history of science and religion holds that they have been in a state of constant conflict. No one bears more responsibility for promoting this notion than two nineteenth-century American polemicists: Andrew Dickson White and John William Draper. . . . Historians of science have known for years that White’s and Draper’s accounts are more propaganda than history. . . . Yet the message has rarely escaped the ivory tower. The secular public knows that organized religion has always opposed scientific progress. . . . The religious public knows that science has taken the leading role in corroding faith.13

This passage is from the introduction to Galileo Goes to Jail—an edited volume, published by Harvard University Press, which consists of essays by twenty-five of the world’s very best thinkers on these issues. There is a clear, evidence-based consensus among this group: the conflict thesis is utter bunk. Yet the frustration, on Numbers’s and his colleagues’ part, is clear from the excerpt: no one, it seems, is actually listening to them.

Over the last century and a half, despite its untruth, the God-or-science mantra has become firmly embedded in our culture. Indeed, Draper’s forceful summation of it—his “cannot have both” formulation—can be found almost everywhere. A 2013 survey of high school students in the United Kingdom, for instance, found that the majority agreed with the statement “the scientific view is that God does not exist.”14

These youngsters are not alone. The New Atheist writers—the likes of Richard Dawkins and Sam Harris—have been rehashing Draper and White, in one way or another, for years. As an avid polemicist, Dawkins is perhaps the duo’s most famous intellectual descendant—indeed, he struggles to get through even a paragraph about science before reminding his readers that “Religious beliefs are dumb and dumber: superdumb.”15

In the meantime, Dan Brown—whose bestselling novels have sold more than 250 million copies—has done even better than Dawkins and made far more than just a name for himself from Draper’s and White’s ideas. In Angels and Demons (think a heroic particle physicist and a morally questionable pope) the legacy of both Conflict and Warfare is palpable:

“Mr. Langdon, all questions were once spiritual. Since the beginning of time spirituality and religion have been called on to fill in the gaps that science did not understand. The rising and setting of the sun was once attributed to Helios and a flaming chariot. Earthquakes and tidal waves were the wrath of Poseidon. Science has now proven those gods to be false idols. Soon all Gods will be proven to be false idols. Science has now provided answers to almost every question man can ask. There are only a few questions left and they are the esoteric ones. Where do we come from? What are we doing here? What is the meaning of life and the universe?”

Langdon was amazed. “And these are questions CERN are trying to answer?”

“Correction. These are questions we are answering.”16

Should we really take Brown or the New Atheists as evidence, though, that Draper and White have got to everyone? After all, Brown is writing make-believe, and the New Atheists are hawking just about anything that opposes religion. They can hardly count as “everyone,” then, can they?

Peter Byrne, on the other hand, is a different case altogether. As an award-winning investigative reporter and a science writer of some repute, Byrne has no need to invent storylines (as per Brown), and has no obvious axe to grind (as per Dawkins). In one of his books, however—an acclaimed biography of the quantum theorist Hugh Everett—we can find the following: “Vatican attacks on a scientific theory usually are a sign that it is intelligent and correct—just ask Copernicus, Galileo, Bruno, Newton, Darwin, and Einstein.”17

Here, then, in a book which is not even interested in their subject matter, the spirit of Conflict and Warfare remains both alive and well. Byrne’s assertion is straight out of the Draper–White tradition—indeed, the pair would probably have been proud of such a line.

It is a good line, too—but it is wrong. Copernicus himself was ordained, and was never opposed by any church.18 Galileo could not prove his heliocentric case scientifically, but half of the Vatican supported him regardless of that fact.19 Bruno got into trouble for his theology, not for his science (which was, as it happens, not really the type of science that Byrne might have imagined, anyway).20

The Catholics liked Newton’s theories, for he explicitly wrote that they came from, and required, God. Darwin’s work was admired by as many believers as despised it—the Catholic Church, incidentally, favors evolution.21 Einstein’s deductions faced no religious opposition at all. Instead, his ideas were embraced by the Catholic priest Georges Lemaître, who then used them to devise the Big Bang model of the universe.

Byrne, then, is mistaken on all counts—so how did he ever come to write such a sentence? He is, quite clearly, not stupid. His text is not the self-published rant of an angry conspiracy theorist. His publisher is Oxford University Press. His title is endorsed, among others, by New Scientist, by the BBC, and by a Nobel Prize–winning physicist. What is going on here?

Well, Byrne’s book is not actually about the conflict thesis. Neither is it about science and religion. Instead, it is a biography of a quantum physicist, and the statement we picked out is in no way central to its theme. If anything, it is an isolated and throwaway line. There is no footnote. The topic is not dwelt on any further. It is, at best, an aside.

Strangely, though, the fact that it is an aside makes it more important, not less. By simply nodding at the conflict thesis like this, Byrne implicitly champions it. He assumes it to be common knowledge; he feels that no further explanation is necessary. There are no two ways about it: he has been duped. Somehow, reaching out across more than a century, Draper and White have got to Byrne—who, we must remember, is an expert in comparison to most—just like they have got to almost everyone else.

Still, does it really matter all that much?

Harmful in Itself

Why should anyone outside of a lecture hall care in the least about the conflict thesis? Can’t this whole debate quietly play out in the libraries of our universities, just like many other idiosyncratic and irrelevant disagreements that secure the steady study of scholars on second-rate salaries?

The answer is no. David Aaronovitch, author of the myth-busting Voodoo Histories, explains why:

There is a more sinister aspect to jovial arguments about whether or not the moon landings actually took place, and to the speculation about why we enjoy such arguments. The belief in conspiracy theories is, I hope to show, harmful in itself. It distorts our view of history and therefore of the present, and—if widespread enough—leads to disastrous decisions.22

Disastrous decisions? Isn’t Aaronovitch being a little over-dramatic here? Well, perhaps we should ask Michael Reiss—an evolutionary biologist who, in 2008, lost his job with the Royal Society after a journalist misunderstood his comments on creationism and the modern Conflict–Warfare gang got hold of him. Sir Harry Kroto, a Nobel laureate, demanded Reiss’s head immediately. Channeling his inner Draper and White, Kroto wrote, “There is no way that an ordained minister—for whom unverified dogma must represent a major, if not the major, pillar in their lives—can present free-thinking, doubt-based scientific philosophy honestly.”23

Or ask Elaine Howard Ecklund, an acclaimed social scientist working at Rice University who, in 2016, surveyed nearly 10,000 scientists around the world on the matter.24 She found that even though more of them described themselves as religious than as atheists, there was a dominant elitism coming from the smaller group. Discrimination on the grounds of religion was commonplace, she discovered. In many cases, this included bullying, name-calling, discrediting individuals or groups, and even passing them over for jobs.25

Or ask Tom McLeish, decorated physicist and fellow of the Royal Society who, in 2017, dared to write a book about his Christian faith entitled Let There Be Science: Why God Loves Science, and Science Needs God. Influential American biologist Jerry Coyne was having none of it, and was especially enraged by McLeish’s significant role in public science. In his review of the book—in which he admits that he has not actually read it—Coyne sneered, “Chair of the Royal Society’s education committee? What the bloody hell is a theist doing in that position?”26

His readers—who number in the tens of thousands—then proceeded to stir themselves into a fury at the thought of a practicing scientist having a faith. One of them even asked Coyne directly, “don’t you weary of fighting these assholes?”

Conflict and Warfare, it would seem, have become dangerously self-fulfilling prophecies—it can get quite horrible out there at times.

Which, by the way, is the last thing that Draper and White would ever have wanted.

A Twist in the Tale

Draper himself was no atheist; neither was White. They were not even agnostic. In fact, both thought of themselves as followers of Christ, and viewed their books as significant contributions to his cause. They were writing, the two of them said, not to push science and religion ever further apart, but instead to bring them both back together.

Here is White, in the preface of Warfare: “My conviction is that Science . . . will go hand in hand with Religion.” Indeed, he preaches like a revivalist, extolling the teachings of Jesus:

Religion, as seen in the recognition of “a Power in the universe, not ourselves, which makes for righteousness,” and in the love of God and of our neighbor, will steadily grow stronger and stronger, not only in the American institutions of learning but in the world at large. Thus may the declaration of Micah as to the requirements of Jehovah, the definition by St. James of “pure religion and undefiled,” and, above all, the precepts and ideals of the blessed Founder of Christianity himself, be brought to bear more and more effectively on mankind.27

Draper, for his part, also saw the historical God-versus-science battle as unnecessary—in Conflict, he called for “a friendship, that misunderstandings have alienated, to be restored.”28

As Numbers has already explained, though—and as we have already begun to detail here—that is precisely the opposite of what actually happened.

Of Popes and Unicorns

We are left, then, with quite a few questions. If these two men intended to reconcile God and science, then how did they manage to make such a big mess of it? If they were both smart, well read, and respected, then how did they come to write such error-strewn manuscripts? If their books were indeed packed with fables and misinformation, then how did they ever gain such a firm foothold among the educated upon their release?

We can keep going: why does the conflict thesis hold such a strong grip now, more than a century after Conflict and Warfare were written? How is it that so many elite academics—think Kroto and Coyne, for instance—continue to endorse their wrongheaded ideas? If they were so famous and successful at the time, then why has hardly anyone today even heard of Draper, or White, or their two books? And, for that matter, what actually is the truth about science and religion across history?

These questions can lead to still more: Why did so many prominent scientists—Kepler, Boyle, Faraday, Maxwell—attribute their scientific advances so directly to their faith? Why are the New Atheists so sure that the two are incompatible? What makes historians of science, like Numbers, say that the conflict thesis is drivel? Do we have to choose between God and science, or not?

This book, then, will be the story of two books. We shall look at their authors, and at the world that they lived in. We shall look at their content, and at where they went wrong. We shall discover their extraordinary influence as we see their dubious claims repeated, again and again, in the work of countless commentators nearer our own time—commentators, by the way, who should really know better. We will correct their false narratives. We will chase down the truth.

This book is the story of flat earths; of dissection, of autopsies, and anesthetic; of creation and evolution; of laser-eyed lizards and infinite worlds; of miracles and of man-made mountains; of gods; of popes and unicorns. It is the story of comets and mathematicians; of souls and libraries; of the Greeks, the Enlightenment, and the Not-So-Dark-After-All Ages. It is the story of Galileo, of hot dates on spacecraft, and of immortal peacocks. It will be a journey through time, through cultures, through ideas, through personalities, and—ultimately—through the human condition.

And, perhaps, it will result in what we are all, ultimately, looking for: no more Conflict; no more Warfare.
