One
The Tiger’s Eye
My life shrank slowly—until one day it became too small.
I used to feel that, in stitching me together, the Universe had rather gone out of its way. To know that my atomic makeup was forged in the furnaces of the primal starfield and that my childhood geography was shaped by glaciers dragging themselves south through Wisconsin for centuries—well, it all suggested a lot of effort. I felt a duty to make that work count, to change history for the better, to “bestride the narrow world like a Colossus”—or, at least, to do more than watch epic amounts of television.
Under the influence of the 1980s band Survivor, I adopted the Eye of the Tiger. I would change the world—and I’d enjoy doing it. Yet one day I woke to find that carpe diem! had been replaced by “How many emails can I answer before noon?”
It happened so gradually. When at last I noticed, I looked around with a shock to find that my life had taken on a certain indoor quality—comfortable but quiet and air-conditioned. Perhaps it was just a case of “becoming a realist with a mortgage,” but it felt more deflating.
Technology did not cause this lifestyle contraction, but it served as means and enabler. I was the product of a revolution—in communication, media, and practically everything else—that had driven us all indoors during the past century. Even in a “friendly” midwestern town of immaculate houses, I didn’t meet my neighbor for 3 years. Our encounter required another intervention of the Universe: a freak blizzard that drew us all outside to clear several feet of snow from our driveways. My neighbor had until then existed for me as a head inside a Lexus, but he now burst forth in triumph from his garage astride the largest yellow snowblower I have ever seen. The winter’s disruption sparked some dormant sense of community, and my neighbor offered to clear my drive. But we spoke only for a moment; the snarl of the machine prevented further connection. I went back inside, pleased with the help but wondering if life might not be better had I shoveled alongside my neighbor, in amiable conversation, for an hour.
The Internet allowed me to write from home, and I made a fair living as a technology journalist whose commute could be as short as sitting up in bed to crack open the laptop. There was peace and freedom in this, and there was limitation. It required effort to make new friends; it required effort simply to escape the house. After a decade working from home, my days were spent before computers, my nights spent before televisions, and my free moments spent itching to unlock a tablet or mobile phone.
My grandparents had farmed the fertile fields of Iowa, slaughtered hogs, and sold Ethan Allen furniture. They lived into their nineties without ever watching more than Wheel of Fortune and the evening news. They had little need for screens, yet my life had become screens. Without them, I could no longer read the paper, write a letter, play a game, talk to friends, work a job, check the weather, make a to-do list, read a novel, listen to music, watch a movie, or take a photograph. The world’s tactile richness was reduced to polished pieces of glass.
Atoms, galaxies—the Universe is ever restless. Yet there I was, day after comfortable day, sitting still. I sat on couches, beds, and chairs. Some days I mixed things up by moving from the couch in the basement to the couch in the living room or from the chair at my desk to the chair in my bedroom. I became a champion seat-sitter, a connoisseur of the sedentary, the kind of person with strong opinions about lumbar support and chair height. The gap between the raw activity of my grandparents’ life and my own tap-click existence had become a canyon.
While doing all that sitting, I lost the ability to say “no” to words. Each day I moved my eyes across tens of thousands of them—hundreds of emails, dozens of news articles, several long-form magazine pieces, a continuous stream of text chat from my colleagues. I was living what, as long ago as 2009, researchers at the University of California–San Diego had described: “Americans consumed information for about 1.3 trillion hours, an average of almost 12 hours per day,” which corresponded to “100,500 words and 34 gigabytes for an average person on an average day.” The numbers seemed unbelievable—but, judging from my own days, not by much.
An alarming number of my dinner comments began with the phrase, “I was just reading an article about that . . .” As a former graduate student in English Lit, I felt terrible about this because my growing information inhalation coincided with an inability to get through books. Yet I had always loved books. My shelves buckled beneath the weight of P.G. Wodehouse, Graham Swift, and W.G. Sebald, yet much of the material I spent my time reading each day was two steps above a Buzzfeed “Which Hogwarts house are you?” quiz. Words had become a fire hose, blasting me constantly in the face. They were exhilarating but exhausting; when you need a drink, a single glass of clean water offers more refreshment.
At the end of a day spent among those 100,500 words, I felt incapable of sustained focus. My screens then offered their own remedy: an unending stream of video content. This came with costs, including an even stronger sense of information overload. In the bad old days, the live TV schedule had simply washed over me; I felt no responsibility for it. Now, watching a show was a matter of choice, and the completist in me had to watch every episode. I began to dread multi-season shows. “Too much of a commitment!” I thought. As soon as one show was complete, ten others jostled to take its place.
Still, sustained watching was easier than sustained reading, and I feasted like a vegetarian in a tofu factory. I streamed every episode of Lost and spent the next year telling my wife that anyone who wrote for the final three seasons should be banned from Hollywood. I binged on complete runs of Friends, Frasier, and 30 Rock (twice). I even watched Gordon Ramsay shout his way through four seasons of the “reality” show Hell’s Kitchen, an act for which I am still doing penance.
I wallowed in more material than I could work through in a dozen lifetimes: 40-hour video games, board-gaming podcasts, and Bob Dylan albums; iPad apps to try, Kindle books to read, and Epicurious recipes to bake; British murder-mystery TV shows, interactive New York Times explainers, audiobook versions of The Odyssey narrated by Ian McKellen. The new technologies of abundance stood ready to engage my attention anytime, anywhere, on any device.
But my attention, the searchlight of consciousness, also came under assault by new technologies of interruption: text messages, group chats, Slack messages, emails, and app notifications. It reached a point where I could go only minutes without something on my computer or mobile phone beeping, buzzing, or otherwise intruding. And I didn’t turn any of this off, because it felt good to be given this succession of tiny gifts, each one a surprise. In a world where people checked their phones 80 times a day—or every 12 waking minutes, on average—the simple act of focus became countercultural.
Looked at in one way, I was living the dream: My life demanded little physical exertion, it required no risks, and it piped endless information and amusement right to my eyeballs. It was, in that favorite word of Silicon Valley CEOs, “frictionless.”
But seen from another angle, I had tap-clicked my way into a lifestyle of comfort, abundance, and immobility—and found it intolerable. Perhaps you have felt the same discomfort, looking up from yet another spam email to wonder: What has become of the wonder and danger of life? What has become, in other words, of our collective Eye of the Tiger?
An Unexpected German
Back in college, I had read—or, to be strictly accurate, passed my eyes across the pages of—a fair bit of German philosophy. I found it . . . not quite to my taste. Reading it taught me three things:
1. If you’re going to doze off during lectures about Feuerbach, don’t sit in the front row.
2. Schopenhauer is a huge bummer.
3. Plowing through both Kant and Hegel in the same week is like being raked across the face by barbed wire.
These negative experiences, which were goosed by additional reading in (shudder) Heidegger and (twitch) Adorno, gave me solid justification for avoiding someone like Friedrich Nietzsche. My life had enough problems without German philosophy, which was often stupefying even when its authors hadn’t gone insane.
Yet Nietzsche’s books had such delicious names—and his pithy quotes showed up in such a surprising number of Etsy craft shops—that I couldn’t quite dismiss him. Kant wrote the impenetrable Critique of Pure Reason, while Hegel penned the impenetrable Phenomenology of Mind—but Nietzsche wrote books with cheerfully reassuring titles like The Gay Science and Daybreak. And Nietzsche was the kind of bomb-thrower who could write “I am no man, I am dynamite!”—and mean it.
So during one of those aspirational moments when I put down the laptop and picked up a book, intent on Improving My Mind, I cracked open Nietzsche’s late work Twilight of the Idols. It seemed a promising place to begin.
For one thing, it was written in the last months of Nietzsche’s sane life, and it functions as a sort of “greatest-hits” record of his philosophy. As everyone who ever listened to 1980s hair metal knows, you desperately want to avoid most bands’ full albums, which are stuffed with filler. On the other hand, the greatest-hits record collects all those power ballads you actually want to hear. Given Nietzsche’s prodigious mustache and wavy locks, I figured that what was true of hair metal might also apply to nineteenth-century German philosophy.
Second, Twilight of the Idols is short. Do not undersell the power of brevity. For instance, when he was a young man, my father once shanked a tee shot so badly that it passed through a stand of pine trees and hit another golfer on the posterior just as the unfortunate fellow bent over to line up a putt. My Uncle Jack, who related this story with gusto at every family gathering I ever attended, eventually whittled it down to a single look—pop-eyed with a pursed mouth and arched eyebrows—that stood in for the surprised golfer at the moment of impact. This look could be deployed without narrating a single word of the broader tale, and the family would reliably dissolve in laughter.
Third, Twilight of the Idols bears the undeniably terrific subtitle How to Philosophize with a Hammer. Dispel any vision of smashing away with sledgehammers, however; the image is of a mallet tapped like a tuning fork against the great idols of our culture. Will they ring out with a clear, sweet note or with “that famous hollow sound which speaks of inflated bowels”? Nietzsche’s hammer sounds the common wisdom of the human herd and finds it wanting—especially the pervasive idea that “the good life” is “the easy life.”
So I dove in and read, on the very first page, “If we possess our why of life we can put up with any how. Man does not strive after happiness; only the Englishman does that.”
Utilitarians such as British philosopher John Stuart Mill were Nietzsche’s target. Nietzsche objected to Mill’s ethical emphasis on providing the greatest good to the greatest number, which seemed to him like an argument for making the most possible people “happy.” But Nietzsche’s own recipe for happiness lay in not having to be “happy” all the time. Ease, comfort, pleasure—they are all fine as far as they go, but they are certainly not life’s point; creative exertion, even struggle, makes life matter. Every revolutionary, from Jesus to Che, can tell you that.
This did not sound insane. Indeed, it was terrific stuff, and what’s more—Nietzsche could write. (He is often listed as one of the greatest German prose stylists after Luther and Goethe. Indeed, he says this of himself.) From across a gap 130 years wide, Nietzsche called out to me.
I raced through Twilight of the Idols. Next up was On the Genealogy of Morals, a three-essay set that was provocative in the best way, and The Anti-Christ, which was provocative in the worst. Then came Ecce Homo, the autobiography finished weeks before madness clouded Nietzsche’s mind; it features chapter titles like “Why I Write Such Great Books” and “Why I Am So Clever.” Next came the Gospels-meet-self-help mashup Thus Spoke Zarathustra, which was exciting enough that the German government issued an edition to soldiers during World War I. Then came The Birth of Tragedy, a piece of fascinating philosophical propaganda favoring some very specific Greek tragedians while blaming Socrates for sending Western culture off the rails. I plowed through the aphoristic works Beyond Good and Evil, Daybreak, The Gay Science, and Human, All Too Human. (Do not read them, as I did, straight through.) Finally, I dove into the Untimely Meditations, in which the young Nietzsche argued passionately that knowledge was not a good in itself; it was only good if it helped us live. This was relevant, as my browser history reminded me that I had spent half an hour that morning clicking through articles on myxomatosis. I have never owned a rabbit.
I stopped haranguing my wife about the failures of the Lost writers’ room, and I started haranguing her about Things Nietzsche Had to Say On Whatever Topic You Just Mentioned. She endured it with long-suffering generosity.
I had plenty of material for these uninvited harangues because Nietzsche made pithy comments about everything, including God (dead), morality (immoral), and his own abilities (amazing). He had thoughts on the early Wagner (sublime), the later Wagner (a chump), enemies (valuable), suicide (freedom), and sitting still (the real sin against the Holy Spirit). He loved evolution but hated Darwin; he hated Christianity but put up with Jesus.
And he had the most fascinating, terrible life.
Nietzsche was born in the town of Röcken, Prussia, in 1844. His father, Carl Ludwig, was a Lutheran pastor who died 5 years later at the age of 36. His death was due to an unspecified “brain” issue; Nietzsche would always wonder if madness or illness ran in the family line.
Young Friedrich was raised in a household of women that included his sister, Elisabeth, his mother, a grandmother, and a pair of aunts. When he gained a scholarship to the famous boarding school Schulpforta in 1858, he moved away from home for most of the year and embarked on an old-fashioned educational program that was infamous for its rigor. As a 14-year-old, Nietzsche began his Schulpforta lessons daily at 6:00 a.m. and was encouraged to speak to the other boys in Latin and Greek.
Nietzsche’s own health, even as a teenager, was never good. His name is listed 20 times in the Schulpforta illness log between 1859 and 1864, and Nietzsche required a week on average to recover from each episode. Treatments at the time, despite the school’s international reputation for excellence, remained as retrograde as some of the teaching methods. “At Pforta they were treating Nietzsche’s ghastly episodes of chronic illness, his blinding headaches, suppurating ears, ‘stomach catarrh,’ vomiting, and nausea with humiliating remedies,” writes biographer Sue Prideaux. “He was put to bed in a darkened room with leeches fastened to his earlobes to suck blood from his head. Sometimes they were also applied to his neck.” Nietzsche suffered through. Though the school doctor suggested that blindness lay ahead, Nietzsche wore dark glasses to protect his aching eyes and continued his studies.
Schulpforta was not a “fun” place for a child to learn, especially one not into the culture of sports, hikes, and patriotism, but it did give Nietzsche a solid educational grounding in Hebrew, Greek, Latin, theology, and German literature. He took his degree and went off to university at Bonn, where he drank beer with a proto-fraternity called Franconia, gained a small dueling scar on the bridge of his nose, and lost his Christian faith. Nietzsche’s discipline was philology, which at the time was a mix of history, literature, and languages, and he wrote long papers on obscure Greek topics. When an influential professor of his left Bonn for Leipzig, Nietzsche realized what a waste his sabre duels and beer drinking in Bonn had been, and he headed for Leipzig, too, to get serious about philology.
This proved a wise choice. Nietzsche so excelled at his work that his professor recommended him in 1869 for the open chair of philology at the university in Basel, Switzerland—and Basel made Nietzsche an offer. Nietzsche was just 24, the youngest man ever to be offered a teaching post at the school. He hadn’t even earned his degree yet. With a prestigious job offer already in hand, however, Leipzig granted Nietzsche his doctorate in March 1869 and shipped him off to Switzerland.
Despite his own bouts of ill health, Nietzsche took a leave of absence to serve as a medical orderly in the 1870 Franco-Prussian War, where he promptly contracted typhoid and diphtheria. He was obviously a genius, but he could never bring himself to write as a professor “should,” and his 1872 oddity The Birth of Tragedy was not the first book expected of a promising scholar. It was an embarrassment to the scholastic guild, and few students signed up for Nietzsche’s seminars after it appeared. His migraines, vomiting, eye pain, and insomnia grew so bad that he relied on friends and family to read to him and to take his dictation; he missed long periods of class and walked around Basel with tinted spectacles and a green eyeshade. He began to feel that philology was sapping his strength—he had been pushed too early into the herd of “scholarly oxen.”
Out of this crisis came Human, All Too Human in 1878. It was a highly readable work, dense with gnomic aphorisms that forced readers to think through issues along with Nietzsche rather than lapping up spoon-fed conclusions. The book explored the ways in which all-too-human needs and experiences lay behind most transcendental ideals. It had nothing to do with Nietzsche’s nominal discipline of philology; his colleagues wondered what the young professor thought he was doing.
One of the ideals Nietzsche came to loathe was ease. He saw politicians of his day promising a comfortable life to everyone. But at what cost?
If the enduring homeland of this good life, the perfect state, were really achieved, it would destroy the earth from which a man of great intellect, or any powerful individual, grows: I mean great energy. When this state is achieved, mankind would have become too feeble to produce genius any longer. Should we not therefore wish that life retain its violent character, and that wild strengths and energies be called forth over and over again?
Nietzsche’s shadow side emerges here—his characteristic redemption of violent impulses and savage energy as the motive force behind full-fledged human excellence. The ways in which he puts these views can veer quickly from “provocative” to “unsettling.” It is also clear that he was no democrat. His interest was less in the mass, the commoner, the herd than it was in the individual, the genius, the exception. We will have to peer more carefully into these attitudes later, but the point is clear enough: The good life is not the enervating comfort of respectability and ease.
Nietzsche himself gave up such a life the next year. In 1879, under the pressure of illness and despair, he resigned his chair in Basel and became an intellectual vagabond on a modest pension. He sold most of his things. Hauling his few trunks of belongings between the homes of friends and the more affordable sort of European guesthouse, Nietzsche never worked a “job” again. Instead, he sought places where he might feel well, scribbling thoughts into notebooks during long walks in the Alps and along the Italian coast. Aphorisms suited him, because he could work on them between attacks that might leave him retching and helpless for days. Doctors continued to warn Nietzsche that blindness was a likely result of his tremendous eye pain; they counseled him to stop reading and writing. He could limit his reading, but writing had a hold on him that no doctor could break.
Nietzsche published Daybreak in 1881, refining his mastery of the aphorism as he wondered what kind of humans might be produced by a lifetime of comfort. “Are we not,” he asks in the book, “with this tremendous objective of obliterating all the sharp edges of life, well on the way to turning mankind into sand? Sand! Small, soft, round, unending sand!”
Looking at his contemporaries, Nietzsche saw an almost religious obsession with ease. Comfort had become an undisputed good, even for many German Christians. In 1882, a year in which he pursued a one-sided romance that proved both farcical and devastating, Nietzsche published The Gay Science and again preached against this attitude:
If you experience suffering and displeasure as evil, hateful, worthy of annihilation, and as a defect of existence, then it is clear that besides your religion of pity you also harbor another religion in your heart that is perhaps the mother of the religion of pity: the religion of comfortableness. How little you know of human happiness, you comfortable and benevolent people!
There wasn’t much comfortableness for Nietzsche. His health remained poor, almost no one read his books, he had no job, and his sister, Elisabeth—whom he playfully called “Llama”—got engaged to an Aryan supremacist named Bernhard Förster. Nietzsche hated bigots, and so he hated Förster; when Elisabeth married him in 1885, Nietzsche did not attend the wedding. The newlyweds moved to Paraguay, where they oversaw an explicitly racist colony called Nueva Germania. Nietzsche cut off contact.
He still wrote furiously, putting out a major project a year despite his ever-shifting maladies, but his vaunted productivity came to look increasingly like mania. In late 1888, when Nietzsche claimed to feel a surge of good health and high spirits, he wrote not one work but four, each increasingly visceral and erratic. Weeks after finishing his self-justifying autobiography Ecce Homo, which argues that he was one of the greatest human beings to ever live, Nietzsche mailed a set of extremely odd Christmas letters. Many of them stressed that he was about to become the most famous person on Earth.
On January 3, 1889, just 44 years old, he collapsed in the winter sunshine of a Turin piazza. (The story—perhaps apocryphal—goes that he saw a local cabman whipping a horse, and Nietzsche threw himself in front of the whip and then fell to the ground.) After he was taken back to his rented rooms, Nietzsche was never truly lucid again. A friend from his Basel days came down to Italy and took Nietzsche home by train. Nietzsche spent the following 11 years in obvious mental agony, both in a Jena asylum and at his home. His sister returned from Paraguay and eventually gained control over both her brother’s body and his works. Elisabeth created the Nietzsche Archive to raise her brother’s stature, and she published a Nietzsche biography with obvious inaccuracies. She collected and controlled access to his letters, then published The Will to Power from material left in Nietzsche’s notebooks. She began to host literary salons in the family home even while her brother groaned and roared upstairs. During this long period of insanity, Nietzsche became famous; his books sold across Europe and money poured in. He knew nothing of it.
Nietzsche finally died in August 1900. Elisabeth, who continued to distort his legacy for decades, survived until 1935, helping to turn Nietzsche’s thought into a kind of proto-Nazi ideology of domination. Adolf Hitler attended her funeral.
Nietzsche’s life was often sad, sometimes grotesque, but his undeniable talent was forging genius from his own pain.
As an academic prodigy, Nietzsche looked set at 24 for a comfortable life in a comfortable town, well paid and well respected, his schedule regulated like a Swiss clock. Over the coming decades, he planned to tunnel through an avalanche of words, excavating a scholarly path forward with help from the library and the bookshop. His labor would be mental and verbal, largely sedentary, the interiority broken only by weekend train trips into the country to visit Richard Wagner or to climb a local peak.
But thanks to both temperament and biology, Nietzsche turned his back on these plans. His personality found little joy in the academic world he had proved so gifted at inhabiting, while his subterranean illness burst forth so often that the “religion of comfortableness” was not a live option. If life’s true joys could only be found in hedonism and control, then Nietzsche was doomed, for he could experience neither.
As a professor, Nietzsche felt crushed beneath the weight of information overload. He thought endlessly about his relation to what we might today call “content.” He sought freedom—and his own voice—through an embrace of rereading, restriction, and forgetting.
Nietzsche broke free of his job and reputation when he left Basel after a decade, but he also broke free of the regimented interiority of so much knowledge work. He engaged the world with his body and not just with his rational mind, learning to think in powerful new ways that relied on movement, emotion, and desire. Despite ill health, Nietzsche wrote more passionately about joy than has any other philosopher I have ever read. In all three of these ways, Nietzsche felt sharply and surprisingly relevant to me, and his diagnoses of his own problems sometimes seemed like diagnoses of my own.
The Second-to-Last Man
In 1883, Nietzsche made his most extended pitch against the easy life, and he did it in the form of a parable.
Thus Spoke Zarathustra has become Nietzsche’s most famous book. In it, the ancient prophet Zarathustra descends from his mountain cave. (Zarathustra is simply another name for Zoroaster, the Persian sage behind the ancient Zoroastrian religion.) In language heavy with biblical allusions, he attempts to interest the local villagers in his philosophy. It . . . does not go well.
It is time for humanity to set itself a new goal, Zarathustra tells the befuddled villagers.
The time has come for man to plant the seed of his highest hope. His soil is still rich enough. But one day this soil will be poor and domesticated, and no tall tree will be able to grow in it. Alas, the time is coming when man will no longer shoot the arrow of his longing beyond man, and the string of his bow will have forgotten how to whir! I say unto you: one must still have chaos in oneself to be able to give birth to a dancing star. I say unto you: you still have chaos in yourselves!
Zarathustra wants to call forth this productive chaos. Only through motion and power and risk can one find the energy to become something new.
Now, Nietzsche has a reputation as a negative thinker; he’s counted as one of the great “Masters of Suspicion” along with Marx and Freud. Everything that a culture takes as obvious—that religion is good, for instance, or that pity is noble—Nietzsche wants to interrogate. His answers are often the inverse of society’s valuation, which is why he calls his project the “Revaluation of All Values.”
But Nietzsche’s ultimate goal is positive. It involves saying a great Yes to life even in the absence of transcendent meaning or purpose. He preaches—and “preaching” is certainly the word for much of this—the best possible version of humanity. This new kind of human is referred to as the Übermensch, a word variously translated as “overman” or “beyond-man” or (unfortunately, once Clark Kent appeared) “superman.” It is this Übermensch that Zarathustra preaches to the village.
But this is not the inevitable future of individual and collective humanity. Nietzsche thinks it equally likely that humans will retreat from the difficult path forward and take refuge in the “most contemptible” form of being, which he calls the “last man.” This way of life values safety, ease, health—values that can come at the cost of striving, risking, and creating. It is the sort of life in which even one’s vices are mundane. Grand passions have been extinguished—one cannot even fight well anymore!—and people indulge their weaknesses rather than discipline them. The result is a search for something contemptible: comfortable contentment.
“Nietzsche regards the failure to draw a distinction between happiness and contentment as especially disastrous,” writes Nietzsche scholar Michael Tanner. “For him, the only happiness worth having is that which is the by-product of strenuous efforts in various directions, efforts undertaken without a thought for the happiness they might produce.”
Zarathustra therefore warns the villagers:
The earth has become small, and on it hops the last man, who makes everything small. His race is as ineradicable as the flea-beetle; the last man lives longest. “We have invented happiness,” say the last men, and they blink . . .
One still works, for work is a form of entertainment. But one is careful lest the entertainment be too harrowing. One no longer becomes poor or rich: both require too much exertion . . .
One is clever and knows everything that has ever happened: so there is no end of derision. One still quarrels, but one is soon reconciled—else it might spoil the digestion. One has one’s little pleasure for the day and one’s little pleasure for the night: but one has a regard for health.
One of the curiosities of Zarathustra’s argument is that the “last man” is not a race of cosseted weaklings that will simply die out. No, the lifestyle of the “last man” works. Its emphasis on safety, society, health, and ease produces a class of beings that, like beetles, are almost ineradicable. But who wants to be a beetle?
The crowd does.
“Give us this last man, O Zarathustra,” the villagers shout back at him. “Turn us into these last men!”
They are not wrong to want such things. To those who have labored without ease or security or those who have lived in isolation or with illness, the goods of the “last men” are worth pursuing. But I think Nietzsche is right to suggest that perhaps these goods are only way stations on the human journey, not the terminus.
As I read this passage, I recognized that my life and longings had indeed become cramped, degenerating into a quest for mere “happiness.” I worked but did not strive; I read but only became clever; I regarded my health to the point of neuroticism. Had I become, staring at my screens day and night, one of those who heard about higher and harder things and simply “blinked”? Was I like the villagers, who heard Zarathustra’s “despicable” description of the last man—and loved it?
If not yet the “last man,” then I was something close. Call me the “second-to-last man.”
Too Much to Know
We live, as the poet W.H. Auden noted years ago, in The Age of Anxiety. One recent symptom: the profusion of weighted blankets—an Amazon search brings up 3,000 results—that offer relaxation beneath their comforting constriction.
Imagine yourself in bed beneath an especially heavy version of one such blanket. You wish to rise up to carpe the diem, but you can’t even raise an arm. The blanket does not weigh enough to smother, but it cannot be kicked free, shaken loose, or shuffled out of. It traps you, comfortably immobile, in the softest possible prison.
For many of us, this is how “information overload” feels. It is a kind of smothering—not a spur to action but an inhibition of it. Conditioned by science, the Enlightenment, and a commonsense idea that “knowing what you’re doing” is better than “not knowing what you’re doing,” many modern cultures fetishize information. Does anyone today feel dumber than when they make a mistake for which they could have just Googled the answer?
Any attempt to get up off the couch of the “last man” and take risky action in the world must confront the challenge of information, which today threatens to overwhelm, distract, and derail us. “Information has become a form of garbage,” writes the communications professor and critic Neil Postman in his 1992 book, Technopoly, “not only incapable of answering the most fundamental human questions but barely useful in providing coherent direction to the solution of even mundane problems.”
Technology made this possible. It gave us the scroll and its linen, the codex and its vellum, the press and its paper, the smartphone and its screen. With each innovation, the past piled up around us. Consider the sheer volume of the information that fills our mental rooms: Text messages. Podcasts. Snapchat. Netflix. Facebook. Instant messages. Emails. New books—hundreds of thousands a year. Old books—Google has digitized the contents of entire university libraries. Fifty million albums of streaming music. Every magazine article ever published. The 6 million articles (just in English!) on Wikipedia. YouTube. The New York Times—and its massive archive. Every other major world newspaper. The entire World Wide Web. And so little of it is necessary to live. Without some criteria for using, sifting, and shaping it, our information-rich past and present may simply press on us until we live like hoarders, surrounded by detritus but treating it like treasure.
Even ancient authors complained about too many books and not enough time. But the issue became acute during the Renaissance. As Harvard historian Ann Blair notes in her book Too Much to Know, “The discovery of new worlds, the recovery of ancient texts, and the proliferation of printed books” made Renaissance knowledge into something truly overwhelming.
Perhaps it’s no surprise, then, that thinkers such as René “Cogito Ergo Sum” Descartes tried to burn the past down, building up a philosophy using only the universal human experience of conscious thought. Why spend years laboring through Latin declensions, the Greek alphabet, or the Hebrew hiphil when you could ground everything in “I think, therefore I am”? No books, no tongues, no God—not yet, anyway. Just a man, thinking his way through a winter night in 1619.
Descartes’s own frustration with attempts to wring the world’s information from its store of books is palpable. “Even if all knowledge could be found in books,” he wrote, “where it is mixed in with so many useless things and confusingly heaped in such large volumes, it would take longer to read those books than we have to live in this life and more effort to select the useful things than to find them oneself.” And this even though Descartes didn’t have TikTok videos to distract him.
Things only got worse after Descartes’s day, and Nietzsche felt the acute burden of too much information. He saw clearly how one’s life could be spent exploring the most distant tributaries of knowledge. As a philologist, he knew that mastering the ancient Greeks was a lifetime task—to say nothing of everything written since.
After a decade spent teaching the youth of Basel, both at the university and at the local Pädagogium, Nietzsche was fed up. What mind could think under the pressure of everything that one “should” read, see, or hear?
“The sum of sensations, knowledge, and experiences, the whole burden of culture, therefore, has become so great that an overstraining of nerves and powers of thought is a common danger,” he complains in Human, All Too Human, a book that he said was dictated with his head bandaged and in pain. “A diminution of that tension of feeling, of that oppressive burden of culture, is needful, which, even though it might be bought at a heavy sacrifice, would at least give us room for the great hope of a new Renaissance.”
Reflecting on this period in his later autobiography, Nietzsche saw himself in thrall to a cramped and useless approach to knowledge. He and his fellow “scholarly oxen” spent their lives like ruminants who grazed only on books, but to no life-giving purpose. After a decade of this work, what had been mastered? Fragments of ancient drama and philosophy that gave few answers to Nietzsche’s own questions—or to his own suffering.
“Ten years behind me during which the nourishment of my spirit had quite literally been at a stop, during which I had learned nothing useful, during which I had forgotten inordinately much over a trash of dusty scholarship,” Nietzsche complained.
Creeping meticulously and with bad eyesight through antique metrists—that is what I had come to!—I was moved to compassion when I saw myself quite thin, quite wasted away: realities were altogether lacking in my knowledge, and the “idealities” were worth damn all!
What Nietzsche came to see, in the futility of his quest to master knowledge, was that modern people need (as we might put it today) a practical theory of information management. This is not a luxury for eggheads; it is an important component of a goal-directed life. Nietzsche tackled the issue in his early essay “On the Uses and Disadvantages of History for Life.” One feels the personal anguish radiating through his introduction, where Nietzsche wonders if too much time given to study might not cripple one’s life. He senses this intuitively, but he fears being ungrateful for the truly impressive gains in knowledge collection and transmission since the Renaissance; after all, one does not want knowledge and culture destroyed. Yet he cannot shake his “tormenting feelings” that overstuffing one’s life with information might be “injurious to it, a defect and a deficiency.”
Nietzsche concludes that “instruction without invigoration” and “knowledge not attended by action” are a luxurious waste that must be “hated by us.” The point of life—even a scholarly life—is not to get through content; it is not to master material. It is to learn how to live. And then to actually live.
Looking around at philology, Nietzsche concluded that much of our study of the past was frivolous. “We need history, certainly,” he wrote, but:
We want to serve history only to the extent that history serves life: for it is possible to value the study of history to such a degree that life becomes stunted and degenerate—a phenomenon we are now forced to acknowledge, painful though this may be, in the face of certain striking symptoms of our age.
Such “striking symptoms” of information disorder are easy to spot in our own age. Ask anyone who plops into an armchair after dinner, phone in hand, hoping to vegetate for 10 minutes—but who looks up to find an hour has passed. As a one-off event, this may simply be “human, all too human” of us; as a regular occurrence, it looks pathological. If, as Annie Dillard once wrote, the way we spend our days is the way we spend our lives, then many of us are spending our lives consuming content of fairly dubious long-term value.
For me, it was streaming video. Not because I was watching an unusual amount of it—compared to most Americans, at least, I watched moderate amounts—but because it became both an uncomfortable comfort and a beautiful burden. It became so easy to watch “one more episode” that I often awoke with a start at 3:00 a.m., finding that I had fallen asleep on the couch during one too many episodes of Breaking Bad. Stumbling upstairs with foul breath and a foggy mind, I did not feel liberated by technology. Yet when the next night came, I did it again.
Watching TV in the bad old days, when you had to take whatever was on and put up with advertising breaks, had at least this virtue: It could wash past you. It was, by its nature, fleeting. But streaming TV meant the cavalcade of comforting, ad-free images never quite came to an end. The next episode awaited; a new “must watch” show always beckoned. The more I watched, the further away the end of my queue looked. I felt active anxiety about adding shows that ran for more than three seasons to my watchlist; who could handle 150 more episodes of anything?
Was this abundant life—or was it “stunted and degenerate”?
Trying to “keep up” with culture has always been hard; in the Information Age, it is a delusion. In the end, our best attempts to master information make us mere tourists in the kingdom of knowledge. In The Gay Science, Nietzsche reckoned with his own desire to keep up with information. He faced the truth that this is already impossible. In fact, keeping up grows increasingly impossible with every new piece of knowledge or culture.
“Perhaps we philosophers, all of us, are badly placed at present with regard to knowledge,” he writes. “Science is growing; the most learned of us are on the point of discovering that we know too little. But it would be worse still if it were otherwise—if we knew too much; our duty is and remains, first of all, not to get into confusion about ourselves.”
That’s the heart of the approach that Nietzsche develops toward knowledge: Know yourself. Know your limits and embrace them. Reject the burden of any information that does not contribute to living your life. That means setting aside huge swaths of content. Nietzsche’s theory of information management emphasizes slow reading, rereading, not reading—even forgetting. To truly live, we must be willing not to know many things.
Separating the information that gives life from the information that kills became Nietzsche’s obsession in the years after his eyes failed him and he stepped back from the academic life. His eye pain and migraines eventually “put an end to all book-wormishness,” he wrote later. “For years I ceased from reading, and this was the greatest boon I ever conferred upon myself!”
For those schooled in societies that value education to an almost pathological degree, this can be difficult. It was difficult for Nietzsche, too, whose identity had been so tied to his scholarship. But in his enforced info-cocoon, and the physical pain that accompanied it, Nietzsche found his mind at work on new problems that meant something to him. He found, in a word, himself.
“That nethermost self, which was, as it were, entombed, and which had grown dumb because it had been forced to listen perpetually to other selves (for that is what reading means!), slowly awakened; at first it was shy and doubtful, but at last it spoke again,” he later recalled. “Never have I rejoiced more over my condition than during the sickest and most painful moments of my life.”
In the section labeled “On Scholars” in Thus Spoke Zarathustra, Nietzsche reflects on the academic identity he left behind in Basel.
I moved out of the house of the scholars, and I even slammed the door behind me. Too long did my soul sit hungry at their table; I am not, as they are, trained to pursue understanding as a kind of nut-cracking. . . . I am too hot and burned by my own thoughts: often it almost takes my breath away . . . but they sit coolly in the cool shade: they want in all things to be mere spectators and are wary of sitting where the sun burns down upon the steps. Like those who stand in the street and gape at the people passing by, they too wait and gape at thoughts that others have had.
This is Nietzsche in a nutshell: the call to stop sitting in the shade, to stop being a spectator to one’s own life, to think, to speak, to live. It rang true to my own experience, as I had left a graduate program in literature some years before for the same reasons. My experience began in the joy of great books, but it ended among seminars on the daily doings and religious politics of the Long Parliament during the English Civil War of the 1640s. I was pursuing knowledge as if it were nutcracking, digging out the nutmeats to form a tiny hoard of high-protein esoterica. Out of it I could cobble together papers on Renaissance history, theology, or politics—occasionally even on literature. Professors argued whether any of this should even have a point or whether we studied the past simply for its own sake. My soul sat hungry; after 4 years, I too left their house with the door banging behind me.
This position is not anti-intellectual. Even in the years of his solitary wanderings around Europe on a minimal pension, Nietzsche lugged a massive trunk that he called the “clubfoot”—filled entirely with those books he could not bear to be without. But his approach is intensely realistic about the frailties of our “human, all too human” minds and life spans. It puts information in its proper place, serving life rather than becoming an end in itself.
Sit as Little as Possible
Imagine an English evening in the late 1700s. On his journey home from boarding school, the young William Wordsworth pinches a rowboat. It is tied to a willow tree on the shore of Ullswater in the Lake District. The summer holidays have arrived, night has draped itself upon the landscape, and the stars reflect in the water. The future poet finds himself powerfully tempted; he slips the boat’s halter from the willow and eases out into the lake for a joyride.
He muffles the creak of the oarlocks until far from shore, then strokes hard enough that his tiny craft goes “heaving through the water like a swan.” A craggy hill blocks out everything beyond, but as Wordsworth rows past this pedestrian piece of land, a cliff face towers into view behind. Suddenly, savagely, it blots out the moon. Wordsworth, alone in the dark in a stolen boat, feels a thrill of existential dread—until his phone vibrates against the wood floor of the boat.
Bzzt bzzt.
It’s a text message from Wordsworth’s classmate, Ted Sanderson, who like Wordsworth is trekking home from Hawkshead School for the summer. Wordsworth, on the brink of a mystical experience, ignores the message.
He recenters himself on the boat bench. He releases the oars. He looks up at the sheer cliff face that has been revealed, as though by Providence, to measure out the small size of a human life. He feels the uncaring geologic power brooding in the stone. He is Having A Moment.
Bzzt bzzt.
Messages arrive in bunches, lighting up the phone screen. Each new one draws his eye, no matter how he fights the impulse. Wordsworth looks back at the cliff, then sighs and unlocks the phone.
“Haha! Cropper fell off his pine,” says the first text. “Into a puzzle!”
Who Cropper is, why he was on a pine, and how he fell into a puzzle are not explained.
“Damn autocorrect,” says the second text. “Off his pony. Into a puddle. A PUDDLE!!!!”
Ted has attached a selfie of his pink-cheeked face beside that of the unfortunate Cropper, who appears to be the servant carting him home. Cropper is, as the messages have promised, soaking wet in the mud.
Wordsworth smiles despite himself. He feels the night chill seep through his shirt as he glances back at the cliff. It looks less sublime now, no longer a rocky demon one might storehouse in the mind for years.
Wordsworth paddles frantically back to shore and runs homeward through the meadows without any “grave and serious thoughts.” He suffers nothing from “huge and mighty forms that do not live” but which “like living men moved slowly through my mind / By day, and were the trouble of my dreams.” Wordsworth’s solitary experience upon Ullswater is never immortalized in poetry as
a huge cliff,
As if with voluntary power instinct,
Upreared its head. I struck and struck again,
And, growing still in stature, the huge cliff
Rose up between me and the stars, and still
With measured motion, like a living thing
Strode after me.
Some of our most precious and most delicate mental states can only exist without interruption. Sublimity, religious ecstasy, varieties of cosmic dread, wonder, rapture, even our sexuality: These are altered by the presence of the buzzing smartphone. Can we listen to Gabriel Fauré’s Requiem in D Minor in the same way while receiving a series of texts? Can we see our child’s fourth-grade chorus concert in the same way when watching it through a screen as we record? Can we hike up Mount Katahdin in the same way while constantly capturing the moment in pictures?
We live different lives when our experience of the world is mediated or interrupted. Unfortunately, for all the good they do, our digital technologies are expertly crafted attention grabbers.
Nietzsche was exceptionally sensitive to the need for uninterrupted immersion in our physical existence, and he found suspect anything that removed us too completely from this immersion. Even reason, he came to argue, was tied to our physicality. We are not, and should not try to be, “brains in a vat.” We need to own our embodiment.
“We do not belong to those who have ideas only among books, when stimulated by books,” Nietzsche writes in The Gay Science.
It is our habit to think outdoors—walking, leaping, climbing, dancing, preferably on lonely mountains or near the sea where even the trails become thoughtful. Our first questions about the value of a book, of a human being, or a musical composition are: can they walk? Even more, can they dance? We read rarely, but not worse on that account. How quickly we guess how someone has come by his ideas; whether it was while sitting in front of his inkwell, with a pinched belly, his head bowed low over the paper—in which case we are quickly finished with his book.
Whenever his health allowed, Nietzsche took long summer walks in the mountains around Sils Maria, Switzerland, or long winter walks along the sea in Genoa, Italy. He thought as he walked; he took notes that became his books; he conceived his most uniquely Nietzschean doctrines at specific physical locations. You can still visit the rock on the north side of Lake Silvaplana where Nietzsche discovered the idea of “eternal recurrence.”
“Sit as little as possible,” he wrote weeks before his collapse into insanity. “Give no credence to any thought that was not born outdoors while one moved about freely—in which the muscles are not celebrating a feast, too. . . . The sedentary life—as I have said once before—is the real sin against the holy spirit.”
Simply moving around—challenging enough in these screen-first times—is not enough, however. Nietzsche worries that humans have become such creatures of reason and self-reflection that they are incapable of committing their full attention to the present moment—of inhabiting experience rather than reflecting upon it.
Switching off this reason-heavy self-consciousness is a key concern of his first book, The Birth of Tragedy. It nominally concerns the development of ancient Greek tragedy, but it’s about as far from an “academic” book as San Francisco is from Boston. It opens with a description of two forces that Nietzsche names after the Greek gods Apollo and Dionysus. Apollo, the deity of sunshine and sculpture, is associated with order, control, and reason, “that restraining boundary, that freedom from wilder impulses, that sagacious calm of the sculptor god.” Modern humans have developed too much of the Apollonian temperament, however; they risk cutting themselves off from one another and from nature by a heightened and controlled self-consciousness that can never release itself from its own tension.
Dionysus, in contrast, is associated with wine, emotion, chaos—and a loss of the self-consciousness that divides us from nature and from each other. “Not only is the bond between man and man sealed by the Dionysiac magic,” Nietzsche writes. “Alienated, hostile, or subjugated nature, too, celebrates her reconciliation with her lost son, man.”
One doesn’t want a purely Dionysian experience of life, which would amount to the dissolution of the self, a lack of control over one’s mind and body, and the loss of reason’s gifts. But Nietzsche argues that an overly Apollinated humanity needs more of the Dionysian. “These poor creatures have no idea how blighted and ghostly this ‘sanity’ of theirs sounds when the glowing life of Dionysiac revelers thunders past them,” he writes.
Nietzsche himself was no Dionysian reveler. His poor health meant no alcohol; his isolation meant few parties. Certainly there were no drunken torchlight processions to sacred groves. So far as we can tell, he had sex only a handful of times. But he found in music a release so powerful that he repeatedly championed the power of art to save us from ourselves. While listening to the best of Wagner, he could set down the burden of being himself; he could embrace the ecstatic unity of creation championed by the old religions of Greece and Rome.
This explains Nietzsche’s crank-ish hatred of Socrates. In the world of philosophy, this is a bit like harboring a secret spite toward cupcakes. And yet Nietzsche loathed Socrates. It was the ancient Athenian philosopher who had, Nietzsche said, pushed Apollonian reason to the forefront of philosophy, where it has stayed ever since. Socrates was a “typical decadent” who set “ ‘rationality’ against instinct,” Nietzsche writes. “ ‘Rationality’ at any price as a dangerous force that undermines life.”
Which is to say, simply: Logic cannot give us Wordsworth’s sublime terror in a rowboat. Reason itself becomes self-interruption of and internal commentary upon our own sense experience. Reality becomes too thought about to be felt. There is wisdom in the body that the mind knows nothing of.
Nietzsche calls us to combine the Apollonian and Dionysian tendencies, to restore a balance between controlled rationality and a more ecstatic physical immersion. Does this ring true to us today as we tap our hammers against the idols of our culture, sounding out the false notes?
Many of our most common technologies, designed as they are to capture attention, make sense immersion a challenge. We are complicit in our own captivity. Texts, emails, and social media posts constantly interrupt us because we have allowed them to—and we cannot resist attending when they do so.
Our most popular information technologies exert a subtle, persistent pressure on our psyches to look away from the world before us. Our minds may be pushed forward, anticipating that next text, email, or tweet. They may be pulled elsewhere, staring at Instagram posts from our friends’ apparently more beguiling lives. Or they may simply be distracted by games, sounds, and videos. What information technology does not usually encourage is a relaxed openness to the physically local, the here, the now.
Our screens are now so compelling that we are looking at them while looking at other screens. Nielsen data from 2019 revealed that 88 percent of people surveyed said they used a “second digital device” while watching television. Our screens present us worlds more stimulating and more novel than our own; they have become our tutors in elsewhere-ness.
The “elsewheres” on offer through the Internet can be tremendous gifts. Yet our local reality, even in its tedium and awkwardness, remains the place that matters most for human connection. As MIT researcher Sherry Turkle found in her research into technology use and community, “The ties we form through the Internet are not, in the end, the ties that bind. But they are the ties that preoccupy.” It is worth considering whether our devices do too much to pull us away from the physical spaces we inhabit.
Nietzsche hated all “elsewheres.” He referred to them as “other worlds,” whether offered by Plato (the Forms), Jesus (heaven), or Kant (the thing-in-itself). Anything that offers itself as something more real than the world of appearances we perceive through our senses distracts us from the here, the now, the embodied, the truly real. Elsewheres invariably disguise from us the fact that we are creatures of the earth. Much of Nietzsche’s philosophy was an attempt to find meaning within this given world and its many limitations.
Why Think with Friedrich
“When will Nietzsche explain something useful, like how to do a digital detox over a long holiday weekend and what craft beers I should drink while doing it?” you might be wondering at this point. Or perhaps you have more pointed concerns: “Wasn’t Nietzsche some sort of proto-Nazi?” Call these the practical and the personal critiques. Both are worth engaging.
Pondering the value of an “easy life,” considering ways to restrict our information intake, and finding strategies to increase immersion in local experiences do not sound as immediately useful as “Top Ten Tips for Taming Toxic Tech.” Why are we talking about Dionysus and not a “no screens after 9:00 p.m.” rule? What about a tasteful wicker basket placed beside the front door to collect devices as family members enter?
I have nothing against tasteful wicker baskets, but “practical” advice of this kind is wildly overrated. On a merely empirical level, hand-crafted receptacles have done little to halt our collective smartphone addiction. People pay ludicrous sums of money to go on “digital detox” vacations but immediately check their texts when the weekend’s over. We must think more deeply about what we need from technology, information, and life itself. Only then can we formulate individual approaches that will last.
Case in point: the common claim that we need to “moderate” our screen time. We spend far too much time staring at our screens, the argument goes. It’s unhealthy! Let’s cut back to a reasonable amount. Well—sure. But if you are wasting 20 hours a week on useless screen time, cutting that number back to 10 hours a week means that you are still wasting 10 hours a week. This is certainly better, but it’s hardly a goal in itself. What matters is serving life.
But few of us know ourselves as well as we imagine—in part because we don’t have much time to really think. So we might reclaim that hour before dinner spent reading Reddit posts and instead flip through back issues of the Economist that are busy classing up the coffee table. (Do not underestimate just how virtuous this can feel!) But if we only use our newfound knowledge of Azerbaijani politics and the Scottish National Party to sound “well read,” have we improved either our lives or the world? Or are we simply wasting time in a different way? Perhaps we can relate to Nietzsche’s complaint in Human, All Too Human: “By an excess of effort [workers] win leisure for themselves, and then they can do nothing with it but count the hours until the tale is ended.”
We simply do not know in advance that “moderating” screen time is a good in itself. If you’re a novelist, staring at a monitor for 7 hours a day may be exactly what you need to achieve difficult, creative life goals. You might need more screen time in your life!
Or consider the claim that what we need, in our distracted, information-addled mental states, is “mindfulness.” Nietzsche would love this, right? We are immersing ourselves in the present! But “fully experiencing the present moment” is, on its own, a good but limited value. In yoga classes, for instance, mindful breathing is often a calming and centering practice—which then rejuvenates us to resume our frenetic lives.
Mindfulness by itself does not answer the key questions: What moments should we most seek to be present in? And within those moments, what aspects of reality should we be attending to? And when we attend to them, are there some we must seek to change? These are, in a way, deeply moral questions, and many forms of attention will (or should) prompt us to action.
Alan Jacobs, a humanities professor at Baylor University, penned a provocative 2016 manifesto called “Attending to Technology: Theses for Disputation.” Many of his theses concern the use and misuse of our attention. Jacobs lambasts “mindfulness” as a cure-all that too often devolves into “the cultivation of a mental stance without objects to attend to.” In contrast, he argues, the “only mindfulness worth cultivating will be teleological through and through: it will be mindfulness for something—for personal formation, for service, for love.”
This is thoroughly Nietzschean. Immersion in life—even in its earthiest physicality—matters, but never for its own sake. Our full engagement with life must be goal-oriented or else we float through time like tubers down a lazy river, coming to the end of our journey with nothing heroic or even all that interesting to show for it. As Nietzsche writes in Thus Spoke Zarathustra:
I knew noble people who lost their highest hope. And then they slandered all high hopes. Then they lived churlishly in brief pleasures, scarcely casting their goals beyond the day . . . once they thought of becoming heroes: now they are libertines. To them the hero is grief and ghastliness. But by my love and hope I beseech you: do not throw away the hero in your soul!
Without a “highest hope” in the future, without a goal cast “beyond the day,” mindfulness can devolve into an apolitical hedonism. Not even a Dionysian such as Nietzsche thinks that a good thing.
We need to go deeper than the facile wisdom our culture provides for dealing with our technological malaise. Tech tips and detox weekends aren’t going to cut it. And we could do worse than spend a few hours in Nietzsche’s company, thinking about the three areas we have discussed:
• an “easy life” that too often leaches our will to difficult goals;
• infinite information that overwhelms, distracts, and numbs;
• a constant stream of “elsewheres” that keeps us from full immersion in our own physical lives.
If those three issues still seem impractical, try thinking of them this way: Are they not the very promises made by Netflix, Google, and Facebook?
The dubious virtues of the “last man” resemble the virtues of technology itself, which is never neutral. Unless we provide countervailing goals of our own, technology will push us in the directions of its creators. How confident are we that today’s high-tech overlords have the moral wisdom to shepherd our precious energy and attention toward the creation of our best lives?
To put it more crudely: Is Mark “Facebook” Zuckerberg really the guy you’d pick for a shaping role in your life story?
My own confidence in the technorati has been dropping. After years of writing about how the unsecured Internet has become a cesspool of fraud, spam, child pornography, government surveillance, invasive advertising, and racist chain emails from your uncle Fred, I am less convinced that the designers of our world-altering devices understand the world or the ways in which it should be altered. (I saw the brightest minds of our generation go to Facebook and Google, where they sold advertising.)
Lingering unease with my own digital utopia meant that I could still hear Nietzsche’s warning as a warning. I became convinced that he is not necessarily someone to agree with—but that he might well be worth engaging.
Which brings us to more personal questions about who Nietzsche was and how far we want to walk his road. More than most philosophers, Nietzsche can be uncomfortable, disagreeable, “unsafe” to read. He constantly praises “warfare,” valorizes “strength,” and talks up the superhuman Übermensch; all these themes would be linked with later Nazi ideology. His comments about women, especially in the latter half of his work, are of almost no value whatsoever. And, in his late works, he occasionally says things like:
The weak and ill-constituted shall perish: first principle of our philanthropy. And one shall help them to do so . . .
The sick man is a parasite of society. In a certain state it is indecent to live longer. To go on vegetating in cowardly dependence on physicians and machinations, after the meaning of life, the right to life, has been lost, that ought to prompt a profound contempt in society. . . . Ascending life demands the most inconsiderate pushing down and aside of degenerating life—for example, for the right of procreation, for the right to be born, for the right to live.
Ugh. Take Nietzsche as your guru and you will run into all sorts of problems. As one of my philosophy professors told me, “If you’re not offended by Nietzsche, you aren’t paying attention.”
Yet it may help to keep a few points in mind:
• Nietzsche’s insanity was a process, not an event. Whatever caused it—doctors speculate endlessly in medical journals, even today—the disease seems to have progressively lowered Nietzsche’s mental modesty. In the last years of his sane life, his letters betray an increasing megalomania about his “destiny.” Nietzsche’s last few letters after his break are signed “Dionysus” or “The Crucified,” and on his train journey back to an asylum, a friend records that he believed himself to be a king. Much in even his latest works remains valuable, but the process of disease is worth keeping in mind when we recognize, for instance, that the two objectionable extracts above both come from the last months of his sane life.
• Nietzsche often speaks metaphorically. As we shall later see, most of his comments on war and cruelty and strength refer to intellectual and cultural battles—not to literal warfare. Context is crucial. Nietzsche was a relentless critic of German militarism after his own experiences in the cavalry and as a medic.
• Nietzsche “projects” wildly and obviously about many things. His comments about invalids, for instance, were written weeks before his own nervous collapse, which he had felt building for many years. He had spent much of his own life as an invalid who wore an eyeshade in public and whose digestion could handle only weak tea. This does not make his comments less disagreeable, although it might suggest that they are aspirational statements of what Nietzsche wanted for himself rather than a program of cruelty toward others.
• Nietzsche was no anti-Semite. He hated the German anti-Semites, including both the later Wagner and his sister Elisabeth’s husband, Bernhard Förster, a proto-fascist who killed himself upon the financial collapse of his racist colony in Paraguay. Weeks before his own collapse, Nietzsche wrote, “We can assent to no state of affairs which allows the canting bigot to be at the top.” Even after his insanity, he kept up the theme; one of the few letters he sent before being bundled off to an asylum said, “I have just had all anti-Semites shot!”
• Nietzsche was no German nationalist. Christianity, Socrates, Kant, Judaism, fellow scholars, and the late Wagner all come in for plenty of abuse across Nietzsche’s corpus. But the greatest denunciations are reserved for the contemporary German “culture” that Nietzsche loathed, one that celebrated the emerging German Reich and became increasingly nationalistic and warmongering. Nietzsche instead styled himself a “good European.”
• Nietzsche was a lonely and sick man who composed some of the most beautiful German prose and the most provocative German philosophy while struggling even to read and write. His eye pain, migraines, and bouts of vomiting were nearly incapacitating; he took (self-administered and sometimes badly measured) sleeping draughts of many kinds—which had their own horrific side effects. Nietzsche is perhaps at his most moving when occasionally opening up about his weakness rather than projecting a philosophy of “strength” and “health.” While he would have hated our pity, he might have accepted our respect for the way he made his often wretched life into something productive and almost inspiring.
• Nietzsche was a bombastic exaggerator in print. In person, all accounts suggest he was a sober, courteous, soft-spoken man who dressed conventionally and took care over his personal grooming. But on the page, Nietzsche unleashed words like volcanic explosions—powerful but occasionally imprecise.
• Some of Nietzsche’s ideas and arguments were simply bad, cruel, or juvenile. But consider how unimpeachable your own philosophical musings might have been at 27. Most of Nietzsche’s work was written in his twenties and thirties; he went insane at 44.
In other words—Nietzsche was a flawed human being and a creature of his time. Modern critical introductions make this point clear. “There is plenty in each of these books to worry about and argue with, and much that is all-too-human in him no less than in his targets,” writes Richard Schacht in his introduction to Human, All Too Human.
“As always in reading Nietzsche, one needs to distinguish between the excited exaggeration even of so comparatively calm a work as Daybreak,” writes Michael Tanner in his introduction to that book, “and the underlying genius of insight and prophecy which the exaggeration often conceals.”
Oxford professor Bernard Williams uses his introduction to The Gay Science to lament Nietzsche’s “recurrent weaknesses”:
There are cranky reflections on diet and climate. His opinions about women and sex, even if they include one or two shrewd and compassionate insights into the conventions of his time, are often shallow and sometimes embarrassing; they were, biographically, the product of an experience which had been drastically limited and disappointing. However, what is most significant for his thought as a whole is the fact that his resources for thinking about modern society and politics, in particular about the modern state, were very thin . . .
It was this last trait that made Nietzsche’s work so amenable to groups such as the Nazis, because many groups could read themselves into Nietzsche’s not-very-rigorous political views. This damaged his international reputation until some years after World War II had ended. Nietzsche was only rehabilitated in English by the eminent scholar and translator Walter Kaufmann, who nevertheless deplored Nietzsche’s weaknesses: “bathos, sentences that invite quotation out of context in support of hideous causes, silly arguments.”
But if you don’t take Nietzsche as your guide and guru, if you instead embrace him as a dialogue partner and provocateur, these limitations need not be a roadblock to thinking with him. Nietzsche would have valued the attempt to wrestle with his ideas—even to reject some of them.
Nietzsche claimed to want no disciples, and he never provided a logical “system” of thought. He believed that forcing ideas into rigorous arrangements would necessarily lead the “systematizer” to falsify both thought and experience. “I mistrust all systematizers and avoid them,” he writes in Twilight of the Idols. “The will to a system is a lack of integrity.”
Nietzsche did, however, express a powerful vision of the good life. It is not a system and it is not a program. It has no “rules.” It is a constellation of goals and values. Instead of tech-driven ease, Nietzsche says, we should embrace creative exertion. Instead of drowning beneath an infinite flood of content, we should seek deeper wisdom through careful curation and information restriction. Instead of a mental, screen-mediated existence, we should find more embodied immersion in the world.
This is not a Luddite vision of technology; it is about consolidating our gains while reckoning honestly with what we have lost. As Nietzsche puts it in a beautiful passage:
We are faltering, but we must not let it make us afraid and perhaps surrender the new things we have gained. Moreover, we cannot return to the old, we have burned our boats; all that remains is for us to be brave, let happen what may.—Let us only go forward, let us only make a move!
Nietzsche will not leave you alone. To take him seriously is to face the possibility that you must remake yourself. You may need to say No to things that feel comfortable. You may need to say Yes to things that sound scary.
“There is no work of Nietzsche’s that does not say to us,” wrote Walter Kaufmann, “ ‘You must change your life.’ ” But that is precisely why Nietzsche is such a potent person to think with.
Can a nineteenth-century German bachelor with a walrus mustache, a penchant for aphorism, and an aversion to alcohol really help us chart a course through our own tech-driven malaise? Reader, come and see.