Wisdom Won by Walking

“Sit as Little as Possible”

“WHO WANTS TO LIVE FOREVER?” SANG FREDDIE MERCURY in the Queen song of the same name, but the answer is obvious: everyone. And when you’re facing the prospect of death, even bad options start to look pretty good.

So when the doctor at the Singularity Clinic shows you the price list for off-loading your consciousness to a machine, and you realize that your life savings fall $150,000 short of the Cate Blanchett–inspired android of your dreams, and that the only receptacle you can afford is an immobile beige workstation with a single 80-watt power supply—well, you lower your standards.

“It won’t be that bad,” you tell yourself as you sign the papers. “I mean, I won’t be attending a lot of cocktail parties once I’m trapped in a metal box that looks like a desktop computer circa 1998—but I can still think! And if I think . . . then I am. And that’s a whole lot better than death.”

Panic hits as you realize that the drive over to the clinic was the last time you will ever smell fresh air. This is unfortunate because, thanks to the canning factory by the river, the fresh air smelled strongly of pureed pumpkin.

“Relax,” you tell yourself as they wheel you into the pre-op room and begin shaving your head. “Rational thought is the great achievement of the conscious mind, and I can enjoy it for centuries—or until my new hardware mines enough bitcoins to upgrade into that Cate Blanchett android.”

Electrode gel appears in industrial quantities. Your head is lubricated. Wires are attached at so many points that you look like some high-tech Medusa. You realize, as you catch your reflection in a silver supply cabinet, that the last “book” you read was a Peanuts comic collection.

“People seem to like thinking,” you observe. “It’s very pleasant. Ask any philosopher! Wait—perhaps I’ll become a philosopher. Peanuts need not define me! It’s not like I won’t have time for deep questions. Maybe I’ll write a book. I’ll write fifty books! I won’t stop writing until I win the Nobel Prize. Do they give Nobel Prizes to philosophers?”

“No,” says one of the interface techs who overhears your mumbling. “They don’t.”

He leans over you, the fluorescent light casting stark shadows around his face as he says, in a classic bit of understatement, “Now this is going to feel a bit weird . . .”

And he flips a switch.

THIS IS NOT, it should be said, how Silicon Valley proponents of “the Singularity” think it’s going to happen. That’s fine. They’re probably right, too—because the Singularity is never going to happen.*

But the thought experiment of off-loading our consciousness into a computer lets us ask a fascinating question that has preoccupied many philosophers, including Nietzsche: Would we think the same way if we were just brains in a vat—or machines in a data center?

Nietzsche was absolutely convinced that the body matters—and not solely as a transport vehicle for the reasoning mind. Recall these lines from Zarathustra:

Behind your thoughts and feelings . . . stands a powerful commander, an unknown wise man—he is called self. He lives in your body, he is your body. There is more reason in your body than in your best wisdom.

The rational mind, though clearly important to a scholar such as Nietzsche, is ultimately not the essence of “who we are.” It is a faculty, arriving rather late in the evolutionary process, that sits atop the body and the senses and believes itself to be in charge—though in reality, it is baffled and buffeted by desires that are based in a biology it does not fully control.

Nietzsche complains about Socrates, whom he treats as the arch-rationalist. In Plato’s Phaedrus, Socrates uses the metaphor of a chariot to describe his view of the human person. Reason drives the chariot; passions are the unruly horses that threaten to wreck the ride unless controlled by the driver and his whip. Reason here is—or should be—firmly in control.

Nietzsche doubts this whole way of thinking. What matters more to his project is understanding human passions and drives, which he calls “instinct.” Socrates’s problem was that he posed “ ‘rationality’ against instinct,” says Nietzsche. “ ‘Rationality’ at any price is a dangerous force that undermines life.”

In Nietzsche’s view, rationality is not “ruling” the passions, nor should it be at war with them. Such an opposition is a kind of illness. Nietzsche sought harmony between reason and passion—an arrangement that would keep Enlightenment rationality within its proper bounds. Instinct emerged through our long evolutionary history, and it shows us how to live. We must not simply bypass or quash it. As he says in Twilight of the Idols:

Socrates was a misunderstanding. . . . The most blinding daylight; rationality at any price; life, bright, cold, cautious, conscious, without instinct, in opposition to the instincts—all this too was a mere disease, another disease, and by no means a return to “virtue,” to “health,” to happiness. To have to fight the instincts—that is the formula of decadence: as long as life is ascending, happiness equals instinct.

Here Nietzsche echoes older thinkers such as the Scottish writer David Hume. In A Treatise of Human Nature (1739–40), Hume famously attacks the old Socratic idea that reason controls the passions. Indeed, for him it is the other way round.

“We speak not strictly and philosophically when we talk of the combat of passion and of reason,” Hume writes. “Reason is, and ought only to be, the slave of the passions, and can never pretend to any other office than to serve and obey them.”

Reason, for Hume, is a schemer; its job is to get (and to justify) what the passions want. That is why he says, “It is not contrary to reason to prefer the destruction of the whole world to the scratching of my finger. It is not contrary to reason for me to chuse my total ruin, to prevent the least uneasiness of an Indian or person wholly unknown to me.” In other words, reason will try to take you anywhere your passions want to go—even if that involves a trip to crazy destinations.

Nietzsche and Hume represent a tradition that restores instinct, passion, and emotion to a central role in human thought and life. Emotions are not mere “feelings” tacked on to thoughts; we experience the world with and through them. We even think with them.

This idea has been validated by modern psychological research. In his 2012 bestseller, The Righteous Mind, social psychologist Jonathan Haidt collects decades of studies to argue that “emotions occur in steps, the first of which is to appraise something that just happened based on whether it advanced or hindered your goals. These appraisals are a kind of information processing; they are cognitions.” That is why “contrasting emotion with cognition is therefore as pointless as contrasting rain with weather, or cars with vehicles.”

Haidt sides with Hume over Socrates and Plato. But not completely. “I have argued that the Humean model (reason is a servant) fits the facts better than the Platonic model (reason could and should rule),” Haidt writes. “But when Hume said that reason is the ‘slave’ of the passions, I think he went too far.”

Even if nonrational drives power most of our actions, reason can, over time, feed back into and even alter our emotions and instincts. But those instincts are powerful. Changes in their direction only happen as part of a long-term process, which explains why out-arguing someone rarely convinces them on the spot. Reason may be a self-justifying faculty, but it is not only that. (Haidt compares reason to a rider steering an elephant; the elephant is too powerful to be jerked this way and that by the rider, but through continual pressure on the reins, the rider can gradually change the elephant’s direction.)

Nietzsche seems most sympathetic to a Haidt-like view. He clearly disagrees that reason rules the roost, but neither is reason a total slave to instinct. The two work in harmony—but the body and its instincts are the stronger faculty. They are the “great reason,” while the mind and its rationality are only the “little reason.” (“Thoughts are the shadows of our feelings,” Nietzsche says, “always darker, emptier, and simpler than these.”)

This view of what it means to be human helps explain Nietzsche’s emphasis on physical engagement with the world. To sit in a chair, thinking “rational” thoughts as Descartes did, misses so much of what it is to be human—but it also misses so much of what it means to think. The idea that full-orbed human rationality might easily run on computing hardware would be anathema to Nietzsche.

Matthew Crawford touches on these ideas in The World Beyond Your Head. “We think through the body,” he writes after recapping recent work on embodied cognition, a discipline that

puts the mind back in the world, where it belongs, after several centuries of being locked within our heads. The boundary of our cognitive processes cannot be cleanly drawn at the outer surface of our skulls. . . . They are, in a sense, distributed in the world that we act in.

Our biology matters. We are not just Cartesian thinkers, blocking out the winter wind as we scrape our mind back to a tabula rasa. We are not brains in vats. We are not software running in a data center. The real world cannot—and should not—be evaded, minimized, or ignored. Our very thoughts depend on our bodies and the world those bodies encounter.

Nietzsche makes this realization a key part of his philosophy. He could hardly put the point more strongly than he does in this statement, which we have already glanced at, from Ecce Homo:

Sit as little as possible; give no credence to any thought that was not born outdoors while one moved about freely—in which the muscles are not celebrating a feast, too: all prejudices come from the intestines. The sedentary life—as I have said once before—is the real sin against the holy spirit.

Or, as Nietzsche puts it more pithily: “Only ideas won by walking have any value.” The body matters too much to spend our lives staring at screens, which may be why joy is not an emotion usually associated with digital technology. But again and again, the language of joy permeates descriptions of movement, even the basic act of walking.

From Nietzsche to Charles Dickens to C.S. Lewis, thinkers and writers have exulted in their strolls around Sils Maria, their rambles along the lamplit streets of nighttime London, or their tramps through the Oxfordshire countryside. But it was Thoreau who best summed up the essential need to move. “I think that I cannot preserve my health and spirits,” he wrote, “unless I spend four hours a day at least—and it is commonly more than that—sauntering through the woods and over the hills and fields, absolutely free from all worldly engagements.”

Digital technologies, whether by design or by accident, now take so much of our time yet require so little of our bodies. Technologies have their uses, but the dangers too are clear, and Nietzsche raises questions about how much time we should spend away from the “real” world.

Real Love

Of course, to put it this way is provocative. It assumes that our devices and the information they provide are somehow less real than books, grapevines, and smelting plants. It assumes that using a smartphone abandons the world. It assumes that digital constructs are ones and zeros, that they exist in a separate realm called “cyberspace,” that they constitute an “information superhighway”—but fundamentally they are not the “real world.”

This assumption underpins much of the talk about bodies and technology; the body and the world are real, while the smartphone and the Internet are unreal. (Or, perhaps, less real.) What we need is more reality in our lives. As Sherry Turkle puts it on the opening page of Alone Together, “Technology proposes itself as the architect of our intimacies. These days, it suggests substitutions that put the real on the run.” Here digital technology functions as a replacement for the real, not as something real in its own right; it is simulation and simulacra.

In this view, you have a “real” connection with the person sitting across the dinner table from you but a less-than-real connection with the person typing words to you in a chat window. It is the attitude behind the online acronym “IRL”—In Real Life—which refers to things done away from the screen and keyboard. One is real, the other is a poorer substitute.

Not everyone agrees with this assessment, and we should think a little about what the actual harms of digital technology might be. Not to get too Matrix-y about it, but what is real, exactly?

Are the computer and the smartphone real? They are made of quarks and atoms like anything else. Do Internet communications live in some separate “cyber” world? They are simply electrical impulses and magnetic fields. Is someone who taps away at a texting app all evening disconnected from real people? After all, they are connected to flesh-and-blood humans in other locations.

The argument is easy to grasp: Everything is part of our one world, so it’s no help to speak about the physical world and the online world as though the two are separate. On a planet where online activities, from dating to terrorist plotting, have “real-world” consequences, the argument is also hard to dispute.

For instance, though a writer such as Andrew Sullivan can describe his “distraction sickness” by saying, “Every hour I spent online was not spent in the physical world,” this is clearly not true. Every hour that Sullivan has spent anywhere was spent in the physical world. Where else might it have been spent?

Once you start looking, it becomes hard to pinpoint where the “unrealness” comes in. This has led to pushback among those who dislike talk about the supposed virtues of the “real” world and the terrifying danger of the “unreal” Internet.

Nathan Jurgenson, a sociologist who went on to work at Snapchat, sees all this worry about “the real” as a pointless moral panic. People who talk about putting down their smartphones and going out to split logs instead are, he thinks, just virtue signaling. In his 2012 essay “The IRL Fetish,” Jurgenson powders his musket and lets fly at people like Turkle and Sullivan:

What a ridiculous state of affairs this is. To obsess over the offline and deny all the ways we routinely remain disconnected is to fetishize this disconnection. Author after author pretends to be a lone voice, taking a courageous stand in support of the offline in precisely the moment it has proliferated and become over-valorized. For many, maintaining the fiction of the collective loss of the offline for everyone else is merely an attempt to construct their own personal time-outs as more special, as allowing them to rise above those social forces of distraction that have ensnared the masses. “I am real. I am the thoughtful human. You are the automaton.”

In his view, the online and the offline are both real. We should see them as “enmeshed” instead of as wholly separate spheres. Consider: Though we put our device in a pocket, we might still be pondering our most recent text and crafting a reply. Though hanging from a rope halfway up an indoor climbing wall, we might be focused less on our feasting muscles than on how pictures from the climb will look on Facebook. Life moves fluidly between online and offline. “The logic of social media follows us long after we log out,” Jurgenson writes. “There was and is no offline; it is a lusted-after fetish object that some claim special ability to attain, and it has always been a phantom.”

Well, sure. But this seems more like an indictment of a performative culture preoccupied with recording and broadcasting life. Jurgenson’s essay doesn’t consider that people might not like the constant feeling of performing their own lives for the benefit of social media viewers. It certainly is useless to fault the “online” world if you spend all your “offline” time thinking about it—but that might argue instead for cultivating a more Dionysian embrace of the present moment.

Jurgenson’s argument goes off the rails, though, when he claims that people who want to embrace the “real” physical world beyond their screens have actually never had it so good.

We have never appreciated a solitary stroll, a camping trip, a face-to-face chat with friends, or even our boredom better than we do now. Nothing has contributed more to our collective appreciation for being logged off and technologically disconnected than the very technologies of connection. The ease of digital distraction has made us appreciate solitude with a new intensity. We savor being face-to-face with a small group of friends or family in one place and one time far more thanks to the digital sociality that so fluidly rearranges the rules of time and space. . . . Never has being disconnected—even if for just a moment—felt so profound.

As humans, we are terrifyingly good at tuning out repeated stimuli, so contrasting feelings do heighten sensation. But this does not seem like an argument for embracing every contrast in order to feel something more powerfully. We feel our surging good health with profound gratitude after recovering from strep throat—but who would then invite illness?

Those who argue against “digital dualism” are clearly correct in one sense; there is no ontologically separate “cyberspace.” Everything is real. What happens “there” affects us “here,” and vice versa. But this argument misses the point. It doesn’t address the sense so many of us have that online interactions feel different; if you were one of those who spent the recent pandemic fighting off “Zoom brain” after one too many videoconferences, you might know what I mean.

Over hundreds of thousands of years, humans have developed a delicate social sensitivity to everything from body position to scent to flared nostrils to physical proximity. Stand 6 inches closer to someone and you send a suggestion—perhaps of aggression or romantic interest. Perhaps you are a “close talker” in the Seinfeld sense. Perhaps the other person likes this; perhaps the other person does not. This is all much harder to judge—and some of these signals disappear entirely—when talking to someone with their webcam angled up at the inside of their nose.

Digital tools do not mesh fully with our evolved instincts and social behaviors. Just as the telephone stripped away everything but the voice, online interactions have truncated other aspects of human connection. Our machine-mediated connections are powerful tools, and well worth keeping, but they also lack the full expressiveness of interpersonal interaction. This has its benefits, as everyone who has taken their cell phone into the bathroom during a conference call can attest, but it won’t satisfy the fullness of our need for connection.

That connection takes place in a world that has evolved under many pressures and thus to serve many purposes, but digital tools are tailored to appeal only to us. This is true of human tools in general, of course, but most of those tools require us to use them in (and against) the physical world. We shovel out a post hole or drive cars down country lanes, but digital technology reduces this “in-the-worldness” to the bare minimum: me, alone, and largely immobile.

Removing all that external friction means that digital tools can be precisely designed to stimulate our brains’ pleasure centers. The “real world” offers no resistance here, as it does to the shovel or the car tire, until perhaps we start craving a quesadilla. As Cal Newport, the computer scientist, puts it, “Computer interfaces, and the increasingly intelligent software running behind the scenes, are designed to eliminate both the rough edges and the possibilities inherent in directly confronting your physical surroundings.”

But the friction of the world, its independence from and resistance to our human plans, is not solely a source of frustration. It is also a challenge and a teacher. We test ourselves against the world beyond us, finding out through the snapping of a maple branch and the sharp sensation of being winded exactly how gravity works, how much load a tree can bear, the way certain objects flex before breaking, and the height of drops we can survive. Fixing a tire swing to that same tree calls forth that knowledge and presents it to our creative mind as a problem worth solving. Solving it correctly brings the unique satisfaction of a job done right, a pride in craft, and the scent of rubber in the summer sun. The knowledge we gain here is quite different from the same knowledge expressed in physics formulas, but it does bring with it the danger of standing atop a ladder, the possibility of bees, and a second aggravating trip to the hardware store because the carriage bolt you bought the first time was the wrong size.

Highly engineered digital spaces, though they may offer education and encourage creativity, don’t do so in the same physical way. What we get is constant mental stimulation—from texting, from following one more link, from playing one more round of Candy Crush—that is so much more immediate in its gratifications than sunshine and an old tire. Marry this engineering to social media, with its metrics about likes and shares, and our identity can seem at stake online in a way it may not in the backyard.

In such crafted spaces, “the natural world begins to seem bland and tasteless, like broccoli compared with Cheetos,” writes Matthew Crawford. “Stimulation begets a need for more stimulation; without it one feels antsy, unsettled. Hungry, almost.”

Today this engineered activity comes to many of us through the single screen of the smartphone. Although the hardware, the cell towers, the backhaul fiber optics, and the people using the system are all part of the one “real world,” I don’t think it’s so difficult to see how a screen-mediated approach to contemporary life might feel somewhat . . . different.

Different here is not code for “worse”; looking at the amounts of time people spend with their devices, it’s clear that digital diversions are perceived as better than the analog, offline world. They offer a concentrated experience where we can reach anyone and learn anything and feel endlessly entertained without any of the downtime we encounter IRL.

In this sense, our engineered digital reality feels like utopia—the ideal of stimulation and novelty. Such utopias have always been a part of human dreams, whether we’re talking about our longing for heaven or Plato’s longing for the Forms. Those dreams all construct “other worlds” different from the actual one we inhabit. If used in the wrong way, these other worlds may entice us away from full participation in this one.

Nietzsche abhorred this common feature of our human predilections. Other worlds can serve many purposes, such as encouraging selfless behavior here and now, but Nietzsche distrusted them all as a dangerous form of escapism. He talks about this repeatedly across many books, but a short passage from Twilight of the Idols sums up the thought:

To invent fables about a world “other” than this one has no meaning at all, unless an instinct of slander, detraction, and suspicion against life has gained the upper hand in us: in that case we avenge ourselves against life with a phantasmagoria of “another,” a “better” life.

This may sound extreme (surprise!), but I’m not sure Nietzsche is wrong. Perhaps, like me, you find yourself picking up your phone or cracking open your laptop as a sort of restless instinct. It’s not from boredom—many of us have too many things going on, too many options, to actually be bored—and it may not happen for any particular reason. The most minute of hassles, the most microscopic of delays, even a pause in conversation, and our attention shifts to the device.

In these moments are we not, in some sense, slandering life? We crave a level of novelty and stimulation that the world rarely offers, and we show through our actions that we don’t really think that much of life’s pace. We are ungrateful for what has been given. In small doses, this may not cause problems. But if we mainline digital stimulation, it’s not hard to see how this attitude becomes a subtle—but pathological—rebuke to the world. We know better.

In opposition to this attitude, Nietzsche conducted his great experiment in amor fati, the “love of fate.” He wanted to become a Yes-sayer who stopped worrying about life’s unfairness, who did not brood over past slights, who accepted what the world doled out. One does not get to pick and choose; everything, in some mysterious sense, is ultimately all right. Even suffering and death are part of the pattern.

Nietzsche did not completely succeed in this. In Ecce Homo, after a lengthy passage berating his friends who didn’t bother to study his writings, Nietzsche complains that “ten years have elapsed, and no one has yet felt it a duty to his conscience to defend my name against the absurd silence beneath which it has been entombed.”

This does not sound like a man at peace with whatever happens to him, but Nietzsche suddenly seems to remember that petty personal slights are not supposed to bother him. He adds, unconvincingly, “For my part, these things have never caused me any pain; that which is necessary does not offend me. Amor fati is the core of my nature.”

Amor fati sits uneasily beside Nietzsche’s call to creatively transcend oneself. Accepting every horror that life can dole out might sound like a recipe for quietism rather than self-overcoming—the poor and the colonized putting up with the depredations of the rich and the strong. It’s fate!

But Nietzsche is not commending “resignation” even to evil. He encourages us to change our lives and the world, while at the same time accepting our creature-ness and our limits without resentment. This can be a tricky tightrope to walk, but Nietzsche is saying something much like comedian Stephen Colbert, who has reflected on the plane crash that killed his father and two brothers back in 1974. “It’s a gift to exist, and with existence comes suffering. There’s no escape from that,” Colbert said in a 2019 interview. To be grateful for one’s life means “you have to be grateful for all of it. You can’t pick and choose what you’re grateful for.”

This is the attitude Nietzsche tried to adopt. If he sometimes sounds like a man trying to convince himself of something, we shouldn’t hold that against him. Nietzsche’s suffering and personal failures humanize this man who liked to present himself as a solitary sage hiding behind a mustache, drinking his weak tea, and plotting the self-overcoming of the species from affordable rented rooms.

Have we wandered beyond the borders of a book concerned with technology? If so, the diversion may be a useful one. We need not accept amor fati as Nietzsche tried to live it out, but we might absorb something of his attitude—a sense that, though we strive creatively against life’s limits, we are not running away.

We can say Yes to this life, to remaining in the moment with people, to letting our thoughts trail off into boredom, and we can find a certain sturdy joy in experiencing the world as it offers itself to us. We may find at the end of it all that subtler pleasures, often ones beyond our screens, are the ones that endure.

This is not a commandment but perhaps a useful principle by which we judge for ourselves whether our technology use has become . . . unreal.

The Weakness of the Will

Let’s grant that we want to engage our fully evolved human-ness in and against the given world. We want more than keyboards and glass. But arriving at this conclusion does not necessarily translate into less screen time—in part because “willing” ourselves to do something is hard. Tired, bored, hungry, sad, or just yearning to “swipe right” on Tinder? If you have trained yourself to find novelty, entertainment, or dating partners through your phone, you will have to resist the urge to start tapping away at it—regardless of any mental resolutions to the contrary.

If reason sits atop our passions, habits, and instincts like a rider atop an elephant, it cannot succeed by simply asserting dominance; the elephant is far too powerful. Perhaps the rider can jerk the reins hard enough to compel a slight change of direction, but the effort fatigues us.

But cheer up: Reason is a crafty rider, and what it can’t quickly get through sheer force of will, it might attain through guile. When it comes to altering behavior, reason works better when it can structure reality in advance instead of relying on constant “in the moment” decision-making.

When you go to Gran-gran’s on Sunday afternoons to play Scrabble and eat shortbread cookies, you could consciously decide not to check your phone every time it blorks and beeps. You might just, you know, tune it out. But, if you are anything like me, your mind can’t stop wondering about which fascinating people might be texting you. What life-changing communications lie behind that lock screen? You may think Gran-gran doesn’t notice your distraction, but she does—and she is silently judging you from behind her Scrabble tray.

You are more likely to keep your attention where you want it if you silence your phone’s notifications, turn it off during your shortbread-munching visit, or even leave it at home for a few hours. You have shaped the environment for a shot at success. Is this a simple, fleeting trick played on our habits? Not necessarily. We may find over time that such external tweaks to our situation actually change our internal desires.

Nietzsche believed strongly that change works on us from outside in, actions influencing our feelings. Many of our “chronic illnesses of the soul . . . are very rarely due to one gross offence against physical and mental reason,” he writes, “but as a general rule they arise from innumerable and petty negligences of a minor order.” Therefore, fixing these interior illnesses may be accomplished by making practical changes “even in his least important habits.” Matthew Crawford, who has written much about skilled practices and craft, notes a similar discovery: “Habit seems to work from the outside in; from behavior to personality.”

Lofty ideals about taking action in the world are fine, but the energy to take that action may be missing if we sit for 12 hours a day, don’t exercise, and skip breakfast. Can something as trivial as a bagel with a side of scrambled eggs cure our souls?

The point of a book like this one is not only to encourage certain mental conclusions about the role of technology in our lives; it is to shape our practices so that they embody these conclusions. Here are a few approaches for shaping yourself and your environment in ways that help you venture out into the world.


Paying attention is a skill. Given the ways gadgets can hobble our ability to focus, it’s useful to build consciously “attentional” time into the weekly schedule. This might involve anything from board games to books, bird-watching to worship.

Reading is perhaps an obvious place to start, because so many people complain today about the loss of the pleasure they once took in books. Additionally, it is inexpensive, often enlightening, and requires nothing more exotic than a chair, a cozy lamp, and a cup of tea. (You can read on a tablet or smartphone, but given how readily those devices serve up distractions, they are a poor choice for rebuilding one’s capacity for attention.)

Keep doing what you’re doing until you feel that tug of distraction—then push beyond it for 5 or 10 minutes more. Start small and fail often. It took me a year of sustained reading to recover my ability to read deeply.


Perhaps you, like me, are occasionally particular about small things. These preferences may drive your parents, spouse, or friends batty, but you aren’t trying to be difficult; you just know that little things matter. You also know that your preference is objectively correct. If others can’t see that, it’s their loss.

If I’m brainstorming crazy ideas for a book on Nietzsche and technology, I can’t use ruled paper. It’s not possible. All those horizontal lines lock me into complete sentences, when I still need wild and fragmentary freedom. Blank paper works in a pinch, but my loudest mental thunderclaps form on a specific brand of colored graph paper, crisply printed in Spain, with 5-millimeter grid spacing and microperforations down the side for clean removal. It offers structure without constraint. Writing on it—using a Pilot G-2 fine-point pen, naturally—is a tactile and aesthetic pleasure.

I am aware of how precious this sounds.

But the paper matters. Brainstorming on a computer has a completely different feel, and I don’t get the same results. I need the freedom to make lists here and there, to draw wild lines between them, to circle and to highlight, to doodle in the corners as I think. I need to run my hand over the page, to feel ideas flowing out through ink. I need that most sublime of sensations—the pleasure of blotting out a terrible idea with vicious pen strokes of disdain.

The digital and the analog worlds are not isometric with one another, even when it appears that the same activity is being performed. In something as simple as a word processor, which appears to mimic the blank page, we can quickly sense a difference.

Scholar and critic Alan Jacobs gets at this difference in several of his theses regarding technology. He illustrates his point by discussing handwriting. He argues that:

• Everyone should sometimes write by hand, to recall what it’s like to have second thoughts before the first ones are completely recorded.

• Everyone should sometimes write by hand, to revisit and refresh certain synaptic connections between mind and body.

• To shift from typing to (hand)writing to speaking is to be instructed in the relations among minds, bodies, and technologies.

Pay attention to these sorts of differences. Write a card by hand and take it to a co-worker’s office rather than sending an email. Do your next PowerPoint presentation with a large paper flip chart instead—and draw all the art in black Sharpie. Call the library for directions rather than using the GPS.

The point is not to see how the analog way is “superior”—it may not be—but to understand the qualities of each approach. How does each way of doing things affect you? Other people? The world around you?


Take your expanded attention span and apply it to skilled practices where you must develop both creativity and technical acumen. Find something that you can do outside of work that is refreshing. Take watercolor lessons, learn to brine a ham, pick up the flute, practice dovetail joinery, bake your own sourdough bread, go ballroom dancing, strap on some ice skates.

In the past few years, for instance, I have taught myself to play bass guitar, to install new Shimano click shifters on my daughter’s bike, to paint on 140-pound cold-pressed paper, to compose guitar instrumentals, to skim coat a wall, to dance with my wife. Each activity required more than mere attention; it required certain tools or certain skills, and each had its frustrations, but I have never regretted a moment of this time. In the end, I exerted effort against the world to create sound or watercolor paintings or gorgeously smooth drywall, and this felt productive in a way that consuming, watching, and purchasing rarely does.

Technology can be an amazing partner here; it is often terrific for learning skilled practices. The return of “artisanal” everything may have emerged out of a sense that industrialization has destroyed the unique—Nietzsche complains about this at some length—but sites such as YouTube now offer on-demand expertise in everything from drawing a laughing avocado to fixing the sink drain that someone installed incorrectly. Technology played an important role in each of the skills listed above, whether that involved how-to videos, composition software, online courses, or the music lessons I took via videoconference during the pandemic.

One caveat: If you are a knowledge worker and spend your day staring at a screen, perhaps don’t start with a sedentary, screen-based project such as writing the first great novel about Michigan’s Upper Peninsula. Yes, it’s skilled and creative work, but it also feeds into the same postures and habits of the workday. The goal here is to test yourself in new ways against the world, to use the body differently, and to improve the world or your own life through creation or maintenance. Whatever you do, you’ll make Nietzsche proud.


Albert Borgmann, the Montana professor and philosopher of technology, has written for decades about the “device paradigm” under which we live. His most famous example involves the hearth, which for many centuries created the main focal point of a house. The hearth provided heat, which in colder climates naturally gathered the family. It also provided space and energy for cooking. In doing so, it dictated family roles across a day: Father might chop wood in the morning, while mother might do the cooking in the evening, and children might be called upon to keep the indoor woodpile stocked. The family gathered around the hearth for meals and an evening’s relaxation. Domestic bliss?

But hearths require hard work. Someone has to rise in the chill of the morning to start the fire. Splitting logs by hand, which I once attempted at a “living history” farm, is more difficult than it looks—and much more likely to injure you. Ash has to be constantly removed from the fireplace; sparks have to be watched. Bats and birds might fly down the chimney even as creosote builds up inside it.

Long ago we replaced the hearth with central heating. The furnace does not sit at the center of the home; instead, you must descend rough stairs to the back of the basement, where the heating apparatus lies entombed in its windowless room. The “device paradigm” values ends more than means, so all the “means” are hidden in the walls. We hear only whispers of air blowing through vents; we see only a thermostat mounted on the wall.

With this change, the “focal” property of the hearth was broken. Furnaces do not require family jobs, daily routines, and physical exertion. They do not require the same skill to use. And they do not bring people together. In fact, by efficiently heating all parts of all rooms, they make it much easier for family members to disappear into separate spaces.

Borgmann worries that our embrace of the device paradigm has removed the need for skill and craft from too many parts of our lives. In doing so, the device paradigm has leached meaning from life; we feel somewhat deadened. Borgmann advocates a recovery of “focal practices” that we value for their own sake, things such as gardening, reciting poetry, making and sharing meals, fishing, writing letters. These overlap with the skilled activity discussed earlier, but the key here is that they demand our concentrated attention. So far, so Nietzschean.

But Nietzsche, the lonely individualist, dances on the heights as he looks down at the “herd” in the valley. We have seen how this perspective can be used to call people to excellence, though it may be a lonely or elite achievement. Borgmann, however, worries about too much individualism. He is especially concerned about the ways in which technology isolates us in our cars, our cubicles, our homes, our bedrooms—perhaps even on our respective sides of the bed, where two people tap away on their iPhones in the late evening.

So Borgmann places great stress on communal focal practices such as the evening meal, the sharing of stories in a pub, musical performances. Such practices do more than put people into proximity with one another; they knit together the attention offered by each participant to create something meaningful for the group. These practices benefit from the same kind of scaffolding and structure we have been discussing; willpower “in the moment” cannot by itself pull a collection of people into a shared activity.

Consider the dinner table. Even after the physical hearth faded in focal importance—though note how many living rooms remain oriented around a rarely used fireplace—the “evening meal” tradition gathered families daily. This meal is common enough that we may not even think of it as a tradition. It sounds simple. Do we not just sit down and eat? But dining together has its share of rules and rituals: particular times; the use of certain vessels, plates, and utensils; the call to be seated; the washing of hands; perhaps a prayer or a thanksgiving; a procession of courses; the washing up.

For this communal focal event to take place, customized spaces and objects are required. A kitchen collects counter and cooking space, ingredients, and knowledge (recipes and cookbooks). Cupboards keep our plates and pots near at hand; the silverware is in the drawer; the table and chairs stand ever ready. But even with these specialized arrangements, the focal event does not take place without daily practice. We take action, and we take it repeatedly, to turn this collection of spaces and objects into a profound focal activity.

It might be simpler in every way to microwave frozen meals and eat them in front of the TV; it might be easier; it might even be more fun. But it might not be as rewarding.

What can we do to create spaces in our lives that bring people out of their comfortable, technology-created cocoons and help them focus their attention in shared experiences that create and sustain communities? The answers will vary wildly by location and life situation, but structure matters here. One-off attempts to bring people together are terrific, but they also take a terrific toll in terms of energy and effort.

If you want to bring family members together around musical creation, create a space that makes it likely. You don’t need a wood-paneled music studio, just a corner of the living room with a digital keyboard, a djembe, maybe a guitar. Throw in a music stand and store chord charts or songbooks on the bookshelf. You’ve just lowered the barrier to entry—in under a minute, people can start playing.

As with the evening meal, however, the best preparations may lead to haphazard results without regular practice. Planning ahead is not always popular, but “text me when you want to play, and I’ll see if I’m free” is less likely to produce good results.


Did I get annoyed when attending a Josh Ritter concert where I had to sit through a 1-hour opening act I could have done without? Absolutely. Were expletives silently directed at the person in front of me who simply could not stop talking between songs? Possibly. Did I pound my hand on the steering wheel when an overturned truck shut down the exit ramp I needed to get home? No comment. But what I gained was an experience not reproducible on my far more convenient couch at home.

Nietzsche tells us to look skeptically at the easy, controlled life, while our devices beckon with their total control over what we see, what we hear, and how we communicate. If we have become control freaks, we should seek situations that we cannot fully manage. This does not need to involve ecstasy, dance music, and an abandoned warehouse. Start small—even a face-to-face conversation can be difficult for those with social anxiety who might prefer the time for thought offered by texting and email.

Why attempt to make our peace with a world we cannot control? Because this is the fundamental reality, one merely masked by our devices. The control that they offer us feels significant but is of such a limited kind. In life’s less predictable moments we can encounter the sorts of surprising experiences—both good and bad—that are less likely to occur in the world that we have ordered.


The big tech companies now recognize the lifestyle problems that accompany their best-selling devices, and all of them offer tools to help. Take advantage of notification and “do not disturb” settings on your phone. Consider “full screen mode” on your laptop to avoid distractions, or switch off the WiFi for an hour while you complete a project. Close email when you’re doing something else. Pick up an app that blocks time-wasting websites during certain hours—and let a friend set the password. Remove apps that soak up your time. Use the tools provided by companies such as Google and Apple to manage your digital well-being and to track your screen time.

The goal is reducing the workload on your willpower. Set up the scaffolding around your digital life so that it directs you by default in the directions you want to go and blocks off paths you know aren’t good for you to follow. In this way, you can up the odds that your use of digital devices will be intentional rather than haphazard and distracting.


Finally, if you want to encourage more interaction with the world, find times to set your devices aside. Just as we discussed fasting from information, consider fasting from the devices through which we access and process that information.

I survived an entire childhood of rides in a Chevy station wagon with a rear-facing “third seat,” which we generally used without seatbelts. The car had a cassette player but no air conditioning, so it was difficult to hear any music over the sound of wind whooshing through the open windows at highway speeds. Our GPS was a decade-old road atlas.

I made it through two decades of this savagery—so I can probably make it 4 hours without carrying an electronic tether around. You can too.

Our devices are not as essential as they want us to think, and the world is not always as dangerous as we fear.

The Body Is Not a Battery

Imagine a software engineer, just out of college, who is bent on retiring by age 35. In pursuit of this dream, she works hundred-hour weeks. She has slept at her desk three times this year, but she knows the importance of caring for her body. She starts each day with a nutritionally complete shake mixed with a raw egg—not because she likes the taste of artificial vanilla, but because it is an efficient way to down enough calories before her morning commute.

Stiff at the end of a long day spent in a chair, she has developed a pain in her “mousing” shoulder and recently began yoga at her company’s gym. On non-yoga days, she swims. She would rather spend Sunday afternoon napping on the couch, but she dutifully gets up each week and jogs along the lakeshore to keep her heart in shape. “Gotta stay healthy,” she tells you.

(Were you imagining California? For shame. She lives in Wisconsin. She’s proud of Wisconsin! And she can’t stand people who assume that everyone who knows how to program a computer has migrated to San Francisco. Wait until you hear her rant about how much further one’s housing dollar goes in the Midwest.)

She ends each day with a cup of tea and a book; no screens, because she worries that the blue light keeps her awake.

Good for her! But also, in a way, maybe not so good for her? Alicia—yes, her name is Alicia—has embraced her embodiedness—but only as a means to an end. Her body is a machine, and like all machines, it needs a certain amount of maintenance. But the goal of this maintenance is, like most maintenance, about keeping the machine working. How easily Alicia has become a creature of productivity, even as her ultimate goal is escaping the daily grind.

Nietzsche questions this orientation toward work. Where are the joys of the body for their own sake?

Oh, this moderation in “joy” of our cultured and uncultured classes! Oh, this increasing suspiciousness of all enjoyment! Work is winning over more and more the good conscience to its side: the desire for enjoyment already calls itself “need of recreation,” and even begins to be ashamed of itself. “One owes it to one’s health,” people say, when they are caught at a picnic. Indeed, it might soon go so far that one could not yield to the desire for the vita contemplativa (that is to say, excursions with thoughts and friends), without self-contempt and a bad conscience.

What we want to avoid is seeing the body as a rechargeable battery that exists to keep the mind going. This would be to disregard the wisdom of the body and to treat it merely as a means to a more productive end. And there’s little joy in such an approach.

The Return of Dionysus

You and I are not in the habit of having infants torn to pieces, but this was the sort of jealous nonsense that the Greek gods lived for. Dionysus was one of many children spawned by the philandering Zeus, and Zeus’s wife Hera was displeased. She revenged herself on the child, who was dismembered and then eaten. Grisly stuff! But Dionysus wasn’t quite dead.

Here’s how the staid Encyclopedia Britannica tells one version of the tale:

Dionysus—under the name Zagreus—was the son of Zeus by his daughter Persephone. At the direction of Hera, the infant Zagreus/Dionysus was torn to pieces, cooked, and eaten by the evil Titans. But his heart was saved by Athena, and he (now Dionysus) was resurrected by Zeus through Semele. Zeus struck the Titans with lightning, and they were consumed by fire. From their ashes came the first humans, who thus possessed both the evil nature of the Titans and the divine nature of the gods.

(Still exciting! There’s no way to keep a good dismemberment story down.)

The rending and resurrection of Dionysus took on significance among some Greek mystery religions. Dionysus became a symbol of original unity being torn into individual pieces, then re-created as a unity. This is what Nietzsche means when he says, in The Birth of Tragedy, that the mystery religions taught “the basic understanding of the unity of all things, individuation seen as the primal source of evil, art as the joyful hope that the spell of individuation can be broken, as a presentiment of a restored oneness.”

Over the course of his work, Nietzsche becomes an arch-individualist, but he never abandons the idea that we moderns too easily lose our connection to primal reality. Our temptation, colored by the lens of our technology, is to say that we are separate from the animal passions, from the cruelty of the killer whale playing with the baby seal before biting it in two. We convince ourselves that we are masters of our desires, but Nietzsche’s genealogical investigations remind us how much a part of this evolving world we are.

Living among rational Apollonian types who could no longer escape their own heads, Nietzsche seeks a return to oneness with the world. “This hope alone casts a ray of joy across the face of the world, torn and fragmented into individuals,” he says.

In a preface written years later, he scoffed at his own book. “Today I find it an impossible book,” he says. “Badly written, clumsy and embarrassing, its images frenzied and confused, sentimental” and “lacking in any desire for logical purity.”

But the fact that The Birth of Tragedy has endured for more than a century tells us how much the ideas have spoken to people. (Well, the ideas in the first half, anyway. The second half, with its praise of Wagner, is widely regarded as rubbish.)

Nietzsche is not against thought, reason, or Netflix. But he sees the way our post-Enlightenment age has made it easy to reason, to watch, and to worry our way through life. We operate at a remove from the world.

Our digital devices are part of this trajectory. They provide a constant stream of individualized interruptions and on-demand distractions that feed our minds, at the expense of our muscles. They isolate us heads-down from one another. They insist on the screen’s priority to mediate reality. And in doing so, they often show elements of arrogance. “Today our whole attitude towards nature,” Nietzsche complains, “is one of hubris, our violation of nature with the aid of machines and the thoughtless ingenuity of technicians and engineers.”

Nietzsche, the former classics professor, challenges this info-heavy, arch-rationalist way of living. Head knowledge is not always the pathway to self-knowledge, and thought is not the only way we seek and find reality. In the preface to On the Genealogy of Morals, Nietzsche quotes Jesus’s own words:

We are unknown to ourselves, we men of knowledge—and with good reason. We have never sought ourselves—how could it happen that we should ever find ourselves? It has rightly been said: “Where your treasure is, there will your heart be also.”

Nietzsche, Borgmann, Crawford, Jacobs—they all suggest, in different ways, that too much treasure has been gifted to our technology, which is not worthy of it. Our hearts can be found in our cars, our televisions, our smartphones, or scattered into bits across the Internet.

But we too easily forget what the body teaches us about the joy it takes in the world. We are not brains in vats; we are not made to live as androids. We are part of the world in all its pain, and we show our creativity and we test our strength against its resistance. In running, in eating, in making love, in mastering a craft, in maintaining a focal practice, in paying true attention to another, we lower the volume of the Apollonian voice inside us—and we turn up the Dionysian.

“I think, therefore I am”? Sure. But also: “I act, therefore I am.”

* If I am wrong about this, I ask any of my descendants reading this passage to devote their own Singularity-enhanced lives to the study of time travel so that they can come back and rescue me.
