Chapter Seven

How Victimhood Leads to National Decline

Bread and Circuses

Bread and circuses. That’s all the Roman people wanted, according to Juvenal, a poet speaking in the early second century, at the height of the empire.

It was toward the end of the Pax Romana, the Roman peace, two hundred years of relative peace and prosperity, when the empire stretched to its furthest extent. Juvenal was critiquing the politicians’ strategy of keeping public approval by offering free grain and games instead of attempting to instill some civic virtue, some shared sense of purpose. He said, “Already long ago, from when we sold our vote to no man, the People have abdicated our duties; for the People who once upon a time handed out military command, high civil office, legions—everything, now restrains itself and anxiously hopes for just two things: bread and circuses.”1

At the same time, to pay its growing military, administrative, and social bills, Rome had for centuries been debasing the silver in the denarius so that it could mint more money. This was in some ways Hannibal’s revenge, his most lasting legacy for Rome. The Romans had gone broke during their long war against Hannibal, and that was when they first answered the siren call of minting money to fund public projects, diluting their bronze currency. Printing money was a drug that grew addictive, as many civilizations since have found.

Rome’s currency eventually became worthless, but it saved itself through plunder: around 212 BCE, its armies brought back vast wealth from Spain and Sicily.2 It used that silver to create a new currency, the denarius. But eventually, it decided to try minting more denarii to fund all its projects, which it did by reducing the silver in them.

By the time of Marcus Aurelius, the emperor and Stoic philosopher who reigned just after Juvenal’s time, the silver content of the denarius had dropped to about 80 percent. Though that seemed manageable, over the empire’s long life the debasement of the denarius steadily compounded.3 Creating so much money for so long was never going to end well. Inevitably, inflation rose and eventually became hyperinflation; the government had to raise taxes higher and higher to fund its projects in real terms, even as it minted more denarii. By 265 CE, there was almost no silver in the denarius at all, only 0.5 percent. Prices rose 1,000 percent per year. Gold was reserved for paying barbarian mercenaries.4
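
To make that compounding concrete, here is a rough back-of-the-envelope sketch in Python. The 80 percent and 0.5 percent purity figures are the ones cited above; the roughly ninety-five-year interval between them and the implied annual debasement rate are my own illustrative assumptions, not historical estimates.

```python
# Back-of-the-envelope arithmetic only: how modest, repeated debasement compounds.
# The 80% and 0.5% purity figures come from the text; the ~95-year span and the
# implied annual rate are illustrative assumptions, not historical data.

start_purity = 0.80   # approximate silver content of the denarius under Marcus Aurelius
end_purity = 0.005    # approximate silver content by 265 CE
years = 95            # rough interval between those two data points

# Solve start_purity * (1 - r) ** years == end_purity for the implied annual rate r.
annual_cut = 1 - (end_purity / start_purity) ** (1 / years)
print(f"Implied average debasement: {annual_cut:.1%} of the remaining silver per year")

# Walk the compounding forward to see how quickly it bites.
purity = start_purity
for year in range(1, years + 1):
    purity *= 1 - annual_cut
    if year % 25 == 0 or year == years:
        print(f"After {year:3d} years: {purity:.2%} silver remaining")
```

Even a cut of roughly 5 percent a year, small enough for any single emperor to shrug off, leaves almost nothing of the coin within a lifetime.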

By the end of the third century Roman currency was worthless again, and unlike the last time, Rome couldn’t plunder new precious metals to replenish its treasury. Its economy was strangled, as no one was willing to trade for its worthless coins, and commerce became local, limited to whatever could be accomplished with primitive forms of barter. The fall wasn’t far.

Like the Romans, we Americans crave bread and circuses, and we print lots of money to pay for them. We have recreated the bargain struck between Rome’s politicians and its citizenry at the beginning of its decline: the leaders offer bribes to the populace to keep it fat and happy, funding them with money the government doesn’t really have, and in return the people allow them to stay in power. The citizens think only of the material goods they want the government to give them, the politicians think only of how to outbid each other in offering them, and no one thinks of the long-term good of the nation.

That’s the essence of Juvenal’s bread and circuses critique. It’s still valid today. He was describing what happens when a nation of underdogs becomes a nation of victims. When a nation peaks—especially when an empire does—its focus shifts from producing wealth to arguing over how to distribute it. This fixation on arguing over how to split the pie goes hand in hand with victimhood complexes: arguing that you’re more oppressed than someone else is a powerful way to claim you deserve government largesse more than they do. Victimhood focuses on how to divvy up the economic pie, rather than how to make it bigger. In so doing, it actually makes the pie smaller.

This process of decline has actually been going on for decades. Charlie Sykes identified it almost thirty years ago in his well-titled A Nation of Victims, which I only learned about midway through writing this book. In Sykes’s version of the tale, the peak of America’s happiness occurred in the postwar economic boom of the 1950s.5 But, he adds, “As it turned out, happiness proved surprisingly elusive for modern man. Expectation proved to be inseparable from anxiety, while the reliance on self often cut off man from his neighbors and left him a prisoner of his own sense of entitlement.”6 In other words, national success is a double-edged sword. National achievement breeds national entitlement. Once underdogs taste victory, they only want more, and that creates constant anxiety, a gnawing fear of loss. From there, Sykes presents a detailed account of how that diffuse anxiety was sharpened and entrenched by a professional class of therapists who profited from treating it. After that, clinical diagnoses of different forms of victimhood flourished like weeds.7

Sykes and I share a similar starting point. We agree that national success sows the seeds in which victimhood takes root. But while Sykes focuses on anxiety as the psychological response to success that enables victimhood, I think there’s another important character trait that operates alongside it: laziness. It is the combination of the two that turns underdogs into victims. After all, underdogs experience a kind of anxiety too, but that drives them to overcome the obstacles they face. It’s the addition of laziness to the equation that makes underdogs demand that others overcome their anxieties for them.

The Pro-Laziness Movement

Government policies have created a culture of laziness in recent years—mostly in ways that you’d expect, but some you wouldn’t.

First, the COVID-19 pandemic opened the spigot of government aid. Initially that aid went to families that, in many cases, genuinely needed it because government-mandated business closures prevented people from working. Yet as those lockdowns loosened, the benefits provided to people who stayed home remained intact. The government never turned off the tap.

Notably, this public policy was supported not only by most Democrats, but also by prominent Republicans like Senator Josh Hawley and President Trump, who threatened not to sign an aid package into law unless it offered larger direct payments to families ($2,000 rather than $600). This may have been a populist policy to aid his reelection bid, but in any case it’s notable that Trump, Hawley, Bernie Sanders, and Kamala Harris, for example, were all on the same side of this issue. It’s part of the age-old promise of bread and circuses: it’s legal to bribe citizens to reelect you as long as you do it with the government’s money. Of course, you’re ultimately bribing them with their own money, and diluting its value through inflation too.

Bluntly, this cornucopia of free money has contributed to a culture of laziness that has resulted in the greatest labor shortage in the United States in over a generation. People simply became accustomed to not working—and quite liked it. White-collar employees enjoyed “working” from home with, in my experience, a noticeable drop in how much work they actually completed. We’re still early in the process of studying this formally, and the existing evidence is mixed.8

Many employees certainly feel more productive when working from home, largely because they get to skip their commutes, but they’re also less able to have spontaneous conversations with coworkers and receive career mentoring. And, as organizational psychologist Anthony Klotz points out, the trend toward remote work may be a double-edged sword for American workers because it accelerates outsourcing: “In the United States, employees tend to be paid higher wages than people in many other countries. If you’re a remote organization, you can recruit workers from all over the world who can do the same job for a cheaper rate.”9

While white-collar workers opted to work remotely, other employees who worked in restaurants and stores simply chose not to go back when lockdowns ended but generous unemployment aid continued. It was part of a phenomenon that came to be known as the Great Resignation, a term coined by Klotz. In some ways this was by design: the Biden administration explicitly pointed to its desire to make employers “compete” for workers by paying higher wages. Yet even as places like Burger King and Walmart offered the highest pay packages in their history for new employees, the new employees didn’t show up. As I write, the ratio of job openings to job seekers is at an all-time high.10 A recent analysis in the Wall Street Journal suggests that the US’s unemployment benefits, more generous than other countries’, contributed both to its lower labor-force participation rate and, with fewer workers available to meet demand, to higher inflation.11

You would predict that people would go back to work once the unemployment benefits stopped. But we’re not seeing that happen—at least not yet, as of this writing. Why? Because people got accustomed to not working and enjoyed it enough to keep doing it for longer than they could afford.

Nowhere was the new laziness movement better epitomized than in the subreddit r/antiwork, which became the place for supporters of the Great Resignation to unite. Its user population exploded during the pandemic, going from 180,000 in October 2020 to more than 1.6 million by January 2022. As the New York Post summed it up, the forum is a place where “People post epic text and e-mail screenshots of quitting their jobs, but the real heroes are so-called ‘idlers’—those who stay in jobs doing the absolute minimum to get by while still collecting a paycheck.”12 Examples include a user who bragged about getting paid $80,000 a year to answer one or two phone calls and an IT professional who wrote a simple script to perform their entire job while collecting $90,000 per year.

The Post interviewed the subreddit’s moderator, Doreen Ford, a thirty-year-old part-time dogwalker who identifies as nonbinary and transgender. She said the antiwork movement’s goal “is to reduce the coercive element of labor as much as possible by subverting capitalism.” A noble sentiment. “What we call laziness is actually people reaching their limits for very good reasons that are outside of their control,” Ford explained, adding “Well, some of us are lazy and we just don’t want to work.”13 The movement suffered a setback when Ford gave a disastrous interview on Fox News, showing up unwashed and disheveled in a dirty room, opining “I think laziness is a virtue in a society where people constantly want you to be productive 24/7, and it’s good to have rest.”14 After the interview, old Facebook posts from Ford surfaced where she admitted to having inappropriate sexual encounters on more than one occasion, blaming some of her actions on her unmet sexual needs.15 Doreen Ford was in one sense a very poor representative of the antiwork movement and, in another, perhaps the perfect one, a caricature who turned out to be real.

The pro-laziness movement launched by the pandemic dovetailed nicely with a growing clamor for the government to forgive student loan debt; repaying debts is hard work, after all. Once again, the federal government began by offering a moratorium on student loan repayments under President Trump,16 and once again, people expected that temporary aid to become permanent. The moratorium was repeatedly extended under President Biden, and more and more activists and lawmakers called for it to become an outright cancellation of debt.

Progressive “Squad” politicians such as Representatives Alexandria Ocasio-Cortez, Rashida Tlaib, Ilhan Omar, and Jamaal Bowman began framing student debt cancellation as a matter of progressive racial justice, even as all of them owed tens of thousands of dollars in student loans or, in some cases, hundreds of thousands.17 Somehow, they were perfectly able to understand the idea of conflicts of interest when they demanded a ban on lawmakers trading stock, yet unable to grasp the concept when demanding that their own student loans be forgiven.18 It brings to mind another of Juvenal’s famous critiques: who watches the watchmen?19

When the Biden administration was slow to respond, Ocasio-Cortez made the subtext of the old bread-and-circuses bargain explicit: if you want people to vote for you, cancel their debt. She tweeted that Democratic politicians were “delusional” for not realizing they had to make that deal in order to stay in power.20 She was, as the saying goes, saying the quiet part out loud.

Meanwhile, according to government estimates, even the pause on student debt repayment has cost taxpayers more than $100 billion, and it forgoes another $4–5 billion in interest payments each month until the moratorium is lifted.21 Yet somehow, American culture now maintains that it is right and good for students to purchase expensive educations and require others to foot the bill. The notion of paying back money you borrowed is now considered outdated, perhaps even systemically racist. An analysis from the Brookings Institution, for instance, argued that the existence of a racial wealth divide necessitates the full cancellation of student debt.22

Victimhood fits laziness like a glove. Note that even when antiwork superstars like Doreen Ford explicitly defend laziness as a virtue, they have to justify it by saying indolence is an appropriate response to capitalism’s exploitation—stealing from your employers can’t be just laziness or greed; it has to be part of a grand fight for justice. In the same way, when progressive politicians and activists argue that taxpayers ought to cover their student loans, to avoid the charge that it’s just laziness they need a victimhood narrative like systemic racism to give them cover. A good victimhood narrative dresses up naked self-interest until it looks like nobility. It allows you to pretend to fight for others as you fight for nothing but yourself.

Here’s the thing about the new culture of laziness in America: it’s not limited to the workers. It’s spread to corporations, especially those that enjoy monopolies, as many of today’s big tech companies do.

How can that be? In its early years, Google actually had to compete with other firms in order to make a profit. It had to hire the best and brightest to develop algorithms that eventually outcompeted Yahoo, AOL, and others. But today, it doesn’t much matter whether the engineers at Google do a good or bad job writing an incremental piece of code. It’ll still print the same billions in monopoly profit from its search and ad businesses either way. That creates a culture where it doesn’t matter if people show up to work or not, or how productive they are.

If you’re a manager at Google, you care more about the diversity of gender identities on your engineering team than about the quality of the products you build. Google’s monopoly ensures that no one can compete with it on its core products, but other major companies can compete with it over diversity, equity, and inclusion rankings, so that’s the new arena it has to focus on. If government aid is the bread that keeps us satiated, identity politics is the American circus, the coliseum where we compete with each other to find purpose and status.

Conventional wisdom holds that startup companies might be a disciplining force in the marketplace because they have their backs against the wall and are struggling for survival. But this too is no longer the case in an economy overflowing with liquidity. Today a well-written business plan on a piece of paper can command pre-money valuations in the tens of millions. Normally, an entrepreneur wouldn’t want to raise more capital than needed, because it would result in unnecessary early dilution—but if they can command a high valuation justified by nothing more than that plan on paper, they figure they might as well. So the same culture of excess spreads to startups too. Now they can afford to hire not only the best engineers, but a few diversity hires too, and a proper DEI department—all before they know whether they have a product people can actually use.

These increases in asset valuations accrue to the rich and never trickle down to the poor because they’re fueled by an open-handed Fed instead of increases in productivity. As the market’s regulators prop it up through loose monetary policy, the assets of the rich become worth more while cash, the asset of the middle class and poor, becomes worth less. In this way, too, we mirror Rome. When asked to describe the arc of the Roman Empire succinctly, historian Ramsay MacMullen simply said, “Fewer had more.”23

The End of Easy Money

A market crash serves as a much-needed shock to the new culture of laziness that has permeated the American economy—from workers who grew accustomed to staying home, to investors who grew accustomed to stocks only going up, to monopolies that grew accustomed to printing profits no matter how they performed, to young founders who grew accustomed to plentiful capital. I think a market crash is the medicine we need, and under the right cultural circumstances, it could help recreate a national identity built around excellence rather than entitlement.

In the face of now persistent (not “transitory”) inflation, we need more cautious fiscal and monetary policy that spends less and tightens the money supply. As I write, the Federal Reserve is finally embarking on this course, tightening monetary policy in ways it hasn’t for over a decade. Some argued that this would create a greater risk of a short-term market crash. They were correct. My view is that the market crash of 2022 will ultimately be a good thing for the country—not just to fight inflation, but to combat the cultural lethargy that the days of easy money have brought.

The last time the United States faced runaway inflation was in the late 1970s. In the early 1980s, Paul Volcker did what he needed to do by rapidly raising interest rates to curb the problem. Yes, this had a deflationary effect on asset prices. Normally, that sounds like a bad thing, but under Volcker it was actually the intended effect; the alternative of unchecked inflation would’ve been far worse.

But there’s one crucial difference between then and now: Volcker’s policies were paired with the otherwise business-friendly, deregulatory policy atmosphere set by President Ronald Reagan, which allowed fundamental economic growth to offset the deflationary impact of rising interest rates. By contrast, as the Fed faces the need to raise interest rates in 2022 and 2023, it does so against the backdrop of President Biden’s interventionist policies: greater regulation, the prospect of higher taxes, and historically large spending bills. These policies create an even higher risk of an economic crash when paired with the tighter monetary policy needed to stave off runaway inflation. This is part of the reason I believe an economic crash—including but not limited to the stock market crash—is even likelier today than in Volcker’s time, if inflation grows so out of control that we’re forced to adopt his draconian measures.

There’s another difference too. Reagan inspired a greater sense of cultural confidence in the idea of the United States itself than Biden does, both within and beyond America’s shores. That confidence in the American spirit translates directly into confidence in American currency.

In the 1980s, as today, the US dollar was backed by nothing more than the full faith and credit of the United States government. The value of that promise in an uncertain world made our dollar the world’s reserve currency—a position that makes the dollar worth more in global currency markets than it otherwise would be. As I explained in chapter 5, this is a precious status that we can’t afford to squander.

But the cultural circumstances that allowed the dollar to thrive as the reserve currency under Reagan no longer exist today: Americans at home and foreigners abroad have simply lost confidence in America itself. There’s no other way to say it. America’s citizens doubt its stability; other countries doubt its word, as they watch us stagger from one regime to its exact opposite every couple of years. And that may deal the biggest economic blow of all—one whose consequences have been covered up in the short run by a soaring stock market, a speculative ESG bubble, and government handouts. But once the Federal Reserve adopts tighter monetary policy this time around, that cover will be gone, and America may find it has little cultural confidence left to fall back on. Our nation’s cultural decline threatens to become a vicious cycle in which the perception of decline itself creates heavy economic costs.

There’s no guarantee that the dollar will ever return to its status as the world’s reserve currency. The International Monetary Fund now accepts the Chinese yuan as a reserve currency alongside the US dollar. America’s debt load continues to rise as we spend money we don’t have. Most important of all, however, is an image problem—both in the eyes of America’s observers abroad and our own citizens at home—of a nation that is in decline. Cultural and psychological attitudes aren’t the stuff of classical macroeconomic analysis, but they form the backdrop against which an economy ultimately thrives.

Speaking as one such American, I think the loss of confidence is warranted—for all of the reasons that I lay out in this book. A nation of victims is unlikely to navigate turbulent geopolitical waters. The nation that defeated the Soviet Union during the Cold War wasn’t one that thought of itself as a victim on the global stage, nor one whose own citizens viewed themselves as each other’s victims.

But there’s an optimistic note to all of this: the cultural revival that I call for in this book will have economic benefits too. And ironically, I think it may take an economic crash to end the American era of bread and circuses, the days of easy money. When hard times come, we’ll either sink or become strong. I think that crash has begun. I’m optimistic that we’ll rise to meet the challenge.

How Victimhood Makes Inequality Worse

So the main way victimhood causes national decline is by making the economic pie smaller as everyone focuses on grabbing as much of it as they can instead of growing it. American national success breeds laziness through the abundant flow of easy money, and that laziness in turn breeds victimhood as the idle argue that they deserve to receive money without working for it. As citizens squabble over who most deserves government beneficence, the nation declines culturally through division, and economically through lack of productivity. Then at a certain point when both types of decline become noticeable, the nation’s fall becomes a self-fulfilling prophecy as everyone loses confidence in it. As the rest of the world doubts our stability, we lose our economic advantages.

But there’s a natural counterargument to this line of thinking: So what if victimhood lowers the GDP? So what if it makes the economic pie smaller? Maybe that’s not the metric that matters most. If arguing over the fairest way to split the proceeds of American success results in a more just distribution, or even one that’s perceived to be more just, maybe it will make the happiness pie grow even as it makes the economic one shrink.

Well, when victimhood is the tool people use to justify getting a greater share of the economic pie, their happiness probably suffers; living in a constant state of grievance can’t make anyone happy. But it is true that human happiness is greatly affected by perceptions of fairness in the distribution of goods, not just by the absolute level of material goods people have. In fact, plenty of experimental evidence indicates notions of fairness are so fundamental that they’re a feature not only of humans, but of primates in general and some other animals. The seminal article on this is Sarah Brosnan and Frans de Waal’s “Monkeys Reject Unequal Pay,” which reports on the researchers’ discovery that capuchin monkeys would refuse to participate in an experiment once they saw other monkeys receive greater rewards for performing the same tasks.24

There’s a funny viral YouTube video where Brosnan and de Waal show a short clip from the original experiment. A monkey gives the researcher a rock, is rewarded with a slice of cucumber, and is perfectly content to eat it. Then she witnesses the monkey next to her give a rock and be rewarded with a much more desirable grape. She gives another rock to the researcher, and when she receives cucumber again, she throws it back at the scientist and rages at him, reaching for the grapes and shaking the walls of her cage. He repeats the procedure, giving another grape to the other monkey, and when he returns to the first one she tests her rock before giving it to him. You can practically see her thinking—“I’m doing the same thing, right?” Then she once again throws the cucumber slice she receives back with great disdain and shakes her cage in fury.25 No doubt the antiwork crowd would take it as a perfect representation of capitalism.

After that 2003 article, experiments revealed similar aversion to inequality in dogs, wolves, rats, crows, and ravens, although small sample sizes and confounding factors sometimes delivered mixed results. The moral of the story seems to be that the more a species has evolved to succeed through cooperation, the more its happiness is affected by the perception of unfair inequality. Notably, in all of these experiments there seemed to be wide variation between individuals. Even in animals that care about fairness, some care a lot more than others.

There is a rich field attempting to study the same phenomenon in humans using survey data to deduce the relationship between national income inequality and individual and national happiness. Not coincidentally, this work was launched at the same time as inquiry into animal conceptions of fairness—starting in 2003, economists Thomas Piketty and Emmanuel Saez began drawing intense scrutiny toward income inequality in the United States, using detailed analysis of tax return data to argue that over the last few decades gains in income have overwhelmingly gone to the ultra-rich.26 According to their most recent analysis, the top 1 percent of earners’ post-tax share of national income rose from 9.1 percent in 1979 to 15.7 percent in 2014.27

Their focus on the very top earners, the one percent, was the animating force behind both the Occupy Wall Street movement and the growing academic focus on income inequality. That academic literature generally agrees that a country’s net happiness tends to be lower as income inequality is higher, although there’s robust debate about whether that’s because the poor are so unhappy that they drag the average down, or whether everyone is less happy. A 2022 study attempting to resolve this dispute found that there are elements of truth to both theories: when income inequality is high, every group except the very rich is less happy, and the poorer someone is, the less happy they tend to be.28

This evidence suggests that it isn’t simply an absolute lack of material goods that makes those with less unhappy, since even those with plenty tend to be less happy when inequality is higher—like the capuchins and other cooperative social animals, human happiness seems to be tightly tied to perceptions of relative wealth.

Interestingly, some chimpanzees, bonobos, and capuchins even objected to experiments when they received better rewards than their partners, not worse ones, a much rarer phenomenon. That was actually what launched the whole field of inquiry: Brosnan first wondered about animal conceptions of fairness when she was distributing peanuts to a group of low-status capuchins and a high-ranking male handed her a prized orange so that he, too, could receive a mere peanut, a startling display of capuchin noblesse oblige. That tendency suggests that humans, the most intelligent and most cooperative primates of all, might display an even stronger aversion to unfair inequality, sometimes even when it favors them. That theory is consistent with the evidence human happiness studies have yielded so far.

There are, however, a couple of difficulties with reaching the conclusion that inequality makes most humans less happy. First, as with animal studies, the study of how inequality affects human happiness is still a young field, plagued by confounding variables; assessing different countries with different economic systems in different time periods can yield different conclusions. But even more significantly, there is a theoretical mismatch between the questions the animal studies and human ones tend to focus on: the animal experiments show that cooperative animals reject unfair inequality, while most human studies simply look at the question of how inequality itself correlates with happiness. This may be in part because social scientists can’t easily take up the question of how fair the economic systems they assess are.

A paper called “Income Inequality and Happiness” from University of Virginia professor Shigehiro Oishi and others addresses both of these difficulties.29 It’s also particularly useful for my purposes because the authors focus specifically on America, using survey data from 1972 to 2008 to track the relationship between Americans’ happiness and national income inequality. They concluded that all but the richest Americans were happier in times of lower income inequality and, importantly, that the relationship was explained by considerations of fairness: “Americans perceived others to be less fair and trustworthy in the years with greater income disparity, and this perception in turn explained why Americans reported lower levels of happiness in those years.”30 The authors suggest that these results help answer the question of why Americans haven’t become happier since the 1960s as national wealth has skyrocketed, even as massive economic growth in European countries did correlate with increased happiness.31 In short, it seems Americans are made unhappy by inequality because we attribute it to unfairness, activating our deep-seated biological aversion to unfair inequities.

That may be because capitalist systems have a natural tendency to concentrate wealth, especially the American one. The shift from discussing income inequality toward wealth inequality really took off with Thomas Piketty’s 2014 book Capital in the Twenty-First Century, which argued that unchecked capitalism tends to produce wealth inequality so vast that it outweighs whatever’s happening with income.

Essentially, Piketty argues that when the rate of return on invested capital exceeds the rate of a society’s economic growth, wealth from investments will accrue faster than wealth from labor, which steadily widens the gap between rich and poor. He assembles detailed empirical evidence to argue that in most capitalist societies during the last two centuries, the rate of return on invested capital hovers around 5 percent, while GDP growth is substantially less. That means growing wealth inequality is the default state of affairs, only disrupted by exceptional events like world wars and the Great Depression. That growing wealth gap is perpetuated by inheritance, Piketty argues, which results in an informal oligarchy. The two main forces that can resist this trend to inequality are deliberate government redistribution of wealth and high economic growth.32
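
To see why the gap between those two rates matters so much, here is a minimal sketch of Piketty’s dynamic in Python. The roughly 5 percent return on capital is the figure cited above; the 2 percent growth rate and the even starting split of wealth are my own illustrative assumptions, not Piketty’s data.

```python
# A minimal sketch of Piketty's r > g dynamic. The ~5% return on capital is the
# figure cited in the text; the 2% growth rate and the 50/50 starting split are
# illustrative assumptions, not Piketty's data.

r = 0.05  # annual return on invested capital
g = 0.02  # annual growth of the broader economy (and, roughly, of labor incomes)

capital_wealth = 100.0  # wealth compounding at the rate of return on capital
labor_wealth = 100.0    # wealth compounding at the economy's growth rate

for year in range(1, 101):
    capital_wealth *= 1 + r
    labor_wealth *= 1 + g
    if year % 25 == 0:
        share = capital_wealth / (capital_wealth + labor_wealth)
        print(f"Year {year:3d}: capital holders own {share:.0%} of total wealth")
```

Starting from an even split, a three-point gap between r and g is enough to hand capital holders roughly two-thirds of all wealth within a generation and the overwhelming majority within a century, which is exactly the drift toward inherited oligarchy that worries Piketty.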

Liberals often take Piketty’s work to justify a wealth tax, and that is one of the main policies he himself argues for at the end of his book. But he also argues for hefty inheritance taxes, which would short-circuit the capitalist path to oligarchy that he fears. Later in this book, I’ll argue for the superiority of inheritance taxes over wealth taxes as part of our return to a merit-based culture of excellence. Another alternative to a wealth tax is to simply do a better job guaranteeing equality of opportunity. Remember, human and animal studies demonstrate that it’s not inequality itself that we object to, it’s unfair inequality, and ensuring citizens have equal opportunities to succeed would rightly create a perception of fairness, even in the presence of wealth inequality. That’s actually a core part of John Rawls’s theory of justice, which was the dominant liberal conception of fairness before Piketty changed the conversation. I’ll return to the Rawlsian argument later as well.

My point for now is that the era of easy money and the culture of laziness it enabled both supercharged Piketty’s problem. When discussing the rise of wealth inequality in the United States over the last couple of decades, Piketty correctly points out that returns on capital accelerated even as economic growth slowed, but he fails to explain why that happened. My answer is that loose monetary policy in the late 1990s and the post-2008 era, combined with overly generous fiscal policy like government-backed loans to buy houses in the 2000s, contributed heavily to the phenomenon of high returns on capital alongside low economic growth. Loose monetary policy artificially inflates the assets of the rich without genuinely increasing productivity, which exacerbates wealth inequality both through inflation (which decreases the value of the assets of the poor) and by producing the imbalance between investment and labor returns that Piketty identifies.

When laziness and victimhood enter the equation and people compete to receive government handouts instead of working, economic growth suffers even more. By squabbling over how to split the economic pie instead of growing it, then, we may actually be making wealth inequality worse. Even if we did manage to achieve some more just distribution in the short term, by Piketty’s reasoning, if we did it in a way that slowed growth more than returns on invested capital, we would only be changing who the haves and have-nots are, and creating more have-nots.

Reducing both economic factors to the same level might solve the inequality problem, but at the cost of leaving everyone worse off. That’s the solution to inequality Kurt Vonnegut satirized in “Harrison Bergeron.” He imagined a version of America so obsessed with equality that it constitutionally mandated it in every respect, dragging everyone down to the lowest common denominator by forcing the beautiful to wear ugly masks, the strong to wear heavy weights, and the smart to listen to thought-disrupting radios.33 One wonders if some modern progressives might think Vonnegut was describing a utopia. We should not infer from Piketty’s theory of inequality that the grand solution is to carefully adjust returns on capital and economic growth downward to the same level. The solution to inequality can’t be to have everyone stand in breadlines; that’s been tried before, and no one liked it. Boris Yeltsin despaired for the Soviet Union when he visited an American supermarket.34 That tour played a major role in his capitalist reforms once he rose to power.

Piketty’s theory gives us insight into a condition any realistic strategy for achieving greater wealth equality must meet: if it sacrifices too much economic growth, it’s probably just a way of changing people’s positions on the ladder of wealth. Switching which monkey gets the cucumber and which gets the grape doesn’t actually solve much.

How Victimhood Stifles National Dialogue

A victimhood narrative isn’t just a story a person tells about themself. It spreads and dictates what the entire culture can say. For a prime example of this, consider a case where American victimhood has made us resurrect the legacy of a Roman emperor and redefine it. Let’s talk about Septimius Severus, whom the narrative of black victimhood has recently decided to dub “the black emperor.”

That wasn’t the Romans’ title for him; it’s one contemporary society seems to be bestowing. You can read about him in books like Severus: The Black Caesar,35 or if you have a more academic bent, you can read articles like “Rediscovering the Lost Roman Caesar: Septimius Severus, the African and Eurocentric Historiography,” in the Journal of Black Studies.36 Or maybe you’d learn about him in school during Black History Month.37 If you want, you could just watch the adventure TV version of The Black Caesar. “The first black man to set foot on British soil came not as a slave, but as Emperor,” the synopsis breathlessly declares.38

Quite an accomplishment. It’s almost enough to make you forget that Septimius was a ruthless general who often sold his defeated enemies into slavery, as he did when he conquered Parthia and enslaved one hundred thousand women and children.39 In fact, I think that in today’s America, the fact that Septimius Severus had slightly darker skin than the average Roman (maybe one shade, not three) really is enough to make it irrelevant that he enslaved people. We will make him some symbol of blackness rising above slavery, and we won’t teach that he enslaved a lot of people because that doesn’t fit the narrative. The black emperor is the hero we need.

Plenty of Americans wouldn’t even have seen Septimius Severus as black, by the way—he had Italian Roman ancestry on his mother’s side and descended from Punics on his father’s. Out of curiosity, reading about the debate over Septimius Severus’s race made me wonder why Hannibal, himself fully Punic, isn’t known as the black general who led an African empire. I then found that there is indeed a longstanding effort to declare Hannibal a hero of black history.40 There was an uproar when a History Channel documentary portrayed him as black.41 When the question came up in an ancient history classroom at the University of Pennsylvania, Professor William C. McDermott quipped, “Yes, Hannibal was as black as King David.”42 Carthaginians were Phoenicians, who had Semitic ancestry and spoke a language similar to Hebrew. So maybe we can call Hannibal a great Jewish general and Septimius Severus the Jewish emperor, if we must translate ancient Carthaginian ancestry into the language of modern identity politics. Representation matters, after all.

Romans just didn’t see race through the strange lens America uses. They documented Hannibal’s and Septimius Severus’s characters, but didn’t think to mention their skin colors. None of Septimius Severus’s contemporaries would’ve thought of him as black; they didn’t see anyone that way. It’s hard for the American mind to understand this: the Romans could of course see that some people had darker skin than others, but to them that was a fact little different from some people having darker eyes or hair than others. Just as we don’t divide the world into green-eyed people, brown-eyed ones, and so on, the Romans didn’t think skin color had much value in explaining who people were or how they behaved. Where an American would see a person by their skin color, a Roman would see them by their nationality. If we traveled through time, we might look at Septimius Severus and see him as black or brown, but the Romans would have looked at his clothes, heard his speech, and seen him only as Roman.43

It reminds me of something a guy once told a college friend of mine while he was visiting South Africa. “You call black people in your country African-Americans and ‘people of color’ and all these other names,” he said. “You know what we call them here? Americans.”

The great irony in remembering Septimius Severus for being the first black emperor is that the narrative that compels us to remember him requires us to forget his deeds. We remember him for a superficial feature of his identity instead of any of the things he would’ve wanted to be remembered for, just as we often see each other for our skin color instead of any of the things we want to be seen for. He would’ve wanted to be remembered for being a soldier and a general, a conqueror and a reformer of the military. The fact that his conquest often involved slavery would make his preferred narrative conflict with our uses for him. To substantially increase military pay and throw games to keep the populace happy, he—you’ll be shocked to hear this—massively diluted the silver purity of the denarius, reducing it to 50 percent.

The black emperor is the emperor we need, but the emperor we deserve to learn about is the general who kept the empire together but continued the debasement of the denarius to fund his wars and appease the public. Not a black emperor or a white one; just Roman like all the others.

In 211 CE, Septimius Severus died of illness in Eboracum, a Roman city in the province of Britannia, on an island at the northern edge of the empire. The people of that island still remember him through sardonic Hogwarts professor Severus Snape. Some fans present intriguing evidence that J. K. Rowling hints that Snape may even be a descendant of his namesake. For instance, the countryside around Eboracum, which the Vikings who conquered it renamed Jorvik (today’s York, in North Yorkshire), holds a village called Snape, which has the ruins of a Roman villa thought by some to be Septimius Severus’s.44

Reading Septimius Severus as black reminds me of queer readings of characters, where a consumer of a work of art creates their own narrative about a character’s sexual orientation irrespective of what the author intends. It gets to the whole death-of-the-author debate. Perhaps there’s something similar going on when Harry Potter fans read one of their favorite characters as the descendant of Roman royalty. But it’s one thing to read Severus Snape as a descendant of Septimius Severus, and another to read Septimius Severus as black. Whether the author is dead to the reader is an interesting theoretical question about the bargain behind fiction, and about the bargain behind constitutional law, for that matter. But history has no author.

Victimhood has become so entrenched in American culture that it even determines the way we talk to each other and the things we can say. It determines which historical figures we remember and how we remember them. It determines who gets to talk first, or who gets to talk at all.45 It determines how we address each other, which often means that when we disagree on pronouns, we simply don’t talk to each other or about each other. I’ve noticed that conservatives’ frequent uncertainty over which pronouns to use for transgender people sometimes leads them to simply avoid talking about transgender people at all, especially in mixed political company, to avoid charges of bigoted language.

When we Americans do talk to each other, victimhood increasingly determines what we’re allowed to say. We all know there’s a new category of things you just can’t say. That’s even what we call them. What we mean by this label is that you just can’t say some ideas no matter how much evidence you provide for them or how politely you state your position. You can’t share some opinions no matter how well you warrant their truth, because the reasons you can’t share them have nothing to do with whether they’re true or not. Even the reason we can’t say some things—that they oppose favored progressive victimhood narratives—is something you just can’t say, which is why this category is presented as a primitive, a fundamental fact with no further justification.

Many things you just can’t say fall into the broad category of wondering whether different groups might have innately different traits in some ways. As Larry Summers found out when he was forced out as president of Harvard, you just can’t say that there’s greater variability in men’s scientific ability than in women’s; you certainly can’t say that men naturally tend to be better at or more interested in science. As a conservative who talks to liberals, I occasionally have them pull me aside, speak in hushed tones, preface their comments with how good a liberal they are, and then quietly confess that they suspect there might be some biological differences in the mental traits of the sexes or even races. They always swear me to secrecy, as if they’re relieved to have a friend they can confess these forbidden thoughts to, someone to whom they can finally say the things you just can’t say. It makes me feel like a priest. Personally, I’m skeptical that there’d be much natural variation in the intellectual abilities of races defined by skin color, since, unlike skin color, there are hundreds, possibly thousands, of genes that affect the many traits we associate with intelligence. That point convinces some people. If the topic weren’t forbidden, they could’ve been convinced sooner.

The category of things you just can’t say grows as America’s victimhood complexes do. I believe, for example, that you now cannot say that women have vaginas, or that mothers are women. This linguistic update came to public attention when Representative Cori Bush described mothers as “birthing people.”46 When people objected, progressives said that it was an isolated incident from a single politician; a month later, the Biden administration replaced the word “mothers” with “birthing people” in its 2022 budget.47 Language changes very quickly these days, because the moment we see a new progressive term, most of us adopt it, reasoning that it’s far better to be ahead of the trend than behind it. This principle of “better safe than sorry” is even truer for organizations than for individuals, since organizations face more scrutiny and risk. Organizations playing it safe then pass new rules about what you can say down to individuals, who in turn play it safe.

This stuff would almost be funny if the consequences for the nation weren’t so serious. The inevitable consequence of the growing category of things you just can’t say is that you don’t voice your true views in public, leading to national policies that a majority don’t actually believe in but everyone must be seen to believe in. You can’t, for instance, point out the contradiction between saying that sexual orientation is immutably inherited (despite the total absence of a gay gene) and saying that gender is completely fluid over one’s life (despite the existence of X and Y chromosomes). So we will create public policies founded on contradictory beliefs, which can only lead to new injustice. A just-published paper argues, for instance, that the growing use of “birthing people” in the medical profession will have harmful health consequences for women by obscuring their unique biological needs.48 Meanwhile, in private, we increasingly associate only with people who think like us, because they’re the only ones we can speak freely to. Then our beliefs are reinforced. All the while, resentment grows.

America’s victim complexes continue to proliferate as we desperately use them to grab pieces of a shrinking pie of resources. We compete with each other for higher rank on the ladder of victimization, we talk to each other less and less honestly for fear of offending ever-changing rules, we associate only with those who share our beliefs, and our grievances against each other fester and grow. Where will this lead? Nowhere good, of course, but what exact shape will our decline take? On its current trajectory, will our nation fizzle, or go out with a bang?

I see more and more Americans discussing the possibility of a second civil war, both publicly and privately. As I write, two books on the subject have just come out, Stephen Marche’s The Next Civil War and Barbara F. Walter’s How Civil Wars Start. Walter’s book has been especially influential, since she’s a leading expert on civil war who’s studied it for over thirty years. Walter argues that the three factors of growing distrust in democracy, increasing factionalism, and rising racial resentment all bode very poorly for the US. She sums it up with “Where is the United States today? We are a factionalized [autocratic democracy] that is quickly approaching the open insurrection stage, which means we are closer to civil war than any of us would like to believe.”

Other scholars of civil war disagree, and there’s a robust and growing debate on the topic. I won’t take a stand on the question, because that’s a book-length issue in its own right, but what I do know is that the more we speak of civil war, the more likely it becomes. There are even growing accelerationist movements in America trying to hasten our descent into civil war. The most prominent of them calls itself the “Boogaloo Bois.” In true American fashion, it uses the term “bois” instead of “boys” to make clear that it’s gender neutral.49

The group is a loose collection of white supremacists, militia members, anti-police activists, and others, united only by their desire to bring about a second American civil war—the boogaloo, jokingly named after the obscure sequel movie Breakin’ 2: Electric Boogaloo. The movement uses memes and inside jokes to avoid being taken seriously (its favored uniform is a Hawaiian shirt), but in recent years the group has finally worked up enough nerve to become violent. In 2020 alone, one member killed a security contractor and a police officer in California,50 others were involved in a plot to kidnap Michigan governor Gretchen Whitmer,51 and others set fire to a police precinct in Minneapolis in a successful effort to make the George Floyd riots more violent.52

An Atlantic interview with J.J. MacNab, a fellow with the Program on Extremism at George Washington University who has studied antigovernment extremists for more than twenty years, closed by asking her why the movement wanted to bring about violent revolution. “They want Rome to fall,” she responded. “They want chaos to bring it down.” Movements like the Boogaloo Bois are an effort to make war speak itself into existence. I write this book in the hope that peace can work the same way.

I was talking to a friend about Rome once, as one does, and he pointed out that the common folk of Europe in the Middle Ages must have known that they lived in the ruins of a much greater civilization. “It’s all the empty granaries,” he said. His point was that by simply looking out into the fields and seeing all the ancient, unused granaries everywhere, it must’ve been obvious to people that long ago their lands were far more populated. And like the Old English poet marveling at the ruins of Bath, they would’ve known that some ancient power once built towers far beyond their modern capabilities. The work of giants.

We tell a narrative of human history where it’s always progressing, where we humans are always improving, becoming more knowledgeable, more civilized. We tell ourselves that the arc of our species bends toward progress. It’s easy for us to tell ourselves that narrative because, in our day, technology really does seem to inevitably progress. Just as it’s hard for us Americans to imagine how the Romans truly didn’t see race, it’s hard for us to imagine the post-Roman period where the people living in Rome’s ruins would’ve seen the arc of human history as one of decline, as one where knowledge and skill were inevitably lost.

Are we Rome? When I look at America sometimes, I can’t help but remember what Dio Cassius said, the rare person to recognize his own empire’s decline: “Our history now descends from a kingdom of gold to one of iron and rust.” This was a kingdom of gold, once, and now it’s becoming rust and iron. It’s not our buildings that have decayed, but our freedoms, and our identities too, reduced from complex wholes to primitive parts. But we can look at our history and see that something has been lost.

It’s hard to see an empire’s decline while it’s happening, when it’s been going on for decades or centuries and it’s all you’ve ever known. But then you look one day and you find yourself out on open fields, surrounded by ruins and empty granaries. Things no longer work, and there are fewer people who know how to repair them.

It’s easy to despair, once you see the ruins around you, and to begin to think that you belong to a narrative of inevitable decline. But those farmers would’ve been wrong if they’d concluded humanity had declined. The poet who wrote “The Ruin” was wrong when he thought the Romans had died out a hundred generations earlier. Rome was still flourishing even then, though it was far from Britain.

Rome’s decline began in earnest when its age of merit ended, ushering in nepotism, cronyism, corruption, and spectacle—bread and circuses. After Septimius Severus and his family’s brief dynasty held the empire together a while longer through military might, Rome fell into the Crisis of the Third Century, a war-torn period in which the empire nearly collapsed: fifty years with twenty-six emperors. Even this wasn’t the end for Rome. Aurelian reunited the empire, and it limped along for another two hundred years before falling to barbarians.

The Western Roman Empire fell, at least. In 285 CE, Diocletian, who came to power a decade after Aurelian, had divided the empire into eastern and western halves because it had grown too large to be manageable. And so when the western empire fell late in the fifth century CE, Rome lived on. We sometimes miss this story because, just as we impose the narrative that Septimius Severus was black, our label “Byzantine Empire,” a categorization we create to understand the world, makes us see it incorrectly. The people we label Byzantines just saw themselves as Roman.

The Eastern Roman Empire lasted another thousand years before falling to the Ottomans in 1453 CE. So, you see, whether Rome lasted hundreds of years or thousands is simply a matter of definition. Maybe America will be the same way.
