Part I

What Is the Metaverse?

Chapter 1

A Brief History of the Future

THE TERM “METAVERSE” WAS COINED BY AUTHOR Neal Stephenson in his 1992 novel Snow Crash. For all its influence, Stephenson’s book provided no specific definition of the Metaverse, but what he described was a persistent virtual world that reached, interacted with, and affected nearly every part of human existence. It was a place for labor and leisure, for self-actualization as well as physical exhaustion, for art alongside commerce. At any given time, there were roughly 15 million human-controlled avatars on “The Street,” which Stephenson called “the Broadway, the Champs Elysees of the Metaverse” and which stretched across the entirety of a virtual planet more than two and a half times the size of the earth. As a point of contrast, there were fewer than 15 million total users of the internet in the real world the year Stephenson’s novel was published.

While Stephenson’s vision was vivid and, to many, inspiring, it was also dystopic. Snow Crash is set at some point in the early 21st century, years after a global economic collapse. Most layers of government have been replaced by for-profit “Franchise-Organized Quasi-National Entities” and “burbclaves,” a contraction of the term “suburban enclaves.” Each burbclave operates as a “city-state with its own constitution, a border, laws, cops, everything”1 and some even provide “citizenship” purely based on race. The Metaverse offers refuge and opportunity to millions. It is a virtual place where a pizza deliverer in the “real world” can be a talented swordsman with inside access to the hottest clubs. But Stephenson’s novel is clear: in Snow Crash, the Metaverse has made life in the real world worse.

As with Vannevar Bush, Stephenson’s influence on modern technology only grows with time, even if he is mostly unknown to the public. Conversations with Stephenson helped inspire Jeff Bezos to found the private aerospace manufacturer and suborbital spaceflight company Blue Origin in 2000, with the author working there part-time until 2006, when he became a senior advisor to the company (a position he still holds). As of 2021, Blue Origin is considered the second most valuable company of its kind, ranked only behind Elon Musk’s SpaceX. Two of the three founders of Keyhole, whose flagship product is now known as Google Earth, have said their visions were informed by a similar product described in Snow Crash, and that they once tried to recruit Stephenson to the company. From 2014 to 2020, Stephenson was also “Chief Futurist” at Magic Leap, a mixed reality company that was also inspired by his work. The company later raised over half a billion dollars from corporations including Google, Alibaba, and AT&T, attaining a peak valuation of $6.7 billion, before struggles to realize its vaulting ambitions resulted in a recapitalization and the departure of its founder.* Stephenson’s novels have been cited as the inspiration for various cryptocurrency projects and non-cryptographic efforts to build decentralized computer networks, as well as the production of CGI-based movies which are watched at home but generated live through the motion-captured performances of actors who might be thousands of miles away.

Despite his far-reaching impact, Stephenson has consistently warned against a literal interpretation of his works—especially Snow Crash. In 2011, the novelist told the New York Times that “I can talk all day long about how wrong I got it”2 and, when asked about his influence on Silicon Valley by Vanity Fair in 2017, he reminded the publication to keep “in mind that [Snow Crash was written] pre-Internet as we know it, pre-Worldwide Web, just me making shit up.”3 As a result, we should be wary of reading too much into Stephenson’s specific vision. And while he coined the term “Metaverse,” he was far from the first to introduce the concept.

In 1935, Stanley G. Weinbaum wrote a short story titled “Pygmalion’s Spectacles,” about the invention of magical VR-like goggles that produced a “movie that gives one sight and sound . . . you are in the story, you speak to the shadows, and the shadows reply, and instead of being on a screen, the story is all about you, and you are in it.”†4 Ray Bradbury’s 1950 short story “The Veldt” imagines a nuclear family in which the parents are supplanted by a virtual reality nursery that the children never want to leave. (The children eventually lock their parents inside the nursery, which then kills them.) Philip K. Dick’s 1953 story “The Trouble with Bubbles” is set in an era where humans have explored deep into outer space but never succeeded in finding life. Yearning to connect with other worlds and life-forms, consumers begin to buy a product called “Worldcraft,” through which they can build and “Own [Their] Own World.” These worlds are cultivated to the point of producing sentient life and fully realized civilizations, though most Worldcraft owners eventually destroy their creations in what Dick described as a “neurotic” “orgy of breaking,” the diversion of a god suffering from ennui. A few years later, Isaac Asimov’s novel The Naked Sun was published. In it, he described a society where face-to-face interactions (“seeing”) and physical contact are considered both wasteful and repugnant, and most work and socializing takes place via remotely projected holograms and 3D televisions.

In 1984, William Gibson popularized the term “cyberspace” in his novel Neuromancer, defining it as “A consensual hallucination experienced daily by billions of legitimate operators, in every nation. . . . A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding.” Notably, Gibson called the visual abstraction of cyberspace “The Matrix,” a term repurposed by Lana and Lilly Wachowski 15 years later for their film of the same name. In the Wachowskis’ movie, the Matrix refers to a persistent simulation of the planet earth as it was in 1999, but which all of humanity is unknowingly, indefinitely, and forcibly connected to in the year 2199. The purpose of this simulation is to placate the human race so that it can be used as bioelectric batteries by the sentient, but man-made, machines which conquered the planet in the 22nd century.

The Program Is More Optimistic than the Pen

Whatever the differences among each specific author’s visions, the synthetic worlds of Stephenson, Gibson, the Wachowskis, Dick, Bradbury, and Weinbaum are all presented as dystopias. Yet there is no reason to assume that such an outcome is inevitable, or even likely, for the actual Metaverse. A perfect society tends not to make for much human drama, and human drama is the root of most fiction.

As a point of contrast, we can consider the French philosopher and cultural theorist Jean Baudrillard, who coined the term “hyperreality” in 1981 and whose works are often linked to those of Gibson, and those Gibson influenced. Baudrillard described hyperreality as a state in which reality and simulations were so seamlessly integrated that they were indistinguishable. Though many find this idea frightening, Baudrillard argued that what mattered was where individuals would derive more meaning and value—and speculated it would be in the simulated world.5 The idea of the Metaverse is also inseparable from the ideas of the Memex, but where Bush imagined an infinite series of documents linked together via words, Stephenson and others conceived infinitely interconnected worlds.

More instructive than Stephenson’s texts and those which inspired them are the many efforts to build virtual worlds over the past several decades. This history not only shows a multi-decade progression towards the Metaverse, but also reveals more about its nature. These would-be Metaverses have not been centered on subjugation or profiteering, but on collaboration, creativity, and self-expression.

Some observers date the history of “proto-Metaverses” to the 1950s and the rise of mainframe computers, which for the first time let individuals share purely digital messages with one another across a network of different devices. Most, however, start in the 1970s with text-based virtual worlds known as Multi-User Dungeons (MUDs). MUDs were effectively a software-based version of the role-playing game Dungeons & Dragons. Using text-based commands that resembled human languages, players could interact with one another, explore a fictional world populated by non-playable characters and monsters, attain power-ups and knowledge, and eventually retrieve a magical chalice, defeat an evil wizard, or rescue a princess.

The growing popularity of MUDs inspired the creation of Multi-User Shared Hallucinations (MUSHs) and Multi-User Experiences (MUXs). Unlike MUDs, which asked players to carry out specific roles in the context of a specific and usually fantastical narrative, MUSHs and MUXs enabled participants to collaboratively define the world and its objective. Players might choose to set their MUSH in a courtroom, while taking on roles such as defendant, attorney, plaintiff, judge, and members of the jury. One participant might later decide to transform the relatively mundane proceedings into a hostage situation—which would then be defused by a poem mad-libbed by the other players.

The next great leap came in 1986 with the release of the Commodore 64 online game Habitat, which was published by Lucasfilm, the production company founded by Star Wars creator George Lucas. Habitat was described as “a multi-participant online virtual environment” and, in a reference to Gibson’s novel Neuromancer, “a cyberspace.” Unlike MUDs and MUSHs, the world of Habitat was graphical, thereby allowing users to actually see virtual environments and characters, though only in pixelated 2D. It also afforded players far greater control over the in-game environment. “Citizens” of Habitat were in charge of the laws and expectations of their virtual world, and had to barter with each other for necessary resources and avoid being robbed or killed for their wares. This challenge led to periods of chaos, after which new rules, regulations, and authorities were established by the player community to maintain order.

Though Habitat is not as widely remembered as other 1980s video games, such as Pac-Man and Super Mario Bros., it transcended the niche appeal of MUDs and MUSHs, ultimately becoming a commercial hit. The title was also the first game to repurpose the Sanskrit term “avatar,” which roughly translates to “the descent of a deity from a heaven,” to refer to a user’s virtual body. Decades later, this usage has become convention—in no small part because Stephenson reapplied it in Snow Crash.

No 1990s “proto-Metaverse” became a mainstream hit, but advances continued throughout the decade. Millions of consumers took part in the first isometric 3D (also known as 2.5D) virtual worlds, which gave the illusion of three-dimensional space but only allowed users to move across two axes. Not long after, full 3D virtual worlds emerged. A number of games, such as 1994’s Web World and 1995’s Activeworlds, also empowered users to collaboratively build a visible virtual space in real time, rather than through asynchronous commands and votes, and introduced a number of graphic/symbol-based tools to make world-building easier. Notably, Activeworlds also had the express purpose of building Stephenson’s Metaverse, asking players to not just enjoy its virtual worlds, but to invest in expanding and populating it. In 1998, OnLive! Traveler launched with spatial voice chat, which allowed users to hear other participants based on the relative positions of their avatars, and made an avatar’s mouth move in response to the words spoken by its player.6 The following year, Intrinsic Graphics, a 3D gaming software company, completed the spinoff of Keyhole. While Keyhole did not become broadly popular until the middle of the next decade, after its acquisition by Google, it represented the first time anyone on earth could access a virtual reproduction of the entire planet. In the ensuing 15 years, much of the map was updated to partial 3D and connected to Google’s much larger database of mapping products and data, enabling users to also overlay information such as real-time traffic.

It was with the launch of (the aptly named) Second Life in 2003 that many, especially those in Silicon Valley, began to contemplate the prospect of a parallel existence that would take place in virtual space. In its first year, Second Life attracted over one million regular users, and shortly thereafter, numerous real-world organizations established their own businesses and presences inside the platform. This included for-profit corporations such as Adidas, BBC, and Wells Fargo, as well as nonprofits such as the American Cancer Society and Save the Children and even universities, including Harvard, whose law school offered exclusive courses inside Second Life. In 2007, a stock exchange was launched on the platform with the aim of helping Second Life–based companies raise capital using the platform’s Linden Dollars currency.

Crucially, developer Linden Lab did not intermediate transactions in Second Life, nor actively manage what was made or sold. Instead, transactions were made directly between buyers and sellers and based on perceived value and need. Overall, Linden Lab operated more like a government than a game-maker. The company did provide some user-facing services, such as identity management, ownership records, and an in-world legal system. But its focus wasn’t on building out the Second Life universe directly. Instead, it enabled a thriving economy via ever-improving infrastructure, technical capabilities, and tools. These attracted more developers and creators, who in turn created things for other users to do, places for them to visit, and items for them to buy—attracting more users and therefore more spending, which would in turn attract more investment from developers and creators. To this end, Second Life also offered users the ability to import virtual objects and textures made outside the platform. By 2005, just two years after it launched, Second Life’s annualized GDP exceeded $30 million. By 2009, it exceeded half a billion dollars, with users cashing out $55 million into real-world currency that year.

For all the success of Second Life, it was the rise of the virtual world platforms Minecraft and Roblox that brought its ideas to a mainstream audience in the 2010s. In addition to offering significant technical enhancements over their predecessors, Minecraft and Roblox focused on children and teenage users, and were therefore designed to be far easier to use, not just more capable. The results have been astounding.

Throughout the 2010s, bands of users collaborated in Minecraft to build cities as large as Los Angeles—roughly 500 square miles. One video game streamer, Aztter, constructed a stunning cyberpunk city out of an estimated 370 million Minecraft blocks, having worked an average of 16 hours per day for a year.7 Scale is not the sole achievement of the platform. In 2015, Verizon built a cellphone inside Minecraft that could make and receive live video calls to the “real world.” As the COVID-19 virus spread across China in February 2020, a community of Chinese Minecraft players rapidly re-created the 1.2-million-square-foot hospitals built in Wuhan as a tribute to the “IRL” (“in real life”) workers, receiving global press coverage.8 One month later, Reporters Sans Frontières (also known as Reporters Without Borders) commissioned the construction of a museum within Minecraft that was composed of over 12.5 million blocks assembled by 24 virtual builders in 16 different countries over some 250 hours combined. The Uncensored Library, as it was called, allowed users in countries such as Russia, Saudi Arabia, and Egypt to read banned literature, as well as works promoting free speech and detailing the lives of journalists such as Jamal Khashoggi, whose murder was ordered by political leaders in Saudi Arabia.

By the end of 2021, more than 150 million people were using Minecraft each month—more than six times as many as in 2014, when Microsoft bought the platform. Despite this, Minecraft was far from the size of the new market leader, Roblox, which had grown from fewer than 5 million to 225 million monthly users over that same period. According to Roblox Corporation, 75% of children ages 9 to 12 in the United States regularly used the platform in Q2 2020. Combined, the two titles amassed more than 6 billion hours of monthly usage, spanning more than 100 million different in-game worlds designed by over 15 million users. The Roblox game with the most lifetime plays—Adopt Me!—was created by two hobbyist players in 2017 and enabled users to hatch, raise, and trade various pets. By the end of 2021, Adopt Me!’s virtual world had been visited more than 30 billion times—more than fifteen times the total number of global tourism visits in 2019. Furthermore, developers on Roblox, many of them small teams with fewer than 30 members, have received more than $1 billion in payments from the platform. By the end of 2021, Roblox had become the most valuable gaming company outside of China, worth nearly 50% more than storied gaming giants Activision Blizzard and Nintendo.

Alongside the enormous growth in Minecraft’s and Roblox’s audiences and developer communities, many other platforms began to emerge and grow towards the tail end of the 2010s. In December 2018, for example, the blockbuster video game Fortnite launched Fortnite Creative Mode, its own riff on Minecraft’s and Roblox’s world-building. Meanwhile, Fortnite was also transforming into a social platform for non-game experiences. In 2020, hip-hop star Travis Scott hosted a concert that was attended live by 28 million players, with millions more watching live on social media. The track Scott premiered during the concert, which featured Kid Cudi, debuted at #1 on the Billboard Hot 100 chart a week later, was Cudi’s first #1 track, and finished 2020 as the third-largest US debut of the year. In addition, several of the tracks Scott performed from his two-year-old Astroworld album returned to the Billboard charts after the concert. Eighteen months later, Fortnite’s official event video had accumulated nearly 200 million views on YouTube.

The multi-decade history of social virtual worlds, from MUDs to Fortnite, helps explain why the ideas of the Metaverse have recently shifted from science fiction and patents to the forefront of consumer and enterprise technology. We are now at the point when these experiences can appeal to hundreds of millions and their bounds are more about the human imagination than technical limitation.

In mid-2021, only weeks before Facebook unveiled its Metaverse intentions, Tim Sweeney, CEO and founder of Fortnite maker Epic Games, tweeted prerelease code from the company’s 1998 game Unreal, adding that players “could go into portals and travel among user-run servers when Unreal 1 was released in 1998. I remember a moment where folks in the community had created a grotto map with no combat and were standing in a circle chatting. This style of play didn’t last for long though.”9 A few minutes later, he added: “We’ve had metaverse aspirations for a very, very long time . . . but only in recent years have a critical mass of working pieces started coming together rapidly.”10

This is the arc of all technological transformations. The mobile internet has existed since 1991, and was predicted long before. But it was only in the late 2000s that the requisite mix of wireless speeds, wireless devices, and wireless applications had advanced to the point where every adult in the developed world—and within a decade, most people on earth—would want and be able to afford a smartphone and broadband plan. This in turn led to a transformation of digital information services and human culture at large. Consider the following: when instant messaging pioneer ICQ was acquired by internet giant AOL in 1998, it had 12 million users. A decade later, Facebook had over 100 million monthly users. By the end of 2021, Facebook had 3 billion monthly users, with some 2 billion using the service daily.

Some of this change, too, is a result of generational succession. For the first two or so years following the release of the iPad, it was common to see press reports and viral YouTube videos of infants and young children who would pick up an “analogue” magazine or book and try to “swipe” its nonexistent touchscreen. Today, those one-year-olds are eleven to twelve. A four-year-old in 2011 is now well on her way to adulthood. These media consumers are now spending their own money on content—and some are already creating content themselves. And while these once-uncomprehending consumers now understand why adults found their futile efforts to pinch-to-zoom a piece of paper so comic, older generations are not much closer to understanding how the worldviews and preferences of the young differ from their own.

Roblox is the perfect case study of this phenomenon. The platform launched in 2006, and roughly a decade passed before it had much of an audience. Another three years went by before non-players really noticed the title (and those who did largely scoffed at its low-fidelity graphics). Two years later, it was one of the biggest media experiences in history. This 15-year timeline is partly a result of technical improvements, but it’s no coincidence that Roblox’s core users are the very children who grew up “iPad Native.” The success of Roblox, in other words, required other technologies to influence how consumers thought, not just to enable it in the first place.

The Coming Fight to Control the Metaverse (and You)

Over the past 70 years, “proto-Metaverses” have grown from text-based chats and MUDs to vivid networks of virtual worlds with populations and economies that rival small nations. This trajectory will continue in the decades to come, bringing more realism, diversity of experiences, participants, cultural influence, and value to virtual worlds. Eventually, a version of the Metaverse as imagined by Stephenson, Gibson, Baudrillard, and others will be realized.

There will be many wars for supremacy in and over this Metaverse. They will be fought between tech giants and insurgent start-ups through hardware, technical standards, and tools, as well as content, digital wallets, and virtual identities. This fight will be motivated by more than just revenue potential or the need to survive the “pivot to Metaverse.”

In 2016, a year before his company’s release of Fortnite and long before the term “Metaverse” entered public consciousness, Tim Sweeney told reporters: “This Metaverse is going to be far more pervasive and powerful than anything else. If one central company gains control of this, they will become more powerful than any government and be a God on Earth.”§11 It is easy to find such a statement hyperbolic. The provenance of the internet, however, suggests that it may not be.

The foundation of today’s internet was built over several decades and through a variety of consortiums and informal working groups composed of government research labs, universities, and independent technologists and institutions. These mostly not-for-profit collectives typically focused on establishing open standards that would help them share information from one server to another, and in doing so make it easier to collaborate on future technologies, projects, and ideas.

The benefits of this approach were far-ranging. For example, anyone with an internet connection could build a website in minutes and at no cost using pure HTML, and even faster using a platform like GeoCities. A single version of this site was (or at least could be) accessed by every device, browser, and user connected to the internet. In addition, no user or developer needed an intermediary—they could produce content for, and speak to, anyone they wanted. The use of common standards also meant that it was easier and cheaper to hire and work with outside vendors, integrate third-party software and apps, and repurpose code. The fact that so many of these standards were free and open-source meant that individual innovations often benefited the entire ecosystem, while placing competitive pressures on paid, proprietary standards, and helping to check the rent-seeking tendencies of platforms sitting between the web and its users (e.g., device manufacturers, operating systems, browsers, and ISPs).

Importantly, none of this prevented businesses from making a profit on the internet, deploying a paywall, or building proprietary technology. Rather, the “openness” of the internet enabled more companies to be built, in more areas, reaching more users, and achieving greater profits, while also preventing pre-internet giants (and, crucially, telecom companies) from controlling it. Openness is also why the internet is largely considered to have democratized information, and why the majority of the most valuable public companies in the world today were founded (or were reborn) in the internet era.

It’s not difficult to imagine how different the internet would be if it had been created by multinational media conglomerates in order to sell widgets, serve ads, harvest user data for profits, or control users’ end-to-end experience (something AT&T and AOL both tried but failed to pull off). Downloading a JPG could cost money, and a PNG could cost 50% more. Video calls might have only been possible through a broadband operator’s own app or portal—and only to those who also had that same broadband provider (imagine something like, “Welcome to your Xfinity Browser™, click here for Xfinitybook™ or XfinityCalls™ powered by Zoom™; Sorry, ‘Grandma’ is not in our network, but for $2, you can still call her . . .”). Imagine if it took a year or a thousand dollars to make a website. Or if websites only worked in Internet Explorer or Chrome—and you had to pay a given browser an annual fee for the privilege of using it. Or maybe you would have to pay your broadband provider extra fees to read certain programming languages or use a given web technology (imagine, again, “This website requires Xfinity Premium with 3D”). When the United States sued Microsoft in 1998 for alleged antitrust violations, it centered its case on Microsoft’s decision to bundle Internet Explorer, the company’s proprietary web browser, with the Windows operating system (OS). Yet if a corporation had created the internet, is it conceivable that it would have even allowed a competing browser? If so, would it have allowed users to do whatever they wanted on these browsers, or access (and modify) whichever sites they chose?

A “corporate internet” is the current expectation for the Metaverse. The internet’s nonprofit nature and early history stem from the fact that government research labs and universities were effectively the only institutions with the computational talent, resources, and ambitions to build a “network of networks,” and few in the for-profit sector understood its commercial potential. None of this is true when it comes to the Metaverse. Instead, it is being pioneered and built by private businesses, for the explicit purpose of commerce, data collection, advertising, and the sale of virtual products.

What’s more, the Metaverse is emerging at a time when the largest vertical and horizontal tech platforms have already established enormous influence over our lives, as well as the technologies and business models of the modern economy. This power partly reflects the profound feedback loops of the digital era. Metcalfe’s Law, for example, states that the value of a communication network is proportional to the square of the number of its users—a relationship that helps keep large social networks and services growing and presents a challenge to upstart competitors. Any business based on artificial intelligence or machine learning benefits from similar advantages as its dataset grows. The primary business models of the internet—advertising and software sales—are also scale-driven, as the companies that sell another ad slot or app encounter almost no incremental cost from doing so, and both advertisers and developers focus primarily on where consumers already are, rather than where they might be.
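Metcalfe’s intuition can be checked with simple arithmetic. The sketch below is a simplification that treats a network’s value as proportional to the number of possible user-to-user connections, which grows with roughly the square of the user count; it shows why doubling a network’s user base roughly quadruples its value, a dynamic that heavily favors incumbents.

```python
def metcalfe_value(users: int) -> int:
    """Number of possible pairwise connections among `users` people.

    Metcalfe's Law approximates network value as proportional to this
    figure: n * (n - 1) / 2, which grows with the square of n.
    """
    return users * (users - 1) // 2

# Doubling the user base roughly quadruples the connection count:
print(metcalfe_value(1_000))  # 499500
print(metcalfe_value(2_000))  # 1999000 (about 4x, not 2x)
```

By this rough measure, an upstart with a tenth of an incumbent’s users offers closer to a hundredth of its network value, which is one way to see why challengers so rarely displace established social platforms.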

But to secure their user and developer bases while also expanding into new areas and blocking potential competitors, the tech giants have spent the past decade closing their ecosystems. They’ve done this by forcibly bundling together their many services, preventing users and developers from easily exporting their own data, shutting down various partner programs, and stymying (if not outright blocking) for-profit and even open standards which might threaten their hegemony. These maneuvers, mixed with the feedback loops that come from having comparatively more users, data, revenue, devices, etc., have effectively closed much of the internet. Today, a developer must essentially receive a platform’s permission, and pay it, in order to reach users. Users, for their part, have little ownership of their online identity, data, or entitlements.

It is here that fears of a Metaverse dystopia seem fair, rather than alarmist. The very idea of the Metaverse means an ever-growing share of our lives, labor, leisure, time, wealth, happiness, and relationships will be spent inside virtual worlds, rather than just extended or aided through digital devices and software. It will be a parallel plane of existence for millions, if not billions, of people, that sits atop our digital and physical economies, and unites both. As a result, the companies that control these virtual worlds and their virtual atoms will likely be more dominant than those who lead in today’s digital economy.

The Metaverse will also render more acute many of the hard problems of digital existence today, such as data rights, data security, misinformation and radicalization, platform power and regulation, abuse, and user happiness. The philosophies, culture, and priorities of the companies that lead in the Metaverse era, therefore, will help determine whether the future is better or worse than our current moment, rather than just more virtual or remunerative.

As the world’s largest corporations and most ambitious start-ups pursue the Metaverse, it’s essential that we—users, developers, consumers, and voters—understand that we have agency over our future and the ability to reset the status quo. Yes, the Metaverse can seem daunting and scary, but it also offers a chance to bring people closer together, to transform industries that have long resisted disruption and that must evolve, and to build a more equal global economy. This leads us to one of the most exciting aspects of the Metaverse: how poorly understood it is today.

* The company’s valuation was ultimately reduced by more than two-thirds, with the company’s investors hiring Peggy Johnson, a long-time executive vice president at Qualcomm and Microsoft, to lead as CEO. It was during this time that Stephenson left the company, along with many other full-time employees and senior executives.

† Pygmalion is a reference to the mythological Cypriot king Pygmalion. In Ovid’s epic poem Metamorphoses, Pygmalion carves a sculpture so beautiful and lifelike that he falls in love with it; the goddess Aphrodite transforms the statue into a living woman, whom he marries.

‡ When asked about Baudrillard in April 1991, Gibson said, “He’s a cool science-fiction writer” (Daniel Fischlin, Veronica Hollinger, Andrew Taylor, William Gibson, and Bruce Sterling, “‘The Charisma Leak’: A Conversation with William Gibson and Bruce Sterling,” Science Fiction Studies 19, no. 1 [March 1992], 13). The Wachowskis tried to involve Baudrillard in their film, but he declined and later described the film as a misread of his ideas (Aude Lancelin, “The Matrix Decoded: Le Nouvel Observateur Interview with Jean Baudrillard,” Le Nouvel Observateur 1, no. 2 [July 2004]). When Morpheus introduces the film’s protagonist to the “real world,” he tells Neo “As in Baudrillard’s vision, your whole life has been spent inside the map, not the territory.” (Lana Wachowski and Lilly Wachowski, The Matrix, directed by Lana Wachowski and Lilly Wachowski [1999; Burbank, CA: Warner Bros., 1999], DVD.) Recall, too, Tencent’s original name for its Metaverse vision: “hyper-digital reality.”

§ In its ruling for Epic Games, Inc. v. Apple Inc., the district court wrote “[It] generally finds Mr. Sweeney’s personal beliefs about the future of the metaverse are sincerely held” (Epic Games, Inc. v. Apple Inc., U.S. District Court, Northern District of California, Case 4:20-cv-05640-YGR, Document 812, filed September 10, 2021).
