Chapter Twenty-Three

Feeding the Future

In one area of the world economy, resources remain infinite and the crisis of overproduction can never happen. Nevertheless, the phenomenon of intellectual property still bears the shape of its landed ancestor, especially in the assumption that the owner has a natural right to its exclusive possession that the law must protect. And because there appears to be no limit to the unknown world that might be divided up into patentable ideas, it seemed, until recently, as though a frontier territory were perpetually opening up ahead that might be settled, marked out, and turned into property simply by the efforts of the pioneer.

The tumultuous growth of intellectual property can be measured through the frequency of the term’s use in U.S. federal courts—from an occasional reference up to the 1970s, it suddenly burgeoned to more than two hundred mentions in the 1980s, then went through the roof, more than quadrupling in each of the next two decades—and in the figures conjectured for its economic value. In America, a 2011 assessment of all industries heavily dependent on intellectual property deemed them to have created about 40 million jobs and to be worth five trillion dollars, or more than 34 percent of GDP. Using the narrower definition of direct investment, the British government estimated the value of intellectual property in 2011 at sixty-five billion pounds, more than one hundred billion dollars, or about 4 percent of GDP, and noted that during the previous decade investment in it had become greater than investment in business hardware.

All such figures are rendered suspect, however, by the vagueness of the term, and the constant extension of new claims to property rights in such unexpected areas as the lighting of the Eiffel Tower, the identification of a gene, and the shopping habits of a credit card user. One trend, however, is unmistakable—a corporate goal to make property out of anything that has required original mental activity. Without much thought, the trend has been backed by both courts and governments on the assumption that it will stimulate innovation with benefit to all. That assumption is loosely based on the phenomenal success of private property economies following the transformation in the eighteenth century of patents from royal monopoly into propertied ownership.

However, as the early industrialists understood, the connection between patent ownership and innovation was not direct. In reality, inventiveness was initially stimulated less by the law’s defense than by the wide dispersal of information that resulted from the reluctance of judges to grant innovators a monopoly on the use of their ideas. A patent could only be justified, said England’s Chief Justice Mansfield, if it was for something “substantially and essentially newly invented,” and if “the specification [accompanying the claim for a patent] is such as instructs others to make it.”

Even in the United States, where patents were more easily granted, the 1836 Patent Act made clear that such a monopoly could only be in the public interest if in return the inventor provided a model to show how the idea worked and a description written “in such full, clear and exact terms . . . as to enable any person skilled in the art or science to which it appertains . . . to make, construct, compound and use the same.” As late as 1863, the German congress of economists questioned whether the innovation was worth the cost of the monopoly, and concluded “that patents of invention are injurious to the common welfare.” Although every industrialized country adopted the 1883 Paris Convention protecting patents and the 1886 Berne Convention protecting copyrights, well into the last quarter of the twentieth century the inadequacy of legal protection available to inventors and industrialists left many reluctant to file for patents for fear that a rival might copy the specification, modifying it sufficiently to avoid being sued for infringement.

A sea change occurred in the course of the 1980s and 1990s. Not only were defenses strengthened, they were extended from inventions and copyrights to cover trade secrets, distinctive designs, and trademarks. Between 1983 and 2010, the number of patents issued each year in the United States increased almost fourfold, from 64,000 to 244,000, and in the United Kingdom from just over two thousand to more than ten thousand. Under pressure from the motion picture industry, the life of copyrights was extended from a renewable twenty-eight-year term to seventy years after an author’s death, while other major corporate interest groups persuaded the courts that the appearance of a fast-food restaurant and the development of a surgical procedure should each be judged a property exclusive to its creator. Any lingering belief that the public interest might be harmed by such monopolies disappeared.

The Uruguay round of trade negotiations, concluded in 1994, brought into existence both the World Trade Organization and TRIPS, the agreement on Trade-Related Aspects of Intellectual Property Rights. Every signatory nation was required to accept as intellectual property anything deemed to be so in another country “without discrimination as to the place of invention, the field of technology and whether products are imported or locally produced.” Only where patents endangered public order or the environment could a national government modify the agreement.

Without public discussion, one particular kind of ownership was embodied in the TRIPS agreement and planted in the legal structure of every signatory nation of the WTO. Ignoring past concerns that monopolies constituted a restraint of trade, the TRIPS agreement asserted bluntly and fallaciously that “intellectual property rights are private rights” which had to be protected. Varieties of communal ownership of knowledge, such as the use of the periwinkle plant in traditional folk medicines in Madagascar, were deemed ineligible for protection, while the private, exclusive property that the pharmaceutical company Eli Lilly claimed for its research into the plant after learning of its medical properties was instantly recognized, allowing the company to develop products from it worth two hundred million dollars worldwide. With TRIPS, the idea of private exclusive property that first appeared in relation to land in sixteenth-century England could be said to have conquered the world.

However, in the very year that the TRIPS agreement dispersed this system around the globe, the central edifice of intellectual rights threatened to evaporate through the even faster expansion of digitized information across the World Wide Web. In a prescient article written in 1994, John Perry Barlow, Grateful Dead lyricist and cofounder of the Electronic Frontier Foundation, warned of the impact of being able to express all information in binary form: “If our property can be infinitely reproduced and instantaneously distributed all over the planet without cost, without our knowledge, without its even leaving our possession, how can we protect it?”

Music, originally acoustic then electronic, proved especially vulnerable and attractive to copiers once it had been digitized, giving rise to a succession of file-sharing sites, from Napster in 1999 to Megaupload and Pirate Bay in 2012, that allowed tracks and albums to be freely shared as MP3 files. Images removed from celluloid could be passed on almost as easily. Although lawyers for the artists and industries concerned sued for breach of copyright and succeeded in closing down the most prominent sites, their extension of property rights to digitized information was achieved at high cost to the public interest.

The problem was identified by Barlow, who pointed out that the nature of digitized property, in essence information about something in the physical world, rendered it so elusive that establishing ownership risked opening up an endless legacy chain as later information was discovered to have incorporated earlier information. Consequently, a successful claim to ownership of an early part of the chain had the potential to give ownership of all later information connected to it and of the inventions that stemmed from it—a nightmare scenario that gave rise to a series of billion-dollar court battles over smartphone technology in 2012.

There were notable exceptions to individually owned intellectual property—Wikipedia made its information freely available, propagating rather than possessing it, and Linus Torvalds, the Finnish inventor of the Linux operating system used in Android smartphones, developed it as an open-source, cooperative venture—but their rarity seemed to prove the rule. The individualistic culture that had built up around the strange sixteenth-century way of owning the earth had become the norm by which other values were measured.

When Microsoft executives attacked Linux as “cancerous” and “un-American” in 2001, they were demonstrating not only how seriously they took the threat to their own privately owned operating system, but also their deep-seated, ideological reaction to an alternative built on different values. The same conceptual barrier had led Senator Henry Dawes in 1885 to dismiss the Cherokee tradition of owning the land in common because it did not encourage individualism. “There is no enterprise to make your home any better than that of your neighbor’s,” Dawes declared. “There is no selfishness, which is at the bottom of civilization.”

Today, Dawes’s failure to appreciate the value of any other principle, especially in respect to Native American culture, seems old-fashioned, but the owners of intellectual property who persuaded public opinion, law courts and legislatures, as well as the WTO, that greater protection of their individual rights would lead to greater innovation and more enterprise were thinking in exactly the same way.

Counterintuitive though it may seem, there is no independently sourced evidence to show that stronger patent rights promote inventiveness. In fact, the reverse is true. Research into innovation in sixty countries by Harvard’s Professor Josh Lerner, covering a period of 150 years from the 1850s, showed that strengthening patents not only failed to encourage innovation, “the impact of patent protection-enhancing shifts on applications by residents was actually negative.” Following a more detailed examination of the spread of protection since the 1980s, Lerner described its effects as an “innovation tax.”

Despite the exponential rise in the number of patents, spending on research and development did not increase in either the United States or Britain, and the yields from each patent fell. The reason was simple. Every monopoly is a restraint of trade, and the more there are, the smaller the scope for new enterprises. Not only did the proliferation of patents and the spread of intellectual property hamper research in general, but the likelihood of expensive litigation deterred many firms from seeking to develop specific products. The effect on small and medium-sized enterprises was especially marked, since many decided to cut back on all research and development as a wasted expense. Among those firms that did continue to fund research, the budget for legal costs to defend their developments rose to at least 25 percent of what was spent on research. More insidiously, as even researchers in universities and government-funded institutions concede, what used to be a culture of sharing information on new discoveries has been replaced by one of secrecy for fear that a rival might patent results and so impede future development.

Digitization exaggerated the corrosive effect. In 1994, Barlow had warned that digitized property “would adhere to those who can muster the largest armies,” and in 2010 Lerner discovered that indeed “an arms race” had developed as “the ability to litigate and expect to get substantial awards from litigation increased.” In other words, the usefulness of any new idea had become of lesser importance than the chance of using the patent on it to pry damages from a rival or to knock the competition out of some potentially lucrative line of development. The damage done to the public domain by the extent of the monopolies never became an issue during these courtroom battles. Yet for all but the last few decades of inventive history, that had always ranked as a major concern.

Monopoly, by definition, damages competitiveness, but nowhere more destructively than in the realm of ideas. The consequences are already obvious in the pharmaceutical industry. Using extensive political influence and the greater legal powers available to them since the early 1990s both domestically and internationally through TRIPS, pharmaceutical companies like Pfizer and Roche spent two decades successfully defending wide-ranging patents on their products. But a whole generation of patents issued in the 1990s for blockbuster drugs, such as the acid reflux treatment Nexium and the antipsychotic Abilify, with annual sales of $140 billion, have begun to expire—the industry calls it a “patent cliff”—without any new breakthrough medicines to replace them.

Significantly, the serious work on most of these mega-earners began in the 1980s, before the most draconian protection was in place, and was grounded often enough in still earlier research carried out in university laboratories where information was freely shared. In other words, innovation flourished, as it did at the start of the Industrial Revolution, when patent support was weak and the number of monopolies granted was small.

Nevertheless, it is easy to understand why the great barons of intellectual property should behave like their landed predecessors, the Polish szlachta and eighteenth-century British aristocracy, and use all their power to protect their estates against trespassers. The very nature of exclusive ownership rewards those who can best defend it. That is the prize that drives owners to seek political influence. But the efficiency of the marketplace, both political and economic, makes it vital that their defense should never succeed. No politician who believes in democracy or economist who supports free enterprise should back any extension of patent law.

In the five centuries since the first private property society began to emerge, the justification for this monstrous method of owning the earth has constantly veered between the arguments of Hobbes and Locke: either it is a creation of civil law enforced by the power of the state or it is the realization of a profound and inescapable sense of justice innate in humankind. The difference between the two remains implicit in John Locke’s original formulation, with its proviso that property could justly be carved out of the common stock only so long as “enough, and as good” was left over for others. The point at which there is no longer enough and as good left over is the point where those rights cease to rest on natural justice and rest instead on the corrosive impact of expensive lawyers and skewed courts.

Where the enclosure of intellectual property is concerned, the loss to the public interest may be slow and difficult to see, but in the ownership of the earth, the matrix for the entire structure of property law, the outcome is unmistakable. And as the world approaches the choke point of 2050, when nine billion lives must be sustained and sheltered by its finite resources, the basis on which its ownership rests will become critical.

The countdown began in 2008. Around the world, the cost of grain doubled within nine months, and other products, such as oilseed and pork, rose by more than two thirds. Although prices fell back in the next two years, another jump in 2011 to levels in some cases higher than before signified the end of the era of cheap food that began in the nineteenth century with the sailing of the Dunedin from New Zealand. “Our mindset was surpluses,” Dan Glickman, a former United States secretary of agriculture, remarked when the second surge occurred in 2011. “That has just changed overnight.”

Some causes were temporary—drought in Russia and Australia reduced wheat supplies, while disease and floods in Indonesia and Thailand hit rice production—but one was systemic: the creation of bioethanol fuels derived primarily from maize and sugarcane. In 2005, legislation in the United States made their use mandatory, and within five years approximately 40 percent of American maize was being used as a gasoline alternative. Together with the constant proportion of biomass in Europe and South America diverted to generating energy for machines rather than humans, it was enough to upset the balance between supply and demand.

Demonstrating the efficiency of the free market, existing farmers planted more cereals in 2009 to take advantage of higher prices. But the potential profits also encouraged an expansion of interest from a new kind of owner. In June 2009, George Soros, doyen of international financiers, alerted investors to opportunities opening up beyond the tainted world of derivatives and securitization. “I am convinced that farmland is going to be one of the best investments of our time,” he declared. “Eventually prices will get high enough that the market probably will be flooded with supply either through development of new land or technology or both, and the bull market will end. But that’s a long ways away yet.” Putting his money where his mouth was, Soros backed a six-hundred-million-dollar investment fund to buy more than six hundred thousand acres of land in Latin America.

Until 2008, large-scale land purchases averaged less than two million hectares, or almost five million acres, a year, but in 2009 alone almost 110 million acres of farmland, seventy million of them in Africa, were bought by corporate purchasers from the United States, Europe, the Middle East, and China. The area was greater than the total of agricultural land in Germany, France, Belgium, Denmark, the Netherlands, and Switzerland combined. In Liberia, where traditional family holdings of a few acres used to be the norm, the median size of privately owned farms in 2011 leaped to almost 130,000 acres. Although Sudan’s corporate holdings were smaller—a relatively modest median area of eight thousand hectares, or almost twenty thousand acres, each—they amounted in total to almost four million hectares, about one fifth of the country’s cultivated land.

In Russia, 45 percent of the cultivated area was corporately owned in 2012, with the thirty largest companies farming a total of 6.7 million hectares. The proportion was higher both in Kazakhstan at 60 percent, and in Ukraine at 55 percent, where just forty companies, many of them foreign, owned 13.5 percent of the country’s farmland in 2011. In southern Brazil, the median size of corporately owned sugar plantations was thirteen thousand hectares, and in Argentina the thirty leading agribusinesses, including Soros’s, owned on average eighty thousand hectares each of cattle, wheat, and oilseed land.

The most authoritative estimate of the amount spent—foreign land-grabbing is too sensitive a topic to be open to easy scrutiny—is that one hundred billion dollars had been invested by 2012. Some of the money came from China, which signed about thirty agricultural cooperation treaties giving food-producing companies, such as New Hope, a giant industrial-scale producer of pork and poultry, access to farmland in places ranging from Kazakhstan and Mozambique to the Philippines and Queensland. Other major investors were Saudi and Middle Eastern sovereign funds, and commodity producers such as Brazil’s JBS, the world’s biggest meat company, and Malaysia’s Sime Darby, the world’s largest source of palm oil. And a growing proportion, up to fifteen billion dollars in 2012, came from institutional investors in Europe and the United States, such as the $650 million Altima One World Agriculture Fund, which boasted of its aspiration to be “the first Exxon Mobil of the farming sector.”

Investors were offered bonanza returns of up to 20 percent as profits from food production, supplemented by the almost forgotten factor of farmland’s growing capital value. “The first thing we’re going to do is to make money off of the land itself,” promised Susan Payne of Emergent Asset Management, a British investment fund targeting farmland in Africa, in 2009. “We could be moronic and not grow anything and we think we’d still make money over the next decade.”

Unlike the Jeffersonian model of the yeoman capitalist, however, twenty-first-century rural capital accrued not to the tiller of the soil, or even to a landlord in the same country, but to absentee international investors, including such powerful players as Japan’s $1.3 trillion government fund, the California Teachers’ $131 billion fund, and Canada’s $122 billion state pension fund.

The free-trade argument for encouraging Ethiopia, a country dependent on food aid, to sell prime agricultural land to the Spanish group Jittu Horticulture, which used it to grow and export 180,000 vegetables a week to the Middle East, was that it attracted investment to parts of the world that had not received it in the past. “We bring foreign currency into the country, enabling the government to buy wheat for the hungry,” Jittu’s manager in Ethiopia explained. “It’s the government’s responsibility to feed people who are unable to buy anything for themselves.”

However, as Winston Churchill had said, the nature of the commodity—its necessity for existence, its limited extent, and its immobility—set it apart from any other. Whether judged by Qianlong’s need for the mandate of heaven or the gut necessity of keeping its citizens quiet, no responsible government could afford to allow its territory to be owned for the benefit of foreigners when its own people were undernourished. Consequently, it was precisely in those worst-governed countries with no clear property law, where land was occupied tribally and worked by families but nominally owned by the state, that it was easiest for foreign buyers to operate. As even the World Bank recognized, “investor interest is focused on countries with weak land governance.”

The bank’s criticism inadvertently pointed to a basic flaw in the financial calculations behind the land grab. The weakness of such corrupt governments left them vulnerable to the resentment of their rural population. Dispossession caused as much anguish to African farmers as it did to Steinbeck’s dust bowl Oklahomans. And where the materialist goals of the investors sparked an upsurge in sectarian values, there was no certainty that any government at all would survive the explosion.

Curling through the arid, rock-ribbed landscape of the Sahara Desert, the clear flow of the river Niger that began as a cascade five hundred miles to the west in the mountains of Guinea slows to a swampy ooze in the northern regions of Mali, one of the world’s poorest states. Across a flat expanse of red dust and grit, it spreads out into a gigantic inland delta where green fields of maize and rice grow around a web of ditches and canals that irrigate about half a million acres of land. As in many Islamic desert societies, ownership evolved along communal lines, with tribes claiming use of particular areas and controlling access to unirrigated pasture and reserve land, while specific parcels, measuring around three acres, were recognized as belonging to individual families, though liable to redistribution by village elders according to need and rank. Under French colonial rule, ownership of the water, without which the land was useless, was vested in a shadowy government body, the Office du Niger. In 2009, however, this traditional shape was upset by forces from outside.

The first of the outside elements was a sovereign fund from Colonel Muammar Gaddafi’s Libya that leased the water rights to a quarter of a million acres from the Office du Niger. It was followed by investors from the Saudi royal family and from China, and by the Millennium Challenge Corporation (the MCC) from the United States, who together secured leases covering about four hundred thousand acres. Some of this land was tribal pasture and some desert, but all had the promise of irrigation from the delta water. The leaseholders undertook to invest heavily in turning desert into productive arable land, and, with the exception of the MCC, all expected to export the maize, sorghum, and rice they grew there.

The MCC, created by the U.S. Congress in 2004, proposed to survey the almost four hundred thousand acres it leased, dividing it into parcels of twelve acres, twenty-five acres, and upward of seventy acres, for sale with individual title first to Mali’s farmers and then to anyone prepared to invest. In effect, the Public Lands Survey was being introduced to the Niger delta. Put together, the different schemes would provide the irrigation canals, roads, and secure storage silos needed by modern agriculture, more than doubling both the area of arable land and its productivity.

Threatened with dispossession, however, the delta farmers protested fiercely against the attempt both to take their land and to modernize their society. Instead, they proposed the allocation of existing parcels of land to the families who worked them, with secure leasehold title to give an incentive to improve the soil’s fertility, and with conditions to prevent the land being sold to outsiders. When the offer was ignored, the rebellious delta farmers promised Mali’s distant, corrupt government in Bamako that foreigners would not be allowed to take their land.

The significance to the wider world of this remote, anonymous disturbance became clear in March 2012. Tuareg tribesmen mounted on pickup trucks, armed with the machine guns and rocket launchers they had acquired in the uprising that overthrew Colonel Gaddafi, invaded the delta region, encountering no resistance from a farming community that in the past had always opposed a nomadic people they regarded as natural enemies. The Tuareg in turn were swiftly displaced by militant Islamists who set up an al-Qaeda-friendly regime, attracting recruits from all over west Africa and creating links between the jihadist group Boko Haram in Nigeria and fundamentalists in Algeria, Libya, and Egypt. The drugs trade moved in to fund their activities, and when the French intervened militarily in 2013 the jihadists scattered into the mountains in the north of the country to continue their mission.

The pattern of dispossession leading to rural upheaval that created an opening for terrorist activity was not new. In the late 1950s, the adoption of Rostow’s development strategy prompted a prolonged American attempt to modernize Afghan society through industrialization. At huge expense, a network of irrigation dams and canals was dug, and a hydroelectric power station was constructed in Helmand province to provide power for the entire region. It seemed such a model piece of development that the area was generally referred to as “Little America.” Unfortunately, lack of drainage and water salinity left much of the soil unfit for growing anything but opium poppies, while the rush of incomers to Helmand and the neighboring province of Kandahar removed control of the land from the old hierarchy of clan chiefs and regional warlords. After the expulsion of the Soviet army in 1989, tribal patterns reasserted themselves elsewhere in Afghanistan. But in those two disturbed provinces the only structure awaiting the returning mujahideen resistance fighters was a form of Islam opposed to any form of modernism. It was no coincidence that Little America should have become the birthplace of the Taliban and eventually home to Osama bin Laden.

In the future, land ownership and government are liable to be undermined in similar fashion as the pressure from the world’s rising population meets the growing buying power of corporate investors. The task of feeding nine billion people in the middle of the twenty-first century will create such a mass of urgent and seemingly insoluble problems, it might seem perverse to suggest that the most important is how the land is owned. But that will be the key to solving all the others.

In the twentieth century, the World Bank’s research on the experience of Taiwan, South Korea, and elsewhere showed that equitable land distribution was fundamental to social stability and thus the key to economic growth. Even in the twenty-first century, that model remains robust. More than three billion people still live in rural areas, and the poorest 50 percent of them depend for survival on the produce of small plots of land that for various reasons they cannot clearly own.

Even in India, one of the world’s fastest growing economies, two hundred million people still live undernourished, and a generation of children is growing up stunted and malformed for lack of food, because secure tenure of the land they work is almost impossible for “the meek and humble” small farmers who missed out on the Green Revolution. As a result, India actually produces less food than China despite possessing 40 percent more farmland.

Often security of tenure is made impossible by the unwritten rules of tribal and clan ownership, which require land to be redistributed according to need. But with perhaps one third of the world’s food grown on small farms and supplied through local markets, the laborious process of enabling the family that tills the soil to establish tenure in a form suited to local conditions is worth undertaking for the double gain of improving the food supply and creating social stability.

The effectiveness of the model depends, as Wolf Ladejinsky insisted, on obtaining local consent. In Mali, MCC signed a contract with the government to promote “economic liberalism”—in the first seven years of its life, it expended almost nine billion dollars on twenty-six similar agreements—and undertook to lay the foundation of a private property society by stipulating that land should be measured out and registered, with secure title to individual owners. The disaffection of the delta farmers showed that the goal should have been security of tenure rather than social engineering.

During the Cold War, the Green Revolution expanded the world’s grain output almost threefold, from 692 million tons in 1950 to 1.9 billion tons in 1999, an increase of about 175 percent. To feed the anticipated growth in population might require a further increase of one billion tons, approximately 50 percent, by 2050. Part of this addition could come from higher yields—wheat harvests alone could increase by half in optimum conditions—but more land could also be brought into cultivation. The countries of the former Soviet Union hold 13 percent of the world’s arable land, but are thought to produce just 6 percent of its grain and 3 percent of its meat. In 2008 less than half of Russia’s potential farmland was being worked, and, according to a United Nations estimate, Brazil still farms less than one sixth of its potential arable land. Yet the greatest single source of extra food is to be found in the waste that presently occurs in its growth, transport, storage, and marketing. By some estimates, about a quarter of total production is lost in this way, and small gains in efficiency would produce large savings. Put together, the three ingredients make it very probable that the earth can sustain another two billion inhabitants—if the water holds out.
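
As a back-of-envelope check of those two proportions, using only the tonnages quoted above:

\[
\frac{1.9\ \text{billion tons}}{692\ \text{million tons}} \approx 2.75,
\]

so output in 1999 stood at almost three times its 1950 level, an increase of roughly 175 percent; and

\[
\frac{1\ \text{billion tons}}{1.9\ \text{billion tons}} \approx 0.53,
\]

so the further billion tons needed by 2050 does indeed amount to roughly 50 percent of the 1999 harvest.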

Agriculture already consumes up to 70 percent of the water used by the world’s inhabitants, and by 2050 its irrigation needs are expected to double. More than 40 percent of global crops are grown on irrigated soil, but many sources are already under pressure from the demands of farmers. Great watercourses such as China’s Yellow River and America’s Colorado River run dry for much of the year in their lower reaches, and the largely invisible underground supplies of water that are replenished over centuries have dropped by at least tens of meters, in some cases by up to a kilometer, since the Green Revolution. Shortages already affect cereal-rich areas in the North China plain, in the Punjab, known as India’s breadbasket, and in the southern plains of the United States. Some two billion people presently depend on food grown on irrigated soil that, according to some estimates, will by 2030 be capable of supporting only one billion.

Drought and flood, unseasonable heat or cold, quickly produce shortages in a system finely balanced between supply and demand. And the earth itself may not be able to sustain the levels of intensive cultivation required. Excessive use of nitrogen-based fertilizers stresses the soil’s ecosystem in ways not wholly understood, depriving it of essential minerals and trace elements.

Each of these dangers exacerbates the global challenge of feeding the world’s population, and increases the need for innovation to be released from the shackles of intellectual property law. But while the challenge is global, starvation is local. Even the severest famine rarely affects a whole country, or everyone within the stricken area.

In 1943, the Nobel Prize–winning economist Amartya Sen was a schoolboy in Bengal when famine struck the province, now divided between India and Bangladesh, killing an estimated three million people. Yet Sen recalled that his wealthy family were untouched by the shortages, and “no one in my school or among my friends and relations had experienced the slightest problem during the entire famine; it was not a famine that afflicted even the lower middle classes.” Nearly all who died were poor laborers without land to grow food or the ability to increase their wages to buy enough to keep themselves alive. Sen would later argue that famine was caused not by the shortage of food but by the inability of people to access it through the exchange of labor or money. Since supplies were always to be found somewhere in the world, famine represented a failure of government to make them available locally.

The imperative need for governments to respond to shortage and starvation is what puts the position of corporate landowners in the spotlight. Even the most corrupt administration would prefer to tear up its contract with a foreign owner rather than face hungry rioters in the streets. Nor would it be blamed in constitutional terms since the basic test of a government’s legitimacy is to meet its citizens’ entitlement to food. To force compliance, a corporate owner might in the past have turned to an international agency like the WTO, or to its own government, expecting to have sanctions applied to the defaulting government. But the potential for provoking insurrection and terrorism will bring into question the legitimacy of an absentee, foreign company’s claim to exclusive possession of another country’s vital and limited resource.

In other words, ownership based on a complex chain of international law cobbled together from civil, common, and commercial legal roots may be less substantial than it presently appears. But for centuries, another strategy has been adopted by landowners who needed to strengthen their claims to a scarce resource.

Disputes about water use date back at least to the growth of Mesopotamian civilization beside the Tigris and Euphrates Rivers in the third millennium BC. But so too do formal rules for resolving them. From the earliest times, excessive irrigation by an upriver farmer in a time of shortage has been seen to take place at the expense of another downriver, and the purpose of the rules has always been to regulate use so that there is enough for both. In the arid southwest of the United States, agriculture and urban development continue to be governed by similar agreements, notably the 1922 Colorado River Compact specifying how much water the upper-basin states, Colorado among them, must allow to flow downriver to users in California, Arizona, and Nevada.

It is easy to suppose that rivalries, such as that between Turkey and Iraq over use of the Tigris and Euphrates Rivers, or between Vietnam and China over control of the Mekong River’s headwaters, must lead through intransigent demands to all-out war, but the evidence points the other way. Even three wars between India and Pakistan have not caused either of them to break the Indus Waters Treaty they signed in 1960.

Underpinning these formal regulations for sharing water, as well as thousands of other informal arrangements, has been the understanding that it is worth limiting individual demands so that everyone benefits from a limited resource. However bitter the disputes, they are overshadowed by the realization that taking whatever one wants risks destroying the entire system and ruining everyone. Self-interest gives social awareness a sharp edge.

As domestic demand for land and its scarcity increase in the years ahead, foreign ownership will come under increasing pressure and ever closer scrutiny. From primarily selfish motives, most corporate investors will sooner or later realize that property based on state-enforced law looks less secure than the kind based on natural right. The basic Lockean premise is that such a right arises out of an innate sense of justice. On that basis a corporate owner’s claim to property in land must ultimately depend on finding a way to make good the loss to those deprived of its use.

The iron law of private property turns out to be a paradox. Although it promotes individuality, it only works by giving equal weight to the public interest. That was the premise that Adam Smith specified for the working of the invisible hand, and James Madison for the operation of democracy. And most fundamentally, it is what involves everyone in a society that will reward a few more than the majority. The guardian of the public interest might be the press, the law, or the government, but ultimately it grows from humanity’s simultaneous desire for individual fulfillment and for social justice.
