A memo from the year 2050
Here’s how we avoided the worst
of zoonotic diseases
By Brandon Keim
Looking back at the early twenty-first century with the clarity of hindsight, it can be difficult to comprehend how societies with so much wealth, power, and knowledge could be so short-sighted and negligent about the threat of diseases emerging in animals and spreading to humans.
Not surprising, really, given how most people back then failed to confront climate catastrophe in any meaningful way. Yet unlike the relatively slow process of climate change in which each new flood or each temperature extreme could be relegated to coincidence, the appearance of new diseases could not be dismissed.
There was the AIDS pandemic, which spread along central African trade routes after some unknown person ate an infected primate; it killed 32 million people. The original SARS outbreak, with a mortality rate of 10 percent, was caused by a virus likely contracted from a captive civet—but it was heroically contained before it could spread widely. The H1N1 swine flu pandemic of 2009 claimed 350,000 lives in its first year. The list goes on and on.
When scientists warned that three-quarters of new human diseases started in animals and that human activity made these so-called zoonoses spill over with greater frequency and destructiveness, the evidence was plain to see—for anyone who looked. Yet except for a frantic few researchers and public health experts, people turned away after looking. Life went on as usual.
Then the SARS-CoV-2 virus arrived and forced a reckoning. It followed a transmission chain that likely ran from bats through an intermediate host, possibly pangolins or farmed pigs, to a wildlife market and finally to humans. COVID-19, the disease caused by the virus, killed more than 2 million people before a vaccine became widely available, and it caused economic damage that took a decade to heal. Nevertheless, it could have been much worse. COVID-19’s mortality rate was about one percent, or just one-tenth that of the original SARS outbreak. It gave humanity a second chance.
When life finally returned to quasi-normal, preparing for zoonoses became a bedrock part of public-health policy. Researchers traveled to cities and suburbs and throughout the countryside to collect DNA samples from animals, monitoring circulating pathogens and developing necessary drugs and vaccines in advance of spillover. Scientists modeled microbe flows across landscapes and species with the precision of seasonal weather forecasts, anticipating what a particularly wet spring or a population boom might mean for the spread of disease. Initiatives such as PREDICT—a program that catalogued pathogens circulating among animals in disease hotspots and that had been infamously canceled weeks before the COVID-19 pandemic—were radically expanded. No longer would the first line of defense between humanity and zoonosis-induced Armageddon operate on less money than it took to repair a medium-sized bridge.
These efforts didn’t take place in just a few high-risk places. They operated everywhere that people lived. Handling the threat of zoonotic diseases became a government function as basic as wildfire prevention or flood control, and—apart from averting the next world-shaking pandemic—these programs helped treat the slow burn of comparatively low-profile zoonoses. Diseases such as brucellosis, West Nile virus, and rabies rarely earned headlines for more than a news cycle, but collectively they killed nearly 3 million people each year in the pre-COVID-19 days.
©US Department of Agriculture
It wasn’t all amicable. Confronted with the fact that shorebirds and migratory waterfowl are influenza vectors, some people argued that these and other potential vector species should be exterminated rather than protected. People who shot birds for fun rebranded themselves as public-health volunteers. Even those who didn’t favor full-scale wildlife annihilation had misgivings about nurturing the nature-rich cities beloved by urbanists. Maybe migratory songbirds didn’t deserve to be driven extinct—but why invite them into your city, much less protect a suburban forest where tick-bearing rodents breed, or spend money on wildlife corridors that helped diseases disperse?
These were not easy conversations. Sometimes conservation and disease prevention dovetailed, as when making natural spaces as rich and diverse as possible tended to prevent diseases from spreading too fast, or when predators could regulate populations of potentially disease-carrying prey. But other times, relationships were more complicated.
In the end, it simply proved easier to help wild animals than to kill them. This was especially true for numerous, fast-reproducing creatures such as rodents, some of whom had already survived centuries of persecution. Before COVID-19, scientists had largely eradicated rabies in wild animals across western Europe and North America by distributing vaccine baits at landscape scales; this little-heralded but remarkable success became a model.
Preventing and treating wildlife outbreaks became a routine part of conservation. When a paramyxovirus belonging to the same viral family as measles emerged in saiga antelope in Mongolia in the year 2031, it threatened to kill the entire population—just as it nearly had in 2017. But it was promptly contained.
Much less complicated was the shutdown of markets where wild animals were bought and sold for human consumption. It was in such a market that SARS-CoV-2 likely infected its first human; in the aftermath of the ensuing pandemic, almost no public support for these markets remained. The same applied to international trade in wild animals, both legal and illegal, which had previously moved billions of creatures around Earth without consideration for disease. No longer could someone visit a pet shop at a mall in Arizona to buy a snake captured in Indonesia; no more did shipping inspectors lack the personnel to scan every border-crossing container. As for people who relied on wild animals for food and income, development programs helped them transition to other, less dangerous ways of living.
Then came the hard part. It was relatively easy to tell Asians and Africans not to eat bats, but far more difficult for Westerners to order the veggie alternative to a pulled-pork sandwich. At first, factory farming—the practice of raising tens of thousands of short-lived, highly stressed animals in close confinement—continued unabated. At the time of COVID-19, the vast majority of the billions of farmed animals consumed in North America and Europe, regions with some of the highest per capita meat consumption in world history, were housed in this way. The methods became widely adopted in the so-called developing world.
The factories were basically a network of giant zoonotic petri dishes and time bombs. Diseases evolved rapidly and spread widely in these cramped, miserable creatures. Most of the antibiotics used in the early twenty-first century went to factory animals—which in turn accelerated the evolution of treatment-resistant pathogens.
©Jo-Anne McArthur/We Animals
Scientific literature overflowed with descriptions of extra-virulent new diseases spreading through the factories; researchers pointed to dozens of influenza strains evolving into forms resembling so-called highly pathogenic strains, which had human mortality rates above 50 percent. Yet the response—even among experts—was strangely muted. Scientists didn’t hesitate to shut down wildlife markets, but when it came to factory farming, they settled for risk mitigation: developing new vaccines, trying to stop dust and air from escaping buildings, monitoring workers for disease, and maybe promising them health insurance. Such measures reduced the likelihood of outbreaks but did little to change the conditions that made them inevitable.
After COVID-19, critics of factory farming did gain some traction. More public and private funds went to research on alternative sources of protein. Increasing numbers of people switched to plant-based or synthetic meats, or at least tried to eat animals who were raised in relatively humane, less disease-prone conditions. The meat industry’s influence, however, was just too great. Not only was factory-farmed meat cheap and tasty, it was normal—with the vast social inertia that this entails.
In 2032, all that changed with alarming speed. Three days after visiting an agricultural fair in Iowa, several dozen attendees reported to local hospitals with high fevers and respiratory distress; it proved to be a new avian influenza strain, both extremely contagious and, unlike COVID-19, extremely deadly, killing nearly half the people who contracted it. Thanks to strong public-health and outbreak-response networks, the so-called Iowa flu was contained after killing nearly 20,000 Americans in just two weeks; however, the apocalyptic lockdown scenes and overnight international financial panic shook the world out of its complacency.
Finally, calls to replace factory farms altogether found purchase. Government subsidies evaporated; regulators no longer turned a blind eye to their enormous environmental footprint; the financial sector started treating them as toxic. Regular citizens looked at supermarket meat aisles and fast-food value meals as pandemic lottery tickets. Eating those foods carried a social stigma, not unlike how Westerners a decade earlier had regarded bat-eating. People likened factory farms to the late-twentieth-century cigarette industry, only worse—because cigarettes, as toxic as they were, had not threatened civilization itself.
The dietary transition proved surprisingly easy. Earlier investments in alternative protein sources paid off. After a year of supermarket disarray, most people hardly noticed the difference. Livestock-industry workers were of course threatened by the change, but programs previously established to help them transition away from factory farming scaled up quickly, with some workers switching to plant production and others to raising many fewer animals in far better conditions.
This meat was not entirely risk-free. Regenerative and agro-ecological farms still required humans to mingle with animals both domestic and wild. The risk was far more manageable, though, and while meat didn’t disappear, it became something of an indulgence. Global per-capita meat and dairy consumption rates stabilized at about 10 percent of what was once typical in high-income countries—a figure produced not by dietary health guidelines but by the mathematics of land use. Safe, well-treated animals required far more space than before. If people had wanted to continue eating meat at early twenty-first-century North American rates, most of Earth’s surface would have had to be turned into farms and ranches.
Progress didn’t stop with farms. Changing diets seemed to open a sense of possibility; there was a new public appetite for confronting habitat fragmentation and destruction. Prior to COVID-19, scientists warned that these practices not only put people and wild animals in risky proximity, but also promoted the spread of disease among those animals. After the pandemic, calls for zoonosis-aware economic development were legion. Countries and companies promised to do something about it, just as they had with regard to biodiversity loss and climate change—which is to say, they did very little, because it meant forgoing short-term profits.
Then things changed. Suddenly, a new road or golf course or industrial complex was perceived in terms of zoonotic risk. Often, such projects were not built at all—but when they were, building costs included payments for clinics and disease surveillance, along with zoonotic insurance premiums to cover outbreak costs.
More people than ever agreed that economic growth should not be pursued without deep ecological awareness and love of nature; ideas about structuring societies along ecological principles were no longer relegated to cultural fringes. Proponents of so-called circular economies that decoupled economic growth from resource consumption, as well as advocates of replacing old-fashioned metrics of economic growth with direct measures of human well-being, now dominated discussions.
More than two decades after COVID-19 illuminated humanity’s dependence on a web of living relationships, people started acting like it—in most places, anyway. As in the aftermath of that pandemic, responses varied between different types of governments. Change came slowest to the authoritarian, illiberal governments run by the heirs to Jair Bolsonaro and Donald Trump. Preparing for new diseases demanded respect for scientific expertise, public health, and a free press—all anathema to illiberals, as were restrictions on exploiting non-human life.
Governments such as these are powerful, and for a time it seemed like the world might be stratified according to literal political ecologies. Diseases, however, tend to expose truths. Just as COVID-19 exposed the glaring flaws in American society, so did the emergence of new diseases illuminate this new world order. Illiberal countries literally sickened; progressive, democratic governments flourished. They were best prepared and best equipped to deal with zoonotic disease. They had the healthiest citizens and the healthiest economies. Democracy prevailed over disease.
Brandon Keim is a freelance journalist specializing in animals, nature, and science. He is the author of The Eye of the Sandpiper: Stories from the Living World.