How Humanity Gave Itself an Extra Life — Abridged

Edison Yi
Apr 27, 2021 · 16 min read

*Unless specified otherwise, the following are excerpts from the NYT’s “How Humanity Gave Itself an Extra Life” by Steven Johnson.

Between 1920 and 2020, the average human life span doubled. How did we do it? Science mattered — but so did activism.

In effect, during the century since the end of the Great Influenza outbreak, the average human life span has doubled. There are few measures of human progress more astonishing than this. If you were to publish a newspaper that came out just once a century, the banner headline surely would — or should — be the declaration of this incredible feat. But of course, the story of our extra life span almost never appears on the front page of our actual daily newspapers, because the drama and heroism that have given us those additional years are far more evident in hindsight than they are in the moment. That is, the story of our extra life is a story of progress in its usual form: brilliant ideas and collaborations unfolding far from the spotlight of public attention, setting in motion incremental improvements that take decades to display their true magnitude.

How did this great doubling of the human life span happen? When the history textbooks do touch on the subject of improving health, they often nod to three critical breakthroughs, all of them presented as triumphs of the scientific method: vaccines, germ theory and antibiotics. But the real story is far more complicated. Those breakthroughs might have been initiated by scientists, but it took the work of activists and public intellectuals and legal reformers to bring their benefits to everyday people. From this perspective, the doubling of human life span is an achievement that is closer to something like universal suffrage or the abolition of slavery: progress that required new social movements, new forms of persuasion and new kinds of public institutions to take root. And it required lifestyle changes that ran throughout all echelons of society: washing hands, quitting smoking, getting vaccinated, wearing masks during a pandemic.

In May 1858, a progressive journalist in New York named Frank Leslie published a 5,000-word exposé denouncing a brutal killer in the metropolis. Malevolent figures, Leslie wrote, were responsible for what he called “the wholesale slaughter of the innocents.” He went on, “For the midnight assassin, we have the rope and the gallows; for the robber the penitentiary; but for those who murder our children by the thousands we have neither reprobation nor punishment.” Leslie was railing not against mobsters or drug peddlers but rather a more surprising nemesis: milk.

In an age without refrigeration, milk would spoil in summer months if it was brought in from far-flung pastures in New Jersey or upstate New York. Increased participation from women in the industrial labor force meant that more infants and young children were drinking cow’s milk, even though a significant portion of dairy cows suffered from bovine tuberculosis, and unprocessed milk from these cows could transmit the bacterium that causes the disease to human beings. Other potentially fatal illnesses were also linked to milk, including diphtheria, typhoid and scarlet fever.

By 1865, Pasteur, now a professor at the École Normale Supérieure in Paris, had hit upon the technique that would ultimately bear his name: By heating wine to around 130 degrees Fahrenheit and then quickly cooling it, he could kill many of the bacteria within, and in doing so prevent the wine from spoiling without substantially affecting its flavor. And it is that technique, applied to milk all around the world, that now saves countless people from dying of disease every single day.

Understanding that last achievement as a triumph of chemistry is not so much wrong as it is incomplete. One simple measure of why it is incomplete is how long it took for pasteurization to actually have a meaningful effect on the safety of milk: In the United States, it would not become standard practice in the milk industry until a half century after Pasteur conceived it. That’s because progress is never a result of scientific discovery alone. It also requires other forces: crusading journalism, activism, politics. Pasteurization as an idea was first developed in the mind of a chemist. But in the United States, it would finally make a difference thanks to a much wider cast of characters, most memorably a department-store impresario named Nathan Straus.

Conversations with a fellow German immigrant, the political radical and physician Abraham Jacobi, introduced Straus to the pasteurization technique, which was finally being applied to milk almost a quarter of a century after Pasteur developed it. Straus saw that pasteurization offered a comparatively simple intervention that could make a meaningful difference in keeping children alive.

One major impediment to pasteurization came from milk consumers themselves. Pasteurized milk was widely considered to be less flavorful than regular milk; the process was also believed to remove the nutritious elements of milk — a belief that has re-emerged in the 21st century among “natural milk” adherents. Dairy producers resisted pasteurization not just because it added an additional cost to the production process but also because they were convinced, with good reason, that it would hurt their sales. And so Straus recognized that changing popular attitudes toward pasteurized milk was an essential step. In 1892, he created a milk laboratory where sterilized milk could be produced at scale. The next year, he began opening what he called milk depots in low-income neighborhoods around the city, which sold the milk below cost. Straus also funded a pasteurization plant on Randall’s Island that supplied sterilized milk to an orphanage there where almost half the children had perished in only three years. Nothing else in their diet or living conditions was altered other than drinking pasteurized milk. Almost immediately, the mortality rate dropped by 14 percent.

Emboldened by the results of these early interventions, Straus started an extended campaign to outlaw unpasteurized milk, an effort that was ferociously opposed by the milk industry and its representatives in statehouses around the country. Quoting an English doctor at a rally in 1907, Straus told an assembled mass of protesters, “The reckless use of raw, unpasteurized milk is little short of a national crime.” Straus’s advocacy attracted the attention of President Theodore Roosevelt, who ordered an investigation into the health benefits of pasteurization. Twenty government experts came to the resounding conclusion that pasteurization “prevents much sickness and saves many lives.” New York still wavered, and in 1909, it was instead Chicago that became the first major American city to require pasteurization. The city’s commissioner of health specifically cited the demonstrations of the “philanthropist Nathan Straus” in making the case for sterilized milk. New York finally followed suit in 1912. By the early 1920s, three decades after Straus opened his first milk depot on the Lower East Side — more than half a century after Pasteur made his namesake breakthrough — unpasteurized milk had been outlawed in almost every major American city.

The Great Escape

The fight for pasteurized milk was one of a number of mass interventions — originating in 19th-century science but not implemented at scale until the early 20th century — that triggered the first truly egalitarian rise in life expectancy.

One reason the great escape was so egalitarian in scope is that it was propelled by infrastructure advances that benefited the entire population, not just the elites. Starting in the first decades of the 20th century, human beings in cities all around the world began consuming microscopic amounts of chlorine in their drinking water.

After conducting a number of experiments with chlorine disinfection, a pioneering sanitary adviser named John Leal quietly added chlorine to the public reservoirs in Jersey City — an audacious act that got Leal sued by the city, which said he had failed to supply “pure and wholesome” water as his contract had stipulated.

After Leal’s successful experiment, city after city began implementing chlorine disinfectant systems in their waterworks: Chicago in 1912, Detroit in 1913, Cincinnati in 1918. By 1914, more than 50 percent of public-water customers were drinking disinfected water. These interventions turned out to be a lifesaver on an astonishing scale. In 1908, when Leal first started experimenting with chlorine delivery in Jersey City, typhoid was responsible for 30 deaths per 100,000 people. Three decades later, the death rate had been reduced by a factor of 10.

The rise of chlorination, like the rise of pasteurization, could be seen solely as another triumph of applied chemistry. But acting on those new ideas from chemistry — the painstaking effort of turning them into lifesaving interventions — was the work of thousands of people in professions far afield of chemistry: sanitation reformers, local health boards, waterworks engineers. Those were the men and women who quietly labored to transform America’s drinking water from one of the great killers of modern life to a safe and reliable form of hydration.

Today, of course, we think of medicine as one of the pillars of modern progress, but until quite recently, drug development was a scattershot and largely unscientific endeavor. One critical factor was the lack of any legal prohibition on selling junk medicine. In fact, in the United States, the pharmaceutical industry was almost entirely unregulated for the first decades of the 20th century. Technically speaking, there was an organization known as the Bureau of Chemistry, created in 1901 to oversee the industry. But this initial rendition of what ultimately became the U.S. Food and Drug Administration was toothless in terms of its ability to ensure that customers were receiving effective medical treatments. Its only responsibility was to ensure that the chemical ingredients listed on the bottle were actually present in the medicine itself. If a company wanted to put mercury or cocaine in its miracle drug, the Bureau of Chemistry had no problem with that — so long as it was mentioned on the label.

Medical drugs finally began to have a material impact on life expectancy in the middle of the 20th century, led by the most famous “magic bullet” treatment of all: penicillin. Just as in the case of Jenner and the smallpox vaccine, the story of penicillin traditionally centers on a lone genius and a moment of surprising discovery.

Like many stories of scientific breakthroughs, though, the tale of the petri dish and the open window cartoonishly simplifies and compresses the real narrative of how penicillin — and the other antibiotics that quickly followed in its wake — came to transform the world. Far from being the story of a lone genius, the triumph of penicillin is actually one of the great stories of international, multidisciplinary collaboration in the history of science. It also represents perhaps the most undersung triumph of the Allied nations during World War II. Ask most people to name a top-secret military project from that era involving an international team of brilliant scientists, and what most likely would spring to mind is the Manhattan Project. In fact, the race to produce penicillin at scale involved all the same elements — only it was a race to build a genuinely new way to keep people alive, not kill them.

…global events had turned the mold from a mere medical breakthrough into a key military asset: War had broken out, and it was clear that a miracle drug that could reduce the death rate from infections would be a major boost to the side that was first able to develop it.

It might seem strange that Howard Florey and Norman Heatley, the Oxford researchers behind the penicillin effort, were set up in an agricultural lab in Peoria, Ill., when they were working on a medical drug. But Peoria turned out to be the perfect spot for them. The agricultural scientists had extensive experience with molds and other soil-based organisms. And the heartland location had one meaningful advantage: its proximity to corn. The mold turned out to thrive in vats of corn steep liquor, a waste product created by making cornstarch.

While the scientists experimented with coaxing larger yields out of the corn steep liquor, they also suspected that there might be other strains of the Penicillium mold out in the wild that would be more amenable to rapid growth. At the same time, U.S. soldiers and sailors collected soil samples around the globe — Eastern Europe, North Africa, South America — to be shipped back to the American labs for investigation. An earlier soil search in the United States had brought back an organism that would become the basis for streptomycin, now one of the most widely used antibiotics in the world. In the years immediately after the end of the war, Pfizer and other drug companies would go on to conduct major exploratory missions seeking out soil samples everywhere, from the bottoms of mine shafts to wind-borne samples gathered with the aid of balloons. In the end Pfizer collected a staggering 135,000 distinct samples.

The search for promising molds took place closer to home as well. During the summer months of 1942, shoppers in Peoria grocery stores began to notice a strange presence in the fresh produce aisles, a young woman intently examining the fruit on display, picking out and purchasing the ones with visible rot. Her name was Mary Hunt, and she was a bacteriologist from the Peoria lab, assigned the task of locating promising molds that might replace the existing strains that were being used. (Her unusual shopping habits ultimately gave her the nickname Moldy Mary.) One of Hunt’s molds — growing in a particularly unappetizing cantaloupe — turned out to be far more productive than the original strains that Florey and Chain’s team had tested. Nearly every strain of penicillin in use today descends from the colony Hunt found in that cantaloupe.

III. The Great Equalizing

The decade following the initial mass production of antibiotics marked the most extreme moment of life-span inequality globally. In 1950, when life expectancy in India and most of Africa had barely budged from the long ceiling of around 35 years, the average American could expect to live 68 years, while Scandinavians had already crossed the 70-year threshold. But the post-colonial era that followed would be characterized by an extraordinary rate of improvement across most of the developing world. The gap between the West and the rest of the world has been narrowing for the past 50 years, at a rate unheard of in demographic history. It took Sweden roughly 150 years to reduce childhood mortality rates from 30 percent to under 1 percent. Postwar South Korea pulled off the same feat in just 40 years. India nearly doubled life expectancy in just 70 years; many African nations have done the same, despite the ravages of the AIDS epidemic. In 1951, the life-span gap that separated China and the United States was more than 20 years; now it is just two.

The forces behind these trends are complex and multivariate. Some of them involve increasing standards of living and the decrease in famine, driven by the invention of artificial fertilizer and the “green revolution”; some of them involve imported medicines and infrastructure — antibiotics, chlorinated drinking water — that were developed earlier. But some of the most meaningful interventions came from within the Global South itself, including a remarkably simple but powerful technique called oral rehydration therapy.

The sheer magnitude of the loss of life to diarrheal diseases like cholera was a global tragedy, but it was made even more tragic because a relatively simple treatment for the severe dehydration they cause existed, one that could be performed by nonmedical professionals outside the context of a hospital. Now known as oral rehydration therapy, or O.R.T., the treatment is almost maddeningly simple: give people lots of boiled water to drink, supplemented with sugar and salts. (Americans are basically employing O.R.T. when they consume Pedialyte to combat a stomach bug.) A few doctors in India, Iraq and the Philippines argued for the treatment in the 1950s and 1960s, but in part because it didn’t seem like “advanced” medicine, it remained a fringe idea for a frustratingly long time.
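As a rough illustration of just how simple the “sugar and salts” arithmetic is, here is a minimal sketch that scales the modern WHO/UNICEF reduced-osmolarity oral rehydration salts formulation to an arbitrary volume of water. The gram figures are the published per-liter composition of the standardized sachet, not the improvised mix described in the article, and the 10-liter batch is a made-up example:

```python
# Per-liter composition of the WHO/UNICEF reduced-osmolarity ORS sachet
# (grams per liter of clean drinking water). Reference values only; the
# improvised 1971 field mixes were far cruder (boiled water, sugar, salt).
ORS_GRAMS_PER_LITER = {
    "glucose, anhydrous": 13.5,
    "sodium chloride": 2.6,
    "potassium chloride": 1.5,
    "trisodium citrate, dihydrate": 2.9,
}

def ors_recipe(liters: float) -> dict:
    """Scale the per-liter ORS composition to a given volume of water."""
    return {ingredient: round(grams * liters, 2)
            for ingredient, grams in ORS_GRAMS_PER_LITER.items()}

if __name__ == "__main__":
    # Hypothetical example: mixing a 10-liter batch for a treatment tent.
    for ingredient, grams in ors_recipe(10).items():
        print(f"{ingredient}: {grams} g")
```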

…a vicious outbreak of cholera had arisen in the crowded refugee camps outside Bangaon. A Johns Hopkins-educated physician and researcher named Dilip Mahalanabis suspended his research program in a Kolkata hospital lab and immediately went to the front lines of the outbreak. He found the victims there pressed against one another on crowded hospital floors coated in layers of watery feces and vomit.

Mahalanabis quickly realized that the existing IV protocols were not going to work. Only two members of his team were even trained to deliver IV fluids. “In order to treat these people with IV saline,” he later explained, “you literally had to kneel down in their feces and their vomit.”

Mahalanabis decided to embrace the low-tech approach. Going against standard practice, he and his team turned to an improvised version of oral rehydration therapy. He delivered it directly to the patients he had contact with, like those sprawled bodies on the floor of the Bangaon hospital. Under Mahalanabis’s supervision, more than 3,000 patients in the refugee camps received O.R.T. The strategy proved to be an astonishing success: Mortality rates dropped by an order of magnitude, to 3 percent from 30 percent, all by using a vastly simpler method of treatment.

The original advocates for vaccination, back in Edward Jenner’s age, dreamed of wiping the smallpox virus off the face of the earth. On the eve of his first term as president, Thomas Jefferson wrote about removing smallpox from “the catalog of evils.” But in the early 1800s, the fight against variola was progressing on a patient-by-patient basis. Eradicating smallpox entirely on a global scale was a technical impossibility. What moved smallpox eradication from an idle fantasy to the realm of possibility?

Starting in the mid-1960s, the W.H.O. — led by a C.D.C. official, D.A. Henderson — worked in concert with hundreds of thousands of health workers, who oversaw surveillance and vaccinations in the more than 40 countries still suffering from smallpox outbreaks. The idea of an international body that could organize the activity of so many people over such a vast geography, and over so many separate jurisdictions, would have been unthinkable at the dawn of the 19th century.

But as with chlorination and oral rehydration therapy, smallpox eradication was a triumph of bottom-up organization. Just locating smallpox outbreaks in countries as vast as India, in an age without cellphones and the internet and in many cases electricity, was a feat of staggering complexity. The ring-vaccination approach offered a more efficient use of the vaccine — as opposed to simply vaccinating the entire population — but officials still needed to find the cases to build the ring around.
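To make the idea of “building the ring around” a case concrete, here is a minimal sketch of ring-vaccination targeting expressed as a contact-network search: vaccinate everyone within a couple of contact steps of a confirmed case rather than the whole population. The contact graph and the two-step radius below are invented for illustration and are not a description of the W.H.O.’s actual field procedures:

```python
from collections import deque

def ring_to_vaccinate(contacts, index_case, radius=2):
    """Return everyone within `radius` contact steps of a confirmed case.

    `contacts` maps each person to the people they have contact with.
    Ring vaccination targets this set instead of the whole population.
    """
    ring, frontier = {index_case}, deque([(index_case, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth == radius:
            continue
        for neighbor in contacts.get(person, ()):
            if neighbor not in ring:
                ring.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return ring - {index_case}  # the case itself is isolated; contacts are vaccinated

# Hypothetical village contact graph.
contacts = {
    "case": ["a", "b"],
    "a": ["case", "c"],
    "b": ["case"],
    "c": ["a", "d"],
    "d": ["c"],
}
print(sorted(ring_to_vaccinate(contacts, "case")))  # ['a', 'b', 'c']
```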

In India alone, that kind of surveillance work required thousands of district health personnel, and more than a hundred thousand fieldworkers, overcoming challenging physical conditions and local resistance to do their work. And even that wasn’t a big enough labor force to track every single outbreak in the country. Eventually the eradicators decided to widen their surveillance network further, by offering a reward to anyone who reported a smallpox case. (The reward money increased steadily as the smallpox caseload dropped, ultimately reaching the equivalent of $1,000.) The wide-network approach proved to be a spectacular success. Outbreaks dropped precipitously during the last four months of 1974: 2,124 to 980 to 343 to 285. During the final stages of the project, fieldworkers would visit each of the country’s 100 million households — once a month in endemic states, once every three months throughout the rest of the country — to trace the remaining spread of the virus.

Eradication was ultimately as dependent on that wide network as on the bifurcated needle or any other technological advance. Smallpox eradication might have been originally dreamed up in the headquarters of public-health institutions in Atlanta and Geneva, but it took an army of villagers to make it a reality.

IV. The Edge of Eight Billion

Demagogues sometimes rant about irresponsible birthrates in developing-world countries, but the truth is the spike in global population has not been caused by some worldwide surge in fertility. In fact, people are having fewer babies per capita than ever. What changed over the past two centuries, first in the industrialized world, then globally, is that people stopped dying — particularly young people. And because they didn’t die, most then lived long enough to have their own children, who repeated the cycle with their offspring. Increase the portion of the population that survives to childbearing years, and you’ll have more children, even if each individual has fewer offspring on average. Keep their parents and grandparents alive longer, and the existing population swells as the surviving generations stack up. Repeat that pattern all over the world for four or five generations, and global population can grow to eight billion from two billion, despite declining fertility rates.
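The generational arithmetic in that paragraph can be made concrete with a toy cohort model. All of the numbers below are invented for illustration (they are not drawn from the article or from real demographic data), and the model ignores timing, migration and adult mortality; the point is simply that raising childhood survival swells successive generations even as births per couple fall, and longer adult survival then stacks those larger generations on top of one another:

```python
def cohort_sizes(first_cohort, generations, survival_to_adulthood, births_per_couple):
    """Toy model: each generation's size is the survivors of the previous one,
    paired into couples, times births per couple."""
    sizes = [first_cohort]
    for _ in range(generations):
        adults = sizes[-1] * survival_to_adulthood      # survive to childbearing age
        sizes.append(adults / 2 * births_per_couple)    # children born to that cohort
    return sizes

# Hypothetical regimes: pre-transition (many births, few survivors) vs.
# post-transition (fewer births, most children survive).
print([round(x) for x in cohort_sizes(1000, 4, survival_to_adulthood=0.35, births_per_couple=6.0)])
print([round(x) for x in cohort_sizes(1000, 4, survival_to_adulthood=0.95, births_per_couple=3.5)])
# The first series grows only slightly; the second ends more than seven times
# larger than its first cohort, even though births per couple fell by over 40 percent.
```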

Runaway population growth — and the environmental crisis it has helped produce — should remind us that continued advances in life expectancy are not inevitable. We know from our recent history during the industrial age that scientific and technological progress alone do not guarantee positive trends in human health. Perhaps our increasingly interconnected world — and dependence on industrial livestock, particularly chickens — may lead us into what some have called an age of pandemics, in which Covid-19 is only a preview of even more deadly avian-flu outbreaks. Perhaps some rogue technology — nuclear weapons, bioterror attacks — will kill enough people to reverse the great escape. Or perhaps it will be the environmental impact of 10 billion people living in industrial societies that will send us backward. Extending our lives helped give us the climate crisis. Perhaps the climate crisis will ultimately trigger a reversion to the mean.

No place on earth embodies that complicated reality more poignantly than Bhola Island, Bangladesh. Almost half a century ago, it was the site of one of our proudest moments as a species: the elimination of variola major, realizing the dream that Jenner and Jefferson had almost two centuries before. But in the years that followed smallpox eradication, the island was subjected to a series of devastating floods; almost half a million people have been displaced from the region since Rahima Banu contracted smallpox there. Today large stretches of Bhola Island have been permanently lost to the rising sea waters caused by climate change. The entire island may have disappeared from the map of the world by the time our children and grandchildren celebrate the centennial of smallpox eradication in 2079.

What will their life spans look like then? Will the forces that drove so much positive change over the past century continue to propel the great escape? Will smallpox turn out to be just the first in a long line of threats — polio, malaria, influenza — removed from Jefferson’s “catalog of evils”? Will the figurative rising tide of egalitarian public health continue to lift all the boats? Or will those momentous achievements — all that unexpected life — be washed away by an actual tide?

My thoughts:

This article raises a pertinent question: whose names are written into history books, and whose contributions are overlooked? In our pursuit of stories about individual heroism and ingenuity, it is all too easy to overlook the activists, reformers and fieldworkers who turn discoveries into lives saved. Time and time again, our first reaction to innovations is to resist them, whether out of habit or because of entrenched interests. Without wider social efforts, life-saving innovations might sit in labs without ever seeing the light of day, a point made all too clear by anti-vaccination groups.


Edison Yi

This blog contains a collection of satires, notes, and essays on philosophy, economics, etc. I’m a master’s student in Philosophy at Oxford.