
Victorian Science’s Great Unsolved Murder Mystery


In the summer of 1893, an unusual volume appeared on the shelves of London booksellers. The Great Barrier Reef of Australia: Its Products and Potentialities, published by W. H. Allen and Company, was remarkable both for its price—the leather-bound volume would have cost a skilled tradesperson nearly two weeks’ pay—and for its fantastically close observation of the world’s largest reef system.

Many British readers knew of the existence of coral reefs, from the accounts of Charles Darwin and other naturalist-explorers. They might have known that James Cook and his crew had been trapped and nearly died in the labyrinthine “shoals” off the eastern coast of what would become Britain’s most distant colonies. But far fewer grasped that coral reefs were living systems composed of countless tiny, soft-bodied animals; even fewer had any real sense of the squirming, kaleidoscopic grandeur of the Great Barrier Reef. For most of its readers, The Great Barrier Reef of Australia revealed an almost completely unknown world.

At a time when photography was cumbersome and expensive and color photography was little more than a curiosity, the author William Saville-Kent had waded into the Pacific at low tide and, with the help of a specially constructed four-legged stand, had worked out a method of photographing coral colonies from above. The resulting large-format prints were exceptionally clear, and Saville-Kent’s accompanying watercolor sketches suggested the polychromatic glory of a flourishing reef: Pale-violet stony corals, orange sea stars, and crowds of colorful reef fish burst from the pages in pre-industrial abundance.

William Saville-Kent, 1892. (Wikimedia)

To some of the readers who admired Saville-Kent’s work in 1893, his name might have sounded familiar. He wasn’t a famous naturalist. Was he famous for something else? No matter; the author had recently returned home to England for a frenzied, year-long bout of specimen identification and writing, but by the time his book appeared, he had already sailed back to Australia.

Thirty-three years before the publication of The Great Barrier Reef of Australia, the mutilated body of a young boy named Saville Kent was discovered on the grounds of an English country house. Three-year-old Saville lived with his family in the village of Rode (then Road), about a hundred miles west of London, and had gone missing from his bedroom in the predawn hours of June 30, 1860. After several hours of frantic searching by his parents, his four older stepsiblings, the household staff, and several neighbors, Saville’s body was found hidden in the servants’ outhouse. His throat had been cut so deeply that his neck was nearly severed. When the local police arrived and searched the tank below the outhouse, they discovered a “bosom flannel”—a cloth worn inside the front of a corset—that had been recently stained with blood.

[Read: Since 2016, half of all coral in the Great Barrier Reef has died]

The contrast between the grisly crime and its genteel setting shocked the country, and the resulting fear and outrage were soon followed by frustration with the local police. On July 10, an editorial in the national Morning Post called for an experienced detective to take over the investigation, arguing that “the security of families” depended on the killer’s being brought to justice. Within a week, the commissioner of London’s Metropolitan Police had dispatched Detective Inspector Jack Whicher to Rode.

Whicher, then 45, was a member of the Metropolitan Police’s small detective division, created 18 years earlier to investigate serious crimes that crossed precinct lines. The police force itself was not much older, and the detective division, whose plainclothes officers often worked undercover, was widely seen as an unwelcome escalation of state surveillance.

At the same time, the detectives and their rumored powers of observation fired the public imagination. The author Kate Summerscale, in her 2008 book about the murder of Saville Kent, writes that by 1860, Whicher and his colleagues “had become figures of mystery and glamour, the surreptitious, all-seeing little gods of London.” Charles Dickens praised their uncanny abilities, and he described one of his fictional characters, based on Whicher, as having “a reserved and thoughtful air, as if he were engaged in deep arithmetical calculations.” When Whicher arrived in Rode, he was generally expected to not only unmask the killer of Saville Kent but also repair the violated sanctity of the English home.

At first, Whicher’s investigation revealed only the unhappiness in one particular English home, which was occupied by Saville’s father, Samuel; his four children from his first marriage; his second wife; and their three—now two—younger children. Samuel, who worked for the government as a factory inspector, had moved the family to Rode five years earlier, and had quickly made himself unpopular in the village by forbidding fishing in the river near his house. His first wife, Mary Ann, had died in 1852 after enduring years of mental and physical illness and the deaths of several children in infancy. Her symptoms—and those of her surviving children—have led historians to theorize that Samuel had infected her with syphilis, and that she suffered from an advanced form of the disease. Fifteen months after Mary Ann’s death, Samuel married the family governess, Mary Pratt.

The second Mrs. Kent was said to be impatient with her two younger stepchildren, Constance and William, and in July 1856, when they were 12 and 11, the pair had run away from home. In the same outhouse where Saville’s body was discovered, Constance had changed into a set of William’s clothes, cut off her hair, and stuffed her dress and petticoats into the tank. The two had set out on foot for the coast, planning to sign on to a ship’s crew as cabin boys; they traveled about 10 miles before an innkeeper, suspecting they were runaways, reported them to the police. While press accounts of the incident varied, most of them cast Constance as the instigator. Decades later, Constance herself recalled that she had remained defiant when apprehended, leading her questioners to treat her “as a bad boy who had led the other astray.”

While the suspicions of the local police had focused on Saville’s nursemaid, Elizabeth Gough, Whicher was more interested in Constance. Interviews with her schoolmates and the household staff persuaded him that the 16-year-old had been consumed with jealousy toward her father’s new family, and toward her young stepbrother in particular. He also learned that one of the three nightdresses she owned was unaccounted for, and while she claimed it had been lost in the wash, he suspected she had been wearing it on the night of the murder and had later hidden or destroyed it. On July 20, Whicher arrested Constance, charging her with Saville’s murder.

The detective’s careful observations, however, were not enough to make his charge stick—or extract a confession from his suspect. Constance insisted on her innocence, and after days of searching at Whicher’s behest, the missing nightgown remained missing. After a week in jail, Constance was examined before a sympathetic audience by the local magistrates, who apparently agreed with her lawyer’s claim that “there is not one tittle of evidence against this young lady.” Constance was released on July 27, with the stipulation that she remain available for further questioning. The next day, a defeated Whicher took the train back to London.

The press condemned the detective for his failure, but he continued to believe that Constance had killed Saville, perhaps with the knowledge or assistance of her brother William. He was further convinced of his theory in November, when it emerged that immediately after the murder, a local police officer had found a bloodied woman’s gown shoved into the kitchen stove. “After all that has been said in reference to this case,” Whicher wrote to a colleague, “there is in my humble judgement but one solution to it.”

Few believed him, and the reputation of Whicher’s profession suffered along with his own. In the wake of the Rode case, detectives were no longer “all-seeing little gods” but all-too-fallible mortals. As Summerscale notes, the word clueless came into use in 1862, and in 1863, the satirical magazine Punch skewered “Inspector Watcher” of the “Defective Police.” The following year, Whicher took an early retirement at age 49, citing “congestion of the brain.”

On April 25, 1865, 21-year-old Constance Kent confessed to killing Saville.

In the years since the murder of Saville Kent, most of the Kent household had moved to north Wales. Constance had spent two years at a finishing school in France, and in the summer of 1863 had returned to England, boarding at an Anglican religious home in Brighton run by Reverend Arthur Wagner. Wagner was a controversial figure, known for his support of Roman Catholic practices such as private confession, and it was during an interview with Wagner that Constance first unburdened herself.

In her formal confession, submitted to a London magistrates’ court, Constance stated that she had killed Saville “alone and unaided,” and that no one had known of her actions. Shortly afterward, she wrote that she had experienced only “the greatest kindness” from both her father and her stepmother, and had killed Saville not out of jealousy but in revenge for what she described as her stepmother’s cruel treatment of her mother during her mother’s years of illness. “She had robbed my mother of the affection which was her due,” Constance wrote, “so I would rob her of what she most loved.”

The confession divided public opinion: While some took Constance’s words at face value, others believed she had been manipulated by Wagner, or was protecting the real culprit. Constance maintained her guilt, however, and on July 21, 1865, after a 20-minute trial, she was convicted and sentenced to death—a fate later commuted to 20 years in prison.

During Constance’s first year of incarceration, her brother William turned 21 and moved to London with his two eldest stepsisters. With the help of his mother’s family, he began taking evening classes, apparently preparing for a career in the civil service. William was far more interested in flowers and insects than bureaucracy, however, and whenever he could he attended the public lectures offered by the era’s leading naturalists.

During the years that the Kent family was undergoing its private and public horrors, the scientific world had been upended. Darwin’s theory of evolution by natural selection, described in his 1859 book On the Origin of Species, had challenged the long-standing assumptions that species were static entities and that the human species was set apart, superior to all the others. (Late in life, Constance would recall that she read the book shortly after its publication and shocked her family by endorsing its theory.)

When William arrived in London, Darwin’s work was still enormously controversial, but it had benefited from influential champions such as the zoologist T. H. Huxley, remembered as “Darwin’s bulldog” for his fierce defense of evolution. Huxley was known for his compelling lectures, and William later wrote that they were “the starting point” for his own life in science. For William, science might have also represented a chance to break with his past, for around this time he began to use the hyphenated surname Saville-Kent. Saville, which was his own middle name as well as the name of his late stepbrother, was also the family name of his paternal grandmother; he used it for the rest of his life.

Despite the immense public interest in natural history at the time, the study of other species was not yet an established profession, and its practitioners tended to be independently wealthy. Huxley, the son of a schoolteacher, had hammered together a career with his wits and forceful charm, and most of his protégés, including Saville-Kent, were expected to do the same.

[Read: Mysterious rings around reefs have no simple explanation]

Saville-Kent soon developed an interest in microscopy—the ability to study previously invisible life forms was, he recalled, like exploring “a new world”—and with Huxley’s encouragement he began to investigate aquatic invertebrates. As a museum assistant at the Royal College of Surgeons, Saville-Kent became “smitten” with the corals whose calcium-carbonate skeletons he studied. Later, as an assistant at the British Museum, he sailed to Portugal to investigate the glass rope sponge, a species whose clear tissues so puzzled naturalists that some believed it was actually made of glass.

But as Saville-Kent’s biographer, Anthony J. Harrison, recounts, museum work paid poorly, and the young scientist was not only newly married but, after his father’s death in 1872, at least partly responsible for his four surviving stepsiblings. Over the next several years, Saville-Kent held a succession of research positions at commercial aquariums, which were newly popular as entertainment (thanks in part to Jules Verne’s novel Twenty Thousand Leagues Under the Sea). In Manchester, he studied the life cycle of lobsters and developed a method of keeping large species of seaweed alive in captivity; in Brighton, he clashed with a colleague over credit for a study of octopus sex.

What Saville-Kent did not do during these years, apparently, was communicate regularly with Constance. The historian Noeline Kyle, who examined Constance’s prison records, found that her brother wrote to her only twice as she was moved from prison to prison over the course of two decades. “Will you be kind enough to write to my brother and beg & entreat him to come and see me,” Constance wrote to an acquaintance in 1881. Her pleas had no documented effect.

In 1884, Huxley recommended Saville-Kent for a job as the inspector of fisheries for the former penal colony of Tasmania, and in May of that year Saville-Kent left England with his wife, Mary Ann, and his half-sister Mary Amelia. Though Saville-Kent was welcomed by colonial officials, he almost immediately butted heads with a local fisheries commission, whose members were preoccupied with the introduction of Atlantic salmon to the Pacific island. When his contract expired after three tumultuous years, Saville-Kent began hiring out his expertise to the Australian mainland, advising its colonial governments on the management of their mostly unregulated and fast-diminishing fisheries.

By this time, most of Saville-Kent’s immediate family had followed him to Australia. His half-siblings Acland, Florence, and Eveline all moved to the colonies during the 1880s. And in July 1885, after 20 years in prison, Constance Kent was released into the care of Reverend Wagner. Several months later, she seems to have emigrated to Australia alone. Some historians speculate that she reconnected with her brother, even living and traveling with him, while others believe that the two siblings had little or no contact. What is certain is that any secrets they shared remained secret.

Though Saville-Kent had been fascinated by corals since the beginning of his career, he didn’t see a living tropical reef until 1888, when he joined a surveying cruise along the northwestern coast of Australia. Few foreign naturalists had visited the area, and Saville-Kent, overwhelmed by the diversity of the tropics, immersed himself in collecting, painting, and drawing the vivid life forms he encountered. Shortly afterward, the colonial government of Queensland appointed him as its commissioner of fisheries, and he and Mary Ann moved to Brisbane. There, he surveyed the colony’s fisheries, helped reform their management, and began the work that would lead to The Great Barrier Reef of Australia.

When James Cook and the crew of HMS Endeavour had collided with the Great Barrier Reef more than a century earlier, they had also collided with some of the only humans who knew it well: The Guugu Yimithirr people did not take kindly to the crew’s appetite for green turtles, and they drove home their disapproval with well-aimed spears and boomerangs. As the British empire colonized Australia over the following decades, the reef became a colonial “possession” that was barely understood by its supposed possessors.

[Read: How coral researchers are coping with the death of reefs]

For most Europeans, after all, submarines and diving equipment were still fantasies from the imagination of Jules Verne, and their closest encounters with undersea life were in public aquariums like those that had once hired Saville-Kent. The ocean in general, and the vast Pacific in particular, provoked far more fascination and fear than protective concern; when the Saville-Kents arrived in Brisbane, colonists on the Queensland coast were reporting sightings of a giant shell-backed sea serpent they called the Moha-Moha. And while European naturalists had come to understand that coral skeletons were built and inhabited by invertebrate animals called polyps, the biology of corals was still largely unknown, and they hovered unsettlingly between plant and animal, animate and inanimate.

Saville-Kent, undaunted by the Moha-Moha, set out to probe the mysteries of the Great Barrier Reef. Dressed in a dark coat with a stiff collar, and with a cork helmet clapped on his head for protection against the tropical sun, he rolled up his trousers and waded into the surf, drawing and painting the coral outcrops he saw through the shallow water. When Mary Ann gave him a camera as a gift, he adapted a tripod into a frame that allowed him to photograph underwater corals from above, and he experimented with different lenses and filters in order to capture the sharpest possible images. When low tide arrived before dawn, he sometimes ventured into the ocean in the dark, photographing coral outcrops before the water rose. He toted his equipment up and down the coast, traveling from Brisbane to Thursday Island, part of the archipelago at the northernmost tip of Queensland. He made an especially close study of the coral genus Madrepora, identifying more than 70 species previously unknown to science. Everywhere he went, he noted the remarkable variety of life that composed the then-thriving reef.

Saville-Kent’s photographs were clear enough to serve as a benchmark for coral growth—a gift to not only naturalists but also navigators, who were eager to protect their ships from unmapped coral outcrops—and they were, perhaps, his greatest accomplishment. When The Great Barrier Reef of Australia was published in 1893, a review in Nature singled out “the diligence and skill of the author in photography” and predicted that the “magnificently illustrated” volume would “go far towards giving a realistic impression of some of the beauties of coral seas to the untravelled.” As Darwin had reflected during his own Pacific travels decades earlier, coral reefs “rank high amongst the wonderful objects of this world.” Despite the obstacles of distance and imperfect technology, Saville-Kent managed to share that wonder with many who would never experience it firsthand.

Illustrations from The Great Barrier Reef of Australia, 1893. (Wikimedia)

Two years after the publication of The Great Barrier Reef of Australia, the Saville-Kents moved back to England, but William continued to travel between England and Queensland until the fall of 1908, when he died suddenly at the age of 63. He was buried in a churchyard in the English village of Milford on Sea, just a few dozen miles from Rode, and his gravestone was decorated with coral skeletons from the Great Barrier Reef. In an obituary, the editors of Nature suggested that “Mr. Saville-Kent will perhaps be best remembered by his sumptuous work on the Great Barrier Reef of Australia.” They did not refer to the notorious crime that had shadowed his life.

While Saville-Kent succeeded in distancing himself from the murder of his young stepbrother, the case and its unanswered questions were not forgotten. The first of at least six full-length books about the murder was published by a friend of Samuel’s in 1861, and a stream of pamphlets and essays rehashed the evidence. In 1929, the London publishers of The Case of Constance Kent, which had been written under a pseudonym by the detective novelist John Street, received a lengthy letter from Australia. The letter writer reported that Constance had died, but that before her death she had related the details of her early life. The writer insisted that Constance’s mother had not been insane, as Street had claimed, and that despite Street’s skepticism, Constance’s confession of her “most callous and brutal crime” had been genuine.

Street suspected that Constance herself had written the letter, but not until the 1980s would another author, Bernard Taylor, confirm that Constance, living in Australia under the pseudonym Ruth Emilie Kaye, had trained as a nurse and become a respected hospital administrator, even spending several years in charge of a ward for patients suffering from Hansen’s disease, also known as leprosy. She died in 1944, shortly after her 100th birthday, and her obituaries, like her brother William’s, made no mention of the murder of Saville Kent.

The exact circumstances of Saville Kent’s death are no clearer now than they were to Detective Inspector Jack Whicher in the summer of 1860. Noeline Kyle, in her 2009 book about Constance’s post-prison life, concludes that whether or not Constance actually slit her young stepbrother’s throat, she came up with the idea—much as she came up with the plan for her and William to run away to sea—and therefore held herself responsible for his murder. But only the perpetrator, or perpetrators, knew the full truth.

The Victorian-era frustration with the Kent case found its most lasting expression in fiction. In the 1868 novel The Moonstone, one of the earliest modern detective novels, the crime, like that in the Kent case, takes place in an English country house, and the residents of the house are the primary suspects. The botched initial investigation is taken over by a taciturn detective from London, Sergeant Cuff, who unearths family secrets and focuses his attention on a missing nightgown. Initially, Cuff suspects the wrong person. But he eventually succeeds where Whicher failed, definitively identifying the guilty and providing a full account of motive and method. By doing so, he helps repair a shattered household.

The novel was enormously popular—it remains a satisfying read—and many of its ingredients are now familiar tropes. From Sherlock Holmes to Jessica Jones, fictional detectives can be relied on to wade into the messy aftermath of violence, spot the relevant evidence, and expose the perpetrators. In a way, the entire genre is a multivolume fix-it of the Kent case: Over and over, its heroes coax secrets to the surface, the certainty of their conclusions restoring something like order to the universe. Reality, of course, is rarely so reassuring.

Saville-Kent might have managed to keep one secret, but he was determined to share another. His innovative photographs of the Great Barrier Reef represented what the art historian Ann Elias calls an “emerging compulsion” among scientists and photographers to document living coral, and his successors would go to even greater lengths to expose reefs to a global audience.

In the 1910s and ’20s, the American explorer Ernest Williamson photographed Caribbean reefs at a depth of 150 feet from his “photosphere,” a chamber fitted with a funnel-shaped glass window and tethered to the surface with an air hose. The Australian photographer Frank Hurley, who in the 1920s followed Saville-Kent’s footsteps along the Queensland coast, built a surf-side aquarium in which he reconstructed underwater scenes in order to film and photograph them. By the mid-20th century, thanks to advances in diving technology, the curious could see living reefs firsthand, and underwater cameras allowed filmmakers to capture reefs in vibrant moving color. While the makers of today’s high-definition deep-sea documentaries are equipped with technology, and budgets, that Saville-Kent could only dream of, they are still surfacing the secrets of the ocean. In some ways, they have succeeded: While the ocean is still a place of mystery, it is better understood and less feared than it used to be.

But here, too, observation has its limits. Saville-Kent’s carefully framed photographs didn’t document the ravages of colonialism, or the industrial greenhouse-gas emissions that were already accumulating in the atmosphere. Hurley didn’t recognize that the coral polyps confined in his aquarium, stressed by the sun-heated water, were expelling their symbiotic algae, sacrificing both their color and their food supply in a process we now call coral bleaching. Neither could have imagined that in the early years of the 21st century, the ocean would grow warm enough to regularly bleach huge swaths of the Great Barrier Reef. Even today, the slow violence of climate change is often invisible, and resistant to scrutiny. But the identity of its perpetrators is no secret at all.

Individualism Is Still Sabotaging the Pandemic Response


During a pandemic, no one’s health is fully in their own hands. No field should understand that more deeply than public health, a discipline distinct from medicine. Whereas doctors and nurses treat sick individuals in front of them, public-health practitioners work to prevent sickness in entire populations. They are expected to think big. They know that infectious diseases are always collective problems because they are infectious. An individual’s choices can ripple outward to affect cities, countries, and continents; one sick person can seed a hemisphere’s worth of cases. In turn, each person’s odds of falling ill depend on the choices of everyone around them—and on societal factors, such as poverty and discrimination, that lie beyond their control.

Across 15 agonizing months, the COVID-19 pandemic repeatedly confirmed these central concepts. Many essential workers, who held hourly-wage jobs with no paid sick leave, were unable to isolate themselves for fear of losing their livelihood. Prisons and nursing homes, whose residents have little autonomy, became hot spots for the worst outbreaks. Black and Latino communities that were underserved by the existing health system were disproportionately infected and killed by the new coronavirus, and now have among the lowest vaccination rates in the country.

Perhaps that’s why so many public-health experts were disquieted when, on May 13, the CDC announced that fully vaccinated Americans no longer needed to wear masks in most indoor places. “The move today was really to talk about individuals and what individuals are safe doing,” Rochelle Walensky, the agency’s director, told PBS NewsHour. “We really want to empower people to take this responsibility into their own hands.” Walensky later used similar language on Twitter: “Your health is in your hands,” she wrote.

Framing one’s health as a matter of personal choice “is fundamentally against the very notion of public health,” Aparna Nair, a historian and anthropologist of public health at the University of Oklahoma, told me. “For that to come from one of the most powerful voices in public health today … I was taken aback.” (The CDC did not respond to a request for comment.) It was especially surprising coming from a new administration. Donald Trump was a manifestation of America’s id—an unempathetic narcissist who talked about dominating the virus through personal strength while leaving states and citizens to fend for themselves. Joe Biden, by contrast, took COVID-19 seriously from the off, committed to ensuring an equitable pandemic response, and promised to invest $7.4 billion in strengthening America’s chronically underfunded public-health workforce. And yet, the same peal of individualism that rang in his predecessor’s words still echoes in his. “The rule is very simple: Get vaccinated or wear a mask until you do,” Biden said after the CDC announced its new guidance. “The choice is yours.”

From its founding, the United States has cultivated a national mythos around the capacity of individuals to pull themselves up by their bootstraps, ostensibly by their own merits. This particular strain of individualism, which valorizes independence and prizes personal freedom, transcends administrations. It has also repeatedly hamstrung America’s pandemic response. It explains why the U.S. focused so intensely on preserving its hospital capacity instead of on measures that would have saved people from even needing a hospital. It explains why so many Americans refused to act for the collective good, whether by masking up or isolating themselves. And it explains why the CDC, despite being the nation’s top public-health agency, issued guidelines that focused on the freedoms that vaccinated people might enjoy. The move signaled to people with the newfound privilege of immunity that they were liberated from the pandemic’s collective problem. It also hinted to those who were still vulnerable that their challenges were now theirs alone and, worse still, that their lingering risk was somehow their fault. (“If you’re not vaccinated, that, again, is taking your responsibility for your own health into your own hands,” Walensky said.)

Neither is true. About half of Americans have yet to receive a single vaccine dose; for many of them, lack of access, not hesitancy, is the problem. The pandemic, meanwhile, is still just that—a pandemic, which is raging furiously around much of the world, and which still threatens large swaths of highly vaccinated countries, including some of their most vulnerable citizens. It is still a collective problem, whether or not Americans are willing to treat it as such.

Individualism can be costly in a pandemic. It represents one end of a cultural spectrum with collectivism at the other—independence versus interdependence, “me first” versus “we first.” These qualities can be measured by surveying attitudes in a particular community, or by assessing factors such as the proportion of people who live, work, or commute alone. Two studies found that more strongly individualistic countries tended to rack up more COVID-19 cases and deaths. A third suggested that more individualistic people (from the U.S., U.K., and other nations) were less likely to practice social distancing. A fourth showed that mask wearing was more common in more collectivist countries, U.S. states, and U.S. counties—a trend that held after accounting for factors including political affiliation, wealth, and the pandemic’s severity. These correlative studies all have limitations, but across them, a consistent pattern emerges—one supported by a closer look at the U.S. response.

“From the very beginning, I’ve thought that the way we’ve dealt with the pandemic reflects our narrow focus on the individual,” Camara Jones, a social epidemiologist at Morehouse School of Medicine, told me. Testing, for instance, relied on slow PCR-based tests to diagnose COVID-19 in individual patients. This approach makes intuitive sense—if you’re sick, you need to know why—but it cannot address the problem of “where the virus actually is in the population, and how to stop it,” Jones said. Instead, the U.S. could have widely distributed rapid antigen tests so that people could regularly screen themselves irrespective of symptoms, catch infections early, and isolate themselves when they were still contagious. Several sports leagues successfully used rapid tests in exactly this way, but they were never broadly deployed, despite months of pleading from experts.

The U.S. also largely ignored other measures that could have protected entire communities, such as better ventilation, high-filtration masks for essential workers, free accommodation for people who needed to isolate themselves, and sick-pay policies. As the country focused single-mindedly on a vaccine endgame, and Operation Warp Speed sped ahead, collective protections were left in the dust. And as vaccines were developed, the primary measure of their success was whether they prevented symptomatic disease in individuals.

Vaccines, of course, can be a collective solution to infectious disease, especially if enough people are immune that outbreaks end on their own. And even if the U.S. does not achieve herd immunity, vaccines will offer a measure of collective protection. As well as preventing infections—severe and mild, symptomatic and asymptomatic, vanilla and variant—they also clearly make people less likely to spread the virus to one another. In the rare event that fully vaccinated people get breakthrough infections, these tend to be milder and shorter (as recently seen among the New York Yankees); they also involve lower viral loads. “The available evidence strongly suggests that vaccines decrease the transmission potential of vaccine recipients who become infected with SARS-CoV-2 by at least half,” wrote three researchers in a recent review. Another team estimated that a single dose of Moderna’s vaccine “reduces the potential for transmission by at least 61 percent, possibly considerably more.”

Even if people get their shots purely to protect themselves, they also indirectly protect their communities. In Israel and the U.S., rising proportions of immunized adults led to plummeting case numbers among children, even though the latter are too young to be vaccinated themselves. “For people who do not get vaccinated and remain vulnerable, their risk is still greatly reduced by the immunity around them,” Justin Lessler, an epidemiologist at Johns Hopkins, told me.

There’s a catch, though. Unvaccinated people are not randomly distributed. They tend to cluster together, socially and geographically, enabling the emergence of localized COVID-19 outbreaks. Partly, these clusters exist because vaccine skepticism grows within cultural and political divides, and spreads through social networks. But they also exist because decades of systemic racism have pushed communities of color into poor neighborhoods and low-paying jobs, making it harder for them to access health care in general, and now vaccines in particular.

“This rhetoric of personal responsibility seems to be tied to the notion that everyone in America who wants to be vaccinated can get a vaccine: You walk to your nearest Walgreens and get your shot,” Gavin Yamey, a global-health expert at Duke, told me. “The reality is very different.” People who live in poor communities might not be near vaccination sites, or have transportation options for reaching one. Those working in hourly jobs might be unable to take time off to visit a clinic, or to recover from side effects. Those who lack internet access or regular health-care providers might struggle to schedule appointments. Predictably, the new pockets of immune vulnerability map onto old pockets of social vulnerability.

According to a Kaiser Family Foundation survey, a third of unvaccinated Hispanic adults want a vaccine as soon as possible—twice the proportion of unvaccinated whites. But 52 percent of this eager group were worried that they might need to miss work because of the reputed side effects, and 43 percent feared that getting vaccinated could jeopardize their immigration status or their families’. Unsurprisingly then, among the states that track racial data for vaccinations, just 32 percent of Hispanic Americans had received at least one dose by May 24, compared with 43 percent of white people. The proportion of at least partly vaccinated Black people was lower still, at 29 percent. And as Lola Fadulu and Dan Keating reported in The Washington Post, Black people now account for 82 percent of COVID-19 cases in Washington, D.C., up from 46 percent at the end of last year. The vaccines have begun to quench the pandemic inferno, but the remaining flames are still burning through the same communities who have already been disproportionately scorched by COVID-19—and by a much older legacy of poor health care.

For unvaccinated people, the pandemic’s collective problem not only persists, but could deepen. “We’re entering a time when younger children are going to be the biggest unvaccinated population around,” Lessler told me. Overall, children are unlikely to have severe infections, but that low individual risk is still heightened by social factors; it is telling that more than 75 percent of the children who have died from COVID-19 were Black, Hispanic, or Native American. And when schools reopen for in-person classes, children can still spread the virus to their families and communities. “Schools play this fairly unique role in life,” Lessler said. “They’re places where a lot of communities get connected up, and they give the virus the ability, even if there’s not much transmission happening, to make its way from one pocket of unvaccinated people to another.”

Schools aren’t helpless. Lessler has shown that they can reduce the risk of seeding community outbreaks by combining several protective measures, such as regular symptom screenings and masks for teachers, and tying their use to community incidence. But he worries that schools might instead pull back on such measures, whether in reaction to the CDC’s new guidance or because of complacency about an apparently waning pandemic. He worries, too, that such complacency may become widespread. Yes, vaccines substantially lower the odds that people will spread the virus, but those nonzero odds will creep upward if other protective measures are widely abandoned. The onset of cooler weather in the fall might increase them further. So might the arrival of new variants.

The Alpha variant of the new coronavirus (B.1.1.7, now the most common U.S. lineage) can already spread more easily than the original virus. The Delta variant (B.1.617.2, which has raised concerns after becoming dominant in the U.K. and India) could be more transmissible still. An assessment from the U.K. suggests that a single vaccine dose is less protective against Delta than its predecessors, although two doses are still largely effective. For now, vaccines are still beating the variants. But the variants are pummeling the unvaccinated.

“My biggest concern is that those who are unvaccinated will have a false sense of safety and security as cases drop this summer,” says Joseph Allen, who directs Harvard’s Healthy Buildings program. “It might feel like the threat has fully diminished if this is in the news less often, but if you’re unvaccinated and you catch this virus, your risk is still high.” Or perhaps higher: In the U.S., unvaccinated people might be less likely to encounter someone infectious. But on each such encounter, their odds of catching COVID-19 are now greater than they were last year.

When leaders signal to vaccinated people that they can tap out of the collective problem, that problem is shunted onto a smaller and already overlooked swath of society. And they do so myopically. The longer rich societies ignore the vulnerable among them, and the longer rich nations neglect countries that have barely begun to vaccinate their citizens, the more chances SARS-CoV-2 has to evolve into variants that spread even faster than Delta, or—the worst-case scenario—that finally smash through the vaccines’ protection. The virus thrives on time. “The longer we allow the pandemic to rage, the less protected we’ll be,” Morehouse’s Camara Jones says. “I think we’re being a bit smug about how well protected we are.”

Ian Mackay, a virologist at the University of Queensland, famously imagined pandemic defenses as layers of Swiss cheese. Each layer has holes, but when combined, they can block a virus. In Mackay’s model, vaccines were the last layer of many. But the U.S. has prematurely stripped the others away, including many of the most effective ones. A virus can evolve around a vaccine, but it cannot evolve to teleport across open spaces or punch its way through a mask. And yet, the country is going all in on vaccines, even though 48 percent of Americans still haven’t had their first dose, and despite the possibility that it might fall short of herd immunity. Instead of asking, “How do we end the pandemic?” it seems to be asking, “What level of risk can we tolerate?” Or perhaps, “Who gets to tolerate that risk?”

Consider what happened in May, after the CDC announced that fully vaccinated people no longer needed to wear masks in most indoor places. Almost immediately, several states lifted their mask mandates. At least 24 have now done so, as have many retailers, including Walmart, McDonald’s, Starbucks, Trader Joe’s, and Costco, which now rely on the honor system. The speed of these changes was surprising. When The New York Times surveyed 570 epidemiologists a few weeks before the announcement, 95 percent of them predicted that Americans would need to continue wearing masks indoors for at least half a year.

Some public-health experts have defended the CDC’s new guidance, for at least four reasons. They say that the CDC correctly followed the science, that its new rules allow for more flexibility, that it correctly read the pulse of a fatigued nation, and that it may have encouraged vaccination (although Walensky has denied that this was the CDC’s intention). In sum, vaccinated people should know that they are safe, and act accordingly. By contrast, others feel that the CDC abrogated one of its primary responsibilities: to coordinate safety across the entire population.

In the strictest sense, the CDC’s guidance is accurate; vaccinated people are very unlikely to be infected with COVID-19, even without a mask. “You can’t expect the CDC to not share their scientific assessment because the implications have problems,” Ashish Jha, who heads the Brown University School of Public Health, told me. “They have to share it.” Harvard’s Joseph Allen agrees, and notes that the agency clearly stated that unvaccinated people should continue wearing masks indoors. And having some flexibility is useful. “You can’t have 150 million people who are vaccinated and ready to get back to some semblance of what they’re used to, and not have this tension in the country,” he told me. The new guidelines also move the U.S. away from top-down mandates, recognizing that “decisions are rightly shifting to the local level and individual organizations,” Allen wrote in The Washington Post. If some organizations and states pulled their mask mandate too early, he told me, “that’s an issue not with the CDC but with how people are acting based on its guidance.”

It’s true, too, that the CDC is in a difficult position. It had emerged from a year of muzzling and interference from the Trump administration, and was operating in a climate of polarization and public fatigue. “When agencies are putting out recommendations that people aren’t following, that undermines their credibility,” Jha told me. “The CDC, as a public-health agency, must be sensitive to where the public is.” And by May, “there was a sense that mask mandates were starting to topple.”

But that problem—that collective behavior was starting to change against collective interest—shows the weaknesses of the CDC’s decisions. “Science doesn’t stand outside of society,” Cecília Tomori, an anthropologist and a public-health scholar at Johns Hopkins, told me. “You can’t just ‘focus on the science’ in the abstract,” and especially not when you’re a federal agency whose guidance has been heavily politicized from the get-go. In that context, it was evident that the new guidance “would send a cultural message that we don’t need masks anymore,” Tomori said. Anticipating those reactions “is squarely within the expertise of public health,” she added, and the CDC could have clarified how its guidelines should be implemented. It could have tied the lifting of mask mandates to specific levels of vaccination, or the arrival of worker protections. Absent that clarity, and with no way for businesses to even verify who is vaccinated, a mass demasking was inevitable. “If you’re blaming the public for not understanding the guidance—wow,” Duke’s Gavin Yamey said. “If people have misunderstood your guidance, your guidance was poor and confusing.”

Meanwhile, the idea that the new guidance led to more vaccinations is likely wrong. “I’ve overseen close to 10,000 people being vaccinated, and I’ve yet to hear ‘I can take the mask off’ as a reason,” Theresa Chapple-McGruder, a local-health-department director, told me. Although visits to the site vaccines.gov spiked after the CDC’s announcement, actual vaccination rates increased only among children aged 12 to 15, who had become eligible the day before. Meanwhile, a KFF survey showed that 85 percent of unvaccinated adults felt that the new guidance didn’t change their vaccination plans. Only 10 percent said they were more likely to get vaccinated, while 4 percent said they were less likely. Vaccination rates are stuck on a plateau.

Creating incentives for vaccination is vital; treating the removal of an important protective measure as an incentive is folly. The latter implicitly supports the individualistic narrative that masks are oppressive burdens “that people need to get away from to get back to ‘normal,’” Rhea Boyd, a pediatrician and public-health advocate from the Bay Area, told me. In fact, they are an incredibly cheap, simple, and effective means of collective protection. “The pandemic made clear that the world is vulnerable to infectious disease and we should normalize the idea of precaution, as we see in other countries that have faced similar epidemics,” Boyd said. “But recommendations like this say, This is something we put behind us, rather than something we put in our back pocket.”

Collective action is not impossible for a highly individualistic country; after all, a majority of Americans used and supported masks. But such action erodes in the absence of leadership. In the U.S., only the federal government has the power and financial freedom to define and defend the collective good at the broad scales necessary to fight a pandemic. “Local public health depends on guidance from the federal level,” Chapple-McGruder said. “We don’t make local policies that fly in the face of national guidance.” Indeed, the CDC’s guidance prompted some local leaders to abandon sensible strategies: North Carolina’s governor had planned to lift COVID-19 restrictions after two-thirds of the state had been vaccinated, but did so the day after the CDC’s announcement, when only 41 percent had received their first dose. Meanwhile, Iowa and Texas joined Florida in preventing cities, counties, schools, or local institutions from issuing mask mandates. Rather than ushering in an era of flexibility, the CDC has arguably triggered a chain of buck-passing, wherein responsibility for one’s health is once again shunted all the way back to individuals. “Often, Let everyone decide for themselves is the easiest policy decision to make, but it’s a decision that facilitates spread of COVID in vulnerable communities,” Julia Raifman, a health-policy researcher at Boston University, told me.

The CDC’s own website lists the 10 essential public-health services—a set of foundational duties arranged in a colorful wheel. And at the center of that wheel, uniting and underpinning everything else, is equity—a commitment to “protect and promote the health of all people in all communities.” The CDC’s critics say that it has abandoned this central tenet of public health. Instead, its guidelines centered people who had the easiest and earliest access to vaccines, while overlooking the most vulnerable groups. These include immunocompromised people, for whom the shots may be less effective; essential workers, whose jobs place them in prolonged contact with others; and Black and Latino people, who are among the most likely to die of COVID-19 and the least likely to have been vaccinated.

During a pandemic, “someone taking all the personal responsibility in the world may still be affected by a lack of coordinated safety,” Raifman said. “They may be vaccinated but less protected because they are immunosuppressed and get the disease working in a grocery store amidst unmasked people. They may have a child who cannot be vaccinated, and miss work if that child gets COVID.” As Eleanor Murray, an epidemiologist at Boston University, said on Twitter, “Don’t tell me it’s ‘safe’; tell me what level of death or disability you are implicitly choosing to accept.” When Rochelle Walensky said, “It’s safe for vaccinated people to take off their masks,” she was accurate, but left unaddressed other, deeper questions: How much additive burden is a country willing to foist upon people who already carry their disproportionate share? What is America’s goal—to end the pandemic, or to suppress it to a level where it mostly plagues communities that privileged individuals can ignore?

“When you’re facing an epidemic, the responsibility of public health is to protect everybody, but those made vulnerable first,” Boyd, the pediatrician, told me. “If you have protection, the CDC is glad for you, but their role is not the same for you. Their role is to keep those most at risk of infection and death from exposure.”

America is especially prone to the allure of individualism. But that same temptation has swayed the entire public-health field throughout its history. The debate about the CDC’s guidance is just the latest step in a centuries-old dance to define the very causes of disease.

In the early 19th century, European researchers such as Louis-René Villermé and Rudolf Virchow correctly recognized that disease epidemics were tied to societal conditions like poverty, poor sanitation, squalid housing, and dangerous jobs. They understood that these factors explain why some people become sick and others don’t. But this perspective slowly receded as the 19th century gave way to the 20th.

During those decades, researchers confirmed that microscopic germs cause infectious diseases, that occupational exposures to certain chemicals can cause cancers, that vitamin deficiencies can lead to nutritional disorders like scurvy, and that genetic differences can lead to physical variations among people. “Here … was a world in which disease was caused by germs, carcinogens, vitamin deficiencies, and genes,” wrote the epidemiologist Anthony J. McMichael in his classic 1999 paper, “Prisoners of the Proximate.” Public health itself became more individualistic. Epidemiologists began to see health largely in terms of personal traits and exposures. They became focused on finding “risk factors” that make individuals more vulnerable to disease, as if the causes of sickness play out purely across the boundaries of a person’s skin.

“The fault is not in doing such studies, but in only doing such studies,” McMichael wrote. Liver cirrhosis, for example, is caused by alcohol, but a person’s drinking behavior is influenced by their culture, occupation, and exposure to advertising or peer pressure. The distribution of individual risk factors—the spread of germs, the availability of nutritious food, one’s exposure to carcinogens—is always profoundly shaped by cultural and historical forces, and by inequities of race and class. “Yet modern epidemiology has largely ignored these issues of wider context,” McMichael wrote.

“The field has moved forward since then,” Nancy Krieger, a social epidemiologist at Harvard, told me. Epidemiology is rediscovering its social side, fueled by new generations of researchers who don’t come from traditional biomedical backgrounds. “When I started out in the mid-1980s, there were virtually no sessions [at academic conferences] about class, racism, and health in the U.S.,” Krieger said. “Now they’re commonplace.” But these connections have yet to fully penetrate the wider zeitgeist, where they are still eclipsed by the rhetoric of personal choice: Eat better. Exercise more. Your health is in your hands.

This is the context in which today’s CDC operates, and against which its choices must be understood. The CDC represents a field that has only recently begun to rebalance itself after long being skewed toward individualism. And the CDC remains a public-health agency in one of the most individualistic countries in the world. Its mission exists in tension with its environment. Its choice to resist that tension or yield to it affects not only America’s fate, but also the soul of public health—what it is and what it stands for, whom it serves and whom it abandons.