
Columbia University’s Anti-Semitism Problem

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 03 › columbia-antisemitism-israel-palestine-trump › 682054

In January, when the historian Avi Shilon returned to Columbia University from winter break, a thought coursed through his mind: If calm can take hold in Gaza, then perhaps it could also happen in Morningside Heights. Just a few days earlier, in time for the start of the semester, Hamas and Israel had brokered a cease-fire in their war.

Over the many months of that war, Columbia was the site of some of America’s most vitriolic protests against Israel’s actions, and even its existence. For two weeks last spring, an encampment erected by anti-Israel demonstrators swallowed the fields in the center of the compact Manhattan campus. Nobody could enter Butler Library without hearing slogans such as “Globalize the intifada!” and “We don’t want no Zionists here!” and “Burn Tel Aviv to the ground!” At the end of April, students, joined by sympathizers from outside the university gates, stormed Hamilton Hall—which houses the undergraduate-college deans’ offices—and then battled police when they sought to clear the building. Because of the threat of spiraling chaos, the university canceled its main commencement ceremony in May.

Shilon felt that the tamping of hostilities in Gaza made the moment ripe for the course he was scheduled to teach, “History of Modern Israel,” which would examine the competing Jewish and Palestinian narratives about his native country’s founding.

But Columbia soon disabused him of his hopes. About 30 minutes into the first session of his seminar, four people, their faces shrouded in keffiyehs, burst into his classroom. A protester circled the seminar table, flinging flyers in front of Shilon’s students. One flyer bore an image of a boot stomping on a Star of David; another stated, The Enemy Will Not See Tomorrow.

In the Israeli universities where Shilon had studied and taught, he was accustomed to strident critiques of the country. Sometimes he even found himself sympathizing with them. Taking up difficult arguments struck him as the way to navigate tense disagreements, so he rose from his chair and gingerly approached the protesters. “You’re invited to learn,” he told them.

But the protesters ignored him. As one held up a camera to film, another stared at it and delivered a monologue in which she described Shilon’s class—which had barely progressed beyond a discussion of expectations for the semester—as an example of “Columbia University’s normalization of genocide.”

After she finished her speech, the demonstrators left the room, but a sense of intrusion lingered. Columbia University Apartheid Divest, the umbrella group that organized protests on campus, posted a video of the action, with the caption: “We disrupted a zionist class, and you should too.” The university later offered to provide security for Shilon’s class because it couldn’t be sure if CUAD was bluffing.

Over the past two years, Columbia’s institutional life has become more and more absurd. Confronted with a war on the other side of the world, the course of which the university has zero capacity to affect, a broad swath of the community acted as if the school’s trustees and administrators could determine the fate of innocent families in Gaza. To force the university into acceding to demands—ending study abroad in Israel, severing a partnership with Tel Aviv University, divesting from companies with holdings in Israel—protesters attempted to shut down campus activity. For the sake of entirely symbolic victories, they were willing to risk their academic careers and even arrest.

Because the protesters treated the war as a local issue, they trained their anger on Jewish and Israeli students and faculty, including Shilon, some of whom have been accused of complicity with genocide on the basis of their religious affiliation or national origin. More than any other American university, Columbia experienced a breakdown in the fabric of its community that demanded a firm response from administrators—but these administrators tended to choke on their own fears.

Many of the protesters followed university rules governing demonstrations and free expression. Many others did not. Liberal administrators couldn’t or wouldn’t curb the illiberalism in their midst. By failing to discipline protesters who transgressed university rules, they signaled that disrupting classrooms carried no price. By tolerating professors who bullied students who disagreed with them, they signaled that incivility and even harassment were acceptable forms of discourse.

It was as if Columbia were reliving the bedlam of 1968, which included a student takeover of the university and scarred the institution for decades. And just as in the Vietnam era, the university became a ripe target for demagogues on the right, who are eager to demolish the prestige of elite higher education. And now that Donald Trump and his allies control the federal government, they have used anti-Semitism as a pretext for damaging an institution that they abhor. In the name of rescuing the Jews of Columbia, the Trump administration cut off $400 million in federal contracts and grants to the university. Trump officials then sent a letter demanding—as preconditions for restoring the funds—a series of immediate, far-reaching steps, including suspending and expelling Hamilton Hall protesters, producing a plan to overhaul admissions, and putting the school’s Department of Middle Eastern, South Asian, and African Studies under “academic receivership.”

Mark Rudd, the leader of Columbia’s chapter of Students for a Democratic Society, addresses students at Columbia University in May 1968. (Hulton Archive / Getty)

And in an attempt to suppress political views it dislikes, the administration authorized the unlawful detention of Mahmoud Khalil, an alumnus who helped organize campus protests, and sent federal agents to search two dorm rooms. Another graduate student, targeted by Immigration and Customs Enforcement, fled to Canada rather than risk apprehension. The Trump administration’s war on Columbia stands to wreck research, further inflame tensions on campus, and destroy careers—including, in a supreme irony, those of many Jewish academics, scientists, physicians, and graduate students whom the administration ostensibly wants to protect.

Trump’s autocratic presence unbalances every debate. But just because his administration is exploiting the issue of anti-Semitism does not mean that anti-Jewish activism is not an issue at Columbia. Somewhere along the way, one of the nation’s greatest universities lost its capacity to conduct intellectual arguments over contentious issues without resorting to hyperbole and accusations of moral deficiency.

On Israel, the issue that most sharply divides Columbia, such accusations took a sinister cast. Jewish students faced ostracism and bullying that, if experienced by any other group of students on campus, would be universally regarded as unacceptable. It was a crisis that became painfully evident in the course of the war over Gaza, but it didn’t begin with the war, and it won’t end with it.

The story of American Jewry can be told, in part, by the history of Columbia’s admissions policy. At the turn of the 20th century, when entry required merely passing an exam, the sons of Jewish immigrants from Eastern Europe began rushing into the institution. By 1920, Columbia was likely 40 percent Jewish. This posed a marketing problem for the school, as the children of New York’s old knickerbocker elite began searching out corners of the Ivy League with fewer Brooklyn accents.

To restore Anglo-Saxon Protestant demographic dominance, university president Nicholas Murray Butler invented the modern college-application process, in which concepts such as geographic diversity and a well-rounded student body became pretexts to weed out studious Jews from New York City. In 1921, Columbia became the first private college to impose a quota limiting the number of Jews. (In the ’30s, Columbia rejected Richard Feynman, who later won a Nobel Prize in physics, and Isaac Asimov, the great science fiction writer.) Columbia, however, was intent on making money off the Jews it turned away, so to educate them, it created Seth Low Junior College in Brooklyn, a second-rate version of the Manhattan institution.

Only after World War II, when America fought a war against Nazism, did this exclusionary system wither away. When I attended Columbia for four blissful years, a generation or so ago, the school was a Jewish wonderland, where I first encountered the pluralism of American Jewish life. I became friends with red-diaper babies, kids raised in Jewish socialist families. I dated an Orthodox woman who had converted from evangelical Christianity. Several floors of my dorm had been nicknamed Anatevka, after the shtetl in Fiddler on the Roof; they had kosher kitchens, and on the Sabbath, the elevators would automatically stop on each of those floors. I studied Yiddish with a doyenne of the dying Yiddish theater and attended lectures with Yosef Yerushalmi, one of the great Jewish historians of his generation. At Columbia, for the first time in my life, I felt completely at home in my identity.

I also imbibed the university’s protest culture: I briefly helped take over Hamilton Hall in the name of preserving the Audubon Ballroom, the Upper Manhattan site of Malcolm X’s assassination. Columbia wanted to convert the building into a research center. The leader of our movement, Benjamin Jealous, who went on to head the NAACP, was suspended for his role; I was put on probation.

Nostalgia, however, is a distorting filter. Long before the October 7 attack by Hamas on southern Israel that sparked the subsequent invasion of Gaza, there were accusations of anti-Semitism on campus. I tended to wish them away, but after the Hamas attack, the evidence kept walloping me.

Although protests against Israel erupted on many campuses after October 7, the collision between Zionists and anti-Zionists was especially virulent at Columbia. Less than a week after the attack, a woman was arrested in front of the library for allegedly beating an Israeli student who was hanging posters of hostages held in Gaza. (The Manhattan district attorney found that the woman hadn’t intentionally hit the student and dismissed the case after she apologized and agreed to counseling.)

Soon after the war in Gaza began, the Columbia Daily Spectator interviewed more than 50 Jewish students about their experiences: 13 told the student newspaper that they had been attacked or harassed; 12 admitted that they had obscured markers of their Jewish identity, tucking away Star of David necklaces and hiding kippot under caps to avoid provoking the ire of fellow students.

To Columbia’s misfortune, the university had a new president, Minouche Shafik, who’d arrived by way of the London School of Economics. Any leader would have been overwhelmed by the explosion of passions, but she seemed especially shell-shocked by the rancor—and how it attracted media, activists, and politicians, all exploiting the controversy for their own purposes. Panicked leaders, without any clear sense of their own direction, have a rote response: They appoint a task force. And in November 2023, Shafik appointed some of Columbia’s most eminent academics to assess the school’s anti-Semitism problem. (Shafik had hoped to have a parallel task force on Islamophobia, but Rashid Khalidi, a Columbia historian and the most prominent Palestinian scholar in the country, called the idea a “fig leaf to pretend that they are ‘balanced,’” and the idea never hatched.)

In “listening sessions” with students, task-force members heard one recurring complaint: that administrators were strangely indifferent to Jewish students complaining about abuse. Rather than investigating incidents, some administrators steered Jewish students to mental-health counseling, as if they needed therapy to toughen them up. Students who had filed official reports of bias with the university claimed that they’d never heard back. (To protect the privacy of listening-session participants, the task force never confirmed specific instances, but it deemed the complaints credible.)

Perhaps, early on, one could imagine benign explanations for the weak response. But in June, as the task force went about its investigation, The Washington Free Beacon reported on a series of text messages fired off by four Columbia deans as they attended a panel on Jewish life at Columbia. (A panel attendee who had sat behind one of the administrators had surreptitiously photographed the text thread over her shoulder.) Instead of sympathetically listening to panelists discuss anti-Semitism, the deans unwittingly confirmed the depth of the problem. These officials, whose role gave them responsibility for student safety, snarkily circulated accusations about the pernicious influence of Jewish power. “Amazing what $$$ can do,” one of the deans wrote. Another accused the head of campus Hillel of playing up complaints for the sake of fundraising. “Comes from such a place of privilege,” one of them moaned. After the Free Beacon published the screenshots, Columbia suspended three of the administrators. Not long after, they resigned.

Later that summer, at the beginning of the academic year, the task force published a damning depiction of quotidian student life. An especially powerful section of the report described the influence of Columbia University Apartheid Divest, the organizer of the anti-Israel protests. CUAD was a coalition of 116 tuition-supported, faculty-advised student groups, including the university mariachi band and the Barnard Garden Club.

CUAD doesn’t simply oppose war and occupation; it endorses violence as the pathway to its definition of liberation. A year ago, a Columbia student activist told an audience watching him on Instagram, “Be grateful that I’m not just going out and murdering Zionists.” At first, CUAD dissociated itself from the student. But then the group reconsidered and apologized for its momentary lapse of stridency. “Violence is the only path forward,” CUAD said in an official statement. That wasn’t a surprising admission; its public statements regularly celebrate martyrdom.

When groups endorsed CUAD, they forced Jewish students to confront a painful choice. To participate in beloved activities, they needed to look past the club’s official membership in an organization that endorsed the killing of Jews and the destruction of the world’s only Jewish-majority country.

According to the task force, complaining about the alliance with CUAD or professing sympathy for Israel could lead to a student being purged from an extracurricular activity. When a member of the dance team questioned the wisdom of supporting CUAD, she was removed from the organization’s group chats and effectively kicked off the team. A co-president of Sewa, a Sikh student group, says that she was removed from her post because of her alleged Zionism. In an invitation to a film screening, the founder of an LGBTQ group, the LezLions, wrote, “Zionists aren’t invited.”

I’m not suggesting that Jews at Columbia feel constantly under siege. When I gave a speech at the campus Hillel group last spring, many members, even some who are passionate supporters of Israel, told me that they are happy at Columbia and have never personally experienced anything resembling anti-Semitism. The pro-Palestinian encampments included Jewish protesters, some of whom received abuse from their fellow Jews. To the task force’s credit, its report acknowledged many such complexities, but it brimmed with accounts of disturbing incidents worthy of a meaningful official response. Unfortunately, that’s not the Columbia way.

Had I been wiser as an undergrad, I could have squinted and seen the roots of the current crisis. In the 1990s, Israel was a nonissue on campus: The Oslo peace process was in high gear, and a two-state solution and coexistence were dreams within reach. But the most imposing academic celebrity on campus was the Jerusalem-born Edward Said, a brilliant professor of literature, who had served as a member of the Palestine Liberation Organization’s legislative arm.

During my years at Columbia, Said, who was battling cancer, was a remote figure. A dandy who loved his tweeds and was immersed in the European cosmopolitanism that he critiqued, he taught only a course on Giuseppe Verdi and imperialism.

Still, he bestrode the university. His masterwork, Orientalism, was one of the few books by an active Columbia professor regularly included in the college’s core curriculum. That book, by the university’s most acclaimed professor, was also a gauntlet thrown in the community’s face. Said had convincingly illustrated how racism infected the production of knowledge in Middle Eastern studies. Even if scholarship paraded as the disinterested study of foreign cultures, it was inherently political, too often infected by a colonialist mindset.

To correct for that bias, admirers of Said’s book concluded, universities needed to hire a different style of academic, including scholars with roots in the region they studied, not just a bunch of white guys fascinated by Arabs. The Middle Eastern–studies department filled with Said protégés, who lacked his charm but taught with ferocious passion. Because they were unabashed activists, these new scholars had no compunction about, say, canceling class so that students could attend pro-Palestinian rallies.

Joseph Massad, a Jordanian-born political scientist who wrote a history of nationalism in his native country, became the most notorious of the new coterie soon after arriving in 1999. His incendiary comments provoked his ideological foes to respond with fury and, sometimes, to unfairly twist his quotes in the course of their diatribes. But his actual record was clear enough. Writing in the Egyptian newspaper Al-Ahram in 2003, he accused the Israelis of being the true anti-Semites, because they destroyed the culture of the Jewish diaspora; the Palestinians were the real Jews, he argued, because they were being massacred.

Violence, when directed at Jews, never seemed to bother him. This moral vacuity was on full display in the column he wrote in response to October 7, which he called a “resistance offensive,” for The Electronic Intifada, a Chicago-based publication aligned with the more radical wing of the Palestinian cause. His essay used a series of euphoric adjectives—“astonishing,” “astounding,” “awesome”—to describe Hamas’s invasion, without ever condemning, let alone mentioning, the gruesome human toll of the massacre, which included rape and the kidnapping of babies. In fact, he coldly described the towns destroyed by Hamas as “settler-colonies.”

Massad has long been accused of carrying that polemical style into the classroom. In the course description for a class called “Palestinian and Israeli Politics and Societies,” he wrote in 2002: “The purpose of the course is not to provide a ‘balanced’ coverage of the views of both sides.” On the one hand, that’s an admirable admission. On the other hand, Jewish students complained that he treated those with dissenting opinions as if they were moral reprobates, unworthy of civility.

In 2004, a pro-Israel group in Boston put together a low-budget documentary called Columbia Unbecoming, which strung together student testimony about the pedagogical style of Columbia’s Middle Eastern–studies program. To take two representative incidents: After an Israeli student asked Massad a question at an extracurricular event, the professor demanded to know how many Palestinians he had killed; a woman recounted how another professor, George Saliba, had told her not to opine on Israel-Palestine questions because her green eyes showed that she couldn’t be a “Semite.”

In response, Massad denied ever meeting the Israeli student; Saliba wrote that he didn’t recall the green-eyes comments and that the student might have misconstrued what he was saying. Columbia’s then-president, Lee Bollinger, instantly recognized the problem and appointed his own task force to examine the complaints. But it would have taken more than a task force to address the underlying problem. The emerging style of the American academy, especially prevalent at Columbia, viewed activism flowing from moral absolutes as integral to the mission of the professoriat. But a style that prevailed in African American–studies and gender-studies departments was incendiary when applied to Israel. With race and gender, there was largely a consensus on campus, but Israel divided the university community. And as much as Bollinger professed to value dissenting opinions, his university was ill-equipped to accommodate two conflicting points of view. And the gap between those two points of view kept growing, as Said’s legacy began to seep into even the far reaches of Columbia.

If I were writing a satiric campus novel about Columbia, I would have abandoned the project on January 29. That’s the day the Spectator published lab notes for an introductory astronomy course, written by a teaching assistant, that instructed students: “As we watch genocide unfold in Gaza, it is also important to tell the story of Palestinians outside of being the subjects of a military occupation. Take 15 minutes or so to read through the articles ‘Wonder and the Life of Palestinian Astronomy’ and ‘In Gaza, Scanning the Sky for Stars, Not Drones.’ Remind yourself that our dreams, our wonders, our aspirations … are not any more worthy.” At Columbia, a student couldn’t contemplate the Big Dipper without being forced to consider the fate of Khan Yunis.

This was a minor scandal, but a representative one. Over the years, the subject of Israel became nearly inescapable at Columbia, even in disciplines seemingly far removed from Gaza. For a swath of graduate students and professors, Palestinian liberation—and a corollary belief that Israel is uniquely evil among nations—became something close to civic religion.

In 2023, at the School of Public Health, a professor who taught a section of its core curriculum to more than 400 students denounced Jewish donors to the university as “wealthy white capitalists” who laundered “blood money” through the school. He hosted a panel on the “settler-colonial determinants of health” that described “Israel-Palestine” as a primary example of a place where the “right to health” can never be realized. Several years ago, the Graduate School of Architecture, Preservation and Planning offered a class on “Architecture and Settler Colonialism” and hosted an event titled “Architecture Against Apartheid.”

By insisting that Israel is the great moral catastrophe of our age, professors and graduate students transmitted their passions to their classes. So it is not surprising that Jewish students with sympathy for Israel found themselves subject to social opprobrium not just from their teachers, but also from their peers. In its September report, the task force that Shafik had convened described the problem starkly: “We heard about students being avoided and avoiding others” and about “isolation and even intimidation in classrooms, bullying, threats, stereotypes, ethnic slurs, disqualification from opportunities, fear of retaliation and community erosion.” This was the assessment of Columbia professors, many of them unabashed liberals, who risked alienating colleagues by describing the situation bluntly.

Pro-Palestinian protesters march around Columbia in April 2024. (Michael M. Santiago / Getty)

In September, the task force presented its findings to Columbia’s University Senate, an elected deliberative body that brings faculty, administrators, and students into the governance of the institution. Its creation was a utopian response to the 1968 protests. But the senate session about anti-Semitism was a fiasco. Almost from the start, members began to attack the task-force report’s motives and methodology—even its focus on discrimination against Jews. “No such resources were put into covering anybody else’s subjective experience on this campus,” the English professor Joseph Slaughter said, “and I think that creates real problems for the community.” The hostility to the report wasn’t meaningless fulmination; it was evidence of how a large part of the faculty was determined to prevent the university from acknowledging the presence of anti-Jewish activity in the school.

No other university has a governance structure quite like Columbia’s, and for good reason. Most academics have busy lives and want to avoid endless meetings with their colleagues, so professors aren’t exactly rushing to join the senate. In recent years, the senate has attracted those of an activist bent, who are willing to put up with tedium in service of a higher cause. Two members of the rules committee were allegedly part of a faculty contingent that stood guard around the encampments on the quad. They did so even though their committee had jurisdiction over any potential discipline of those protesters. As it happens, exceedingly few of the protesters who flagrantly disregarded university rules have suffered any consequences for their actions. Columbia didn’t impose discipline on students who stormed Hamilton Hall last spring—at least not until last week, amid Trump’s threat of drastic cuts to the university. But by then, a culture of impunity was firmly rooted.

Barnard College is integrated into Columbia, but it has its own set of rules, its own governance structure, and its own disciplinary procedures. And it acted swiftly to expel two of the students who were in the group that burst into Avi Shilon’s class in January. (Columbia had suspended another participant, pending an investigation, and failed to identify the fourth.) For once, it felt as if the university was upholding its basic covenant with its students: to protect the sanctity of the classroom.

But instead of changing anyone’s incentives, Barnard’s hard-line punishment inspired protesters to rush Milbank Hall, banging drums and chanting, “There is only one solution, intifada revolution.” In the course of storming the building, they allegedly assaulted a Barnard employee, sending him to the hospital. For more than six hours, they shut down the building, which houses the offices of the administration, and left only after the college threatened to bring in the police and offered an official meeting with the protesters. But the possibility of police action wasn’t a sufficient deterrent, because a week later, two dozen protesters returned to occupy Barnard’s library.

In some deep sense, the university had lost the capacity to reassert control, let alone confront the root causes of the chaos. And looking back over the past few months, I see a pattern of events that, in some ways, is far more troubling than the encampments that received so many headlines. In November, protesters descended on the building that houses Hillel, the center of Jewish life on campus—its main purpose is to provide Jewish students with religious services and kosher food—and demanded that the university sever ties with the organization. The next month, a demonstrator marching up Broadway punched a kippah-wearing Jew in the face. In January, to memorialize the murder of a Palestinian girl, protesters filled the toilets of the School of International and Public Affairs with cement. Skewering two Jewish women affiliated with the school—its dean, Keren Yarhi-Milo, and an adjunct assistant professor, Rebecca Weiner—they spray-painted the message “Keren eat Weiner,” with an image of feces.

All of this unfolded as the Trump administration launched an assault on higher education. But thus far, Columbia students haven’t bothered to protest that. Unlike Palestine, which for most students is a distant cause, the stripping of federal funding for the institution will ripple through the lives of students and faculty. But university activism has its sights obsessively locked on Israel.

That Trump assault on Columbia has now arrived, in its most heavy-handed form. Anti-Semitism on campus, a problem that merits a serious response, has been abused in the course of Trump’s quest to remake America in his image. Tellingly, the administration’s withholding of federal grants will fall hardest on the hard sciences, which are the part of the university most immune to anti-Semitism, and hardly touch the humanities, where overwrought criticisms of Israel flourish.

The indiscriminate, punitive nature of Trump’s meddling may unbalance Columbia even further. A dangerous new narrative has emerged there and on other campuses: that the new federal threats result from “fabricated charges of antisemitism,” as CUAD recently put it, casting victims of harassment as the cunning villains of the story. In this atmosphere, Columbia seems unlikely to reckon with the deeper causes of anti-Jewish abuse on its campus. But in its past—especially in the dismantling of its discriminatory treatment of Jews—the institution has revealed itself capable of overcoming its biases, conscious and otherwise, against an excluded group. It has shown that it can stare hard at itself, channel its highest values, and find its way to a better course.

America Is Botching Measles

The Atlantic

www.theatlantic.com › health › archive › 2025 › 03 › america-measles-response-rfk-texas › 681967

Until this year, public-health officials have abided by a simple playbook for measles outbreaks: Get unvaccinated people vaccinated, as quickly as possible. The measles component of the measles, mumps, and rubella shot that nearly all American kids receive today is “one of the best vaccines we have,” William Moss, a measles expert at Johns Hopkins Bloomberg School of Public Health, told me. Two doses in early childhood are enough to cut someone’s risk of getting measles by 97 percent. And vaccination is the only surefire way to slow the spread of the wildly contagious disease.

In the weeks since a measles outbreak began in West Texas and spilled into neighboring New Mexico, local health departments have run that play, scrambling to set up free vaccination clinics. The federal government, though, appears to be writing its own rules for the game. The epidemic has already surpassed 200 known cases. But that’s likely a drastic undercount, experts told me. And it appears to have claimed at least two lives, including that of a six-year-old unvaccinated child. And yet, the CDC waited to release its first statement on the outbreak until a month or so after the epidemic began, and even then, it didn’t directly urge parents to get their kids up-to-date on MMR shots.

More recently, the Department of Health and Human Services has called for doses of the vaccine to be shipped to Texas; at the same time, HHS is working on dispatching vitamin A to the region, and the department’s new secretary, Robert F. Kennedy Jr., is overinflating the importance of those supplements in managing measles. In some parts of Texas, vitamin-A-rich cod-liver oil is flying off shelves, while some parents are doubling down on their hesitations over vaccines.

When reached for comment, an HHS spokesperson noted that the “CDC recommends vaccination as the best protection against measles infections,” but added that “Secretary Kennedy and HHS are committed to aggressively handling the measles outbreak with a comprehensive, all-of-the-above approach to do what we can to save lives.”

The United States has long had small groups of people who have opted out of vaccination, but in this outbreak, the first major one of Trump’s second term, the fracture between the unvaccinated and the worried well is looking especially stark. Many of the people most eager to get a shot are the ones who need it least: young, healthy individuals nowhere near a detected outbreak, who already have all the MMR doses they’ll likely ever need. Meanwhile, those who would most benefit from vaccination have been pointedly reminded that doing so is a personal decision, as Kennedy has put it—a framing that could add to the growing death toll.

Before the 1960s, when the measles vaccine became available in the U.S., the virus infected roughly 3 to 4 million Americans each year. In most cases, the disease would resolve after a few days of cough, fever, and a telltale rash. But measles can also quickly turn dangerous, causing complications as severe as pneumonia and brain swelling. Researchers have estimated that the virus can infect 90 percent of the unimmunized people it comes into close contact with, and roughly one out of every 1,000 cases will result in death. One study found that, in the era before the vaccine, up to half of all infectious-disease deaths in kids might have been caused by measles. Those who survive the disease are sometimes left with permanent brain damage; the virus can also warp the immune system, wiping out the body’s memory of past infections and vaccines, which leaves people more vulnerable to disease.

In the U.S., getting measles as a child—and risking all of those horrors—was once considered a grim matter of course. Decades of vaccination helped the U.S. eliminate measles by the year 2000 and keep the virus mostly at bay since then. But the cracks in that wall have been widening. For a virus as contagious as measles, vaccine coverage needs to remain above 95 percent to prevent outbreaks. A drop of even just a couple of percentage points in immunization can double the virus’s attack rate, Mark Jit, an epidemiologist at New York University, told me. In many parts of the U.S., that’s now a real threat: Nationwide, less than 93 percent of kindergartners were fully vaccinated against measles for the 2023–24 school year. Unvaccinated and undervaccinated people also tend to cluster; the outbreak in West Texas, for instance, has hit particularly hard in Gaines County, which is home to a Mennonite community wary of the health-care system, and which has a kindergarten-vaccination rate of just 82 percent.
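
That 95 percent figure lines up with the textbook herd-immunity threshold, commonly approximated as 1 minus 1/R0, where R0 is the average number of people one case infects in a fully susceptible population. As a rough, back-of-the-envelope illustration only (it assumes a uniformly mixed population and uses the commonly cited R0 range for measles of roughly 12 to 18, neither of which comes from this article), a few lines of Python make the arithmetic concrete:

# Rough illustration only: the textbook herd-immunity threshold, 1 - 1/R0,
# under a simple well-mixed-population assumption.
for r0 in (12, 15, 18):  # commonly cited R0 estimates for measles
    threshold = 1 - 1 / r0
    print(f"R0 = {r0}: about {threshold:.0%} of people need immunity to block sustained spread")

For measles, that works out to roughly 92 to 94 percent, which is why kindergarten coverage in the low 90s or below leaves so little margin before outbreaks become possible.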

Vaccine-uptake rates in the region appear to have risen in the weeks since the outbreak began. But several experts told me they were disturbed by the lack of strong, consistent messaging from federal leadership. In the past, outbreaks have prompted immediate vaccine advocacy from the federal government: During a 2018–19 measles outbreak clustered in an Orthodox Jewish community in New York, the CDC director, Robert Redfield, and HHS secretary, Alex Azar, issued joint statements emphasizing the importance of vaccination. Redfield called on health-care providers to “assure patients about the efficacy and safety of the measles vaccine”; Azar stressed that the safety of the shots “has been firmly established over many years in some of the largest vaccine studies ever undertaken.”

People aren’t necessarily taking their first cues from federal appointees. Studies show that health-care-provider recommendations are a major factor in people’s decisions about vaccination. But national messages can still cue local health officials and physicians to double down on their efforts, Robert Bednarczyk, an epidemiologist at Emory University, told me. And some of the most powerful health partnerships can involve community leaders with cachet among families. During the 2018–19 measles outbreak, public-health officials made “deliberate attempts to work with religious leaders,” whose recommendations would be trusted, Moss told me. Those efforts seem lacking in this current outbreak: One pastor in Seminole, Texas, told the Associated Press that he hadn’t received any direct outreach from public-health officials, and wasn’t engaging with parents in his congregation about vaccines.

A statement from HHS released this week did give some emphasis to the potency of measles shots. But it also continued to echo Kennedy’s constant praise of vitamin A as a top-line method to manage the virus, a statement now also highlighted on the CDC website. Vitamin A deficiency can worsen a case of measles that has already begun—but those deficiencies are estimated to affect less than 1 percent of Americans. Kennedy has also called out steroids and antibiotics as measles-fighting tactics, but those interventions, too, are more geared toward reducing the severity of disease once it’s already set in. “The vaccine is the only thing that stops people from getting infected,” Jit told me. And casting supplements and drugs as comparable to, or even preferable to, prevention is especially dangerous for a disease that has no cure or antivirals.

When vaccination is framed as just one option among many, parents might think twice about opting into a shot for their kid, Rupali Limaye, a health-communications scholar at George Mason University, told me. Kennedy has also framed vaccination as a personal choice, and something about which parents should “consult with their healthcare providers to understand their options”—all language, Limaye told me, that makes doing nothing an especially convenient option. “That, to me, is automatically going to lead to more morbidity and more mortality,” she said.

Meanwhile, primary-care physicians such as Sarah Turner of Lutheran Health Physicians, in Indiana, are getting frantic questions from patients who are asking whether they need boosters or early shots for their infants, not yet old enough for their first MMR. Although some people born before 1989 may have received only one dose of MMR in childhood and be eligible for another, the answer to those questions is usually no—especially if local cases haven’t been detected and a family isn’t planning to travel into an area where measles is rampant, Turner told me. Protection from the vaccine is thought to last for decades, maybe even an entire lifetime, in most people: “If you have two doses of measles vaccine, you don’t have anything to worry about,” Shelly Bolotin, an epidemiologist at the University of Toronto, told me.

These misalignments are a pattern in the U.S.’s reaction to infectious disease. During the worst days of COVID, too, politicians hawked dubious treatments for the virus, and anti-vaccine sentiment roiled in underimmunized communities, while some of the worried well sought out extra shots before it was clear they needed them. More recently, when mpox cases began to explode in the United States in 2022, a broad swath of Americans clamored for shots—even though the outbreak was, from the start, concentrated among men who have sex with men, who didn’t receive focused resources for weeks. And as H5N1 cases in the U.S. have risen over the past year, public worry has concentrated on the safety of pasteurized dairy products, rather than the real risks to dairy and poultry workers.

Measles is not a forgiving virus. It moves so quickly that it can capitalize on any defensive wobbles or holes in protection. As childhood-vaccination rates continue to lag and the nation’s leaders continue to dismiss data and undermine scientific rigor, experts worry that outbreaks such as these—and the country’s muddled responses to them—will become a deadly norm. Global rates of measles are rising, giving the virus more opportunities to slip into the United States. At the same time, the percentage of American children potentially susceptible to measles has grown in recent years, Bednarczyk’s research has shown. When more sparks hit more kindling, conflagrations will grow. Just over two months into 2025, the U.S. has already logged more than 150 measles cases—more than half of the total cases documented in all of 2024. If the U.S. has any hope of containing this crisis—and the ones that will surely follow—it’ll have to succeed at concentrating its resources on those most at risk.

Spared by DOGE—For Now

The Atlantic

www.theatlantic.com › health › archive › 2025 › 02 › epidemic-intelligence-service-doge-layoffs › 681771

Americans have plenty to worry about these days when it comes to infectious-disease outbreaks. This is the worst flu season in 15 years, there’s a serious measles outbreak roiling Texas, and the threat of bird flu isn’t going away. “The house is on fire,” Denis Nash, an epidemiologist at CUNY School of Public Health, told me. The more America is pummeled by disease, the greater the chance of widespread outbreaks and even another pandemic.

As of this week, the federal government may be less equipped to deal with these threats. Elon Musk’s efforts to shrink the federal workforce have hit public-health agencies, including the CDC, NIH, and FDA. The Trump administration has not released details on the layoffs, but the cuts appear to be more than trivial. The CDC lost an estimated 700 people, according to the Associated Press. Meanwhile, more than 1,000 NIH staffers reportedly lost their jobs.

Perhaps as notable as who was laid off is who wasn’t. The Trump administration initially seemed likely to target the CDC’s Epidemic Intelligence Service, a cohort of doctors, scientists, nurses, and even veterinarians who investigate and respond to disease outbreaks around the world. Throughout the program’s history, EIS officers have been the first line of defense against anthrax, Ebola, smallpox, polio, E. coli, and, yes, bird flu. Four recent CDC directors have been part of the program.

The layoffs were mostly based on workers’ probationary status. (Most federal employees are considered probationary in their first year or two on the job, and recently promoted staffers can also count as probationary.) EIS fellows typically serve two-year stints, which makes them probationary and thus natural targets for the most recent purge. EIS fellows told me they were bracing to be let go last Friday afternoon, but the pink slips never came. Exactly why remains unclear. In response to backlash about the planned firings, Musk posted on X on Monday that EIS is “not canceled” and that those suggesting otherwise should “stop saying bullshit.” A spokesperson for DOGE did not respond to multiple requests for comment.

This doesn’t mean EIS is safe. Both DOGE and Robert F. Kennedy Jr., Donald Trump’s newly confirmed health secretary, are just getting started. More layoffs could still be coming, and significant cuts to EIS would send a clear message that the administration does not believe that investigating infectious-disease outbreaks is a good use of tax dollars. In that way, the future of EIS is a barometer of how seriously the Trump administration takes the task of protecting public health.

Trump and his advisers have made it abundantly clear that, after the pandemic shutdowns in 2020, they want a more hands-off approach to dealing with outbreaks. Both Trump and Kennedy have repeatedly downplayed the destruction caused by COVID. But so far, the second Trump administration’s approach to public health has been confusing. Last year, Trump said he would close the White House’s pandemic office; now he is reportedly picking a highly qualified expert to lead it. The president hasn’t laid out a bird-flu plan, but amid soaring egg prices, the head of his National Economic Council recently said that the plan is coming. Kennedy has also previously said that he wants to give infectious-disease research a “break” and focus on chronic illness; in written testimony during his confirmation hearings, he claimed that he wouldn’t actually do anything to reduce America’s capacity to respond to outbreaks.

The decision to spare EIS, at least for now, only adds to the confusion. (Nor is it the sole murky aspect of the layoffs: Several USDA workers responding to bird flu were also targeted, although the USDA told me that those cuts were made in error and that it is working to “rectify the situation.”) On paper, EIS might look like a relatively inconsequential training program that would be apt for DOGEing. In reality, the program is less like a cushy internship and more like public health’s version of the CIA.

Fellows are deployed around the world to investigate, and hopefully stop, some of the world’s most dangerous pathogens. The actual work of an EIS officer varies depending on where they’re deployed, though the program’s approach is often described as “shoe-leather epidemiology”—going door to door or village to village probing the cause of an illness in the way a New York City detective might investigate a stabbing on the subway. Fellows are highly credentialed experts, but the process provides hands-on training in how to conduct an outbreak investigation, according to Nash, the CUNY professor, who took part in the program. Nash entered EIS with a Ph.D. in epidemiology, but “none of our training could prepare us for the kinds of things we would learn through EIS,” he said.

In many cases, EIS officials are on the ground investigating before most people even know there’s a potential problem. An EIS officer investigated and recorded the United States’ first COVID case back in January 2020, when the virus was still known as 2019-nCoV. It would be another month before the CDC warned that the virus would cause widespread disruption to American life.

More recently, in October, EIS officers were on the ground in Washington when the state was hit with its first human cases of bird flu, Roberto Bonaccorso, a spokesperson with the Washington State Department of Health, wrote to me. “Every single outbreak in the United States and Washington State requires deployment of our current EIS officers,” Bonaccorso said.

EIS is hardly the only tool the federal government uses to protect the country against public-health threats. Managing an outbreak requires coordination across an alphabet soup of agencies and programs; an EIS fellow may have investigated the first COVID case, but that of course didn’t stop the pandemic from happening. Other vital parts of how America responds to infectious diseases were not spared by the DOGE layoffs. Two training programs with missions similar to that of EIS were affected by the cuts, according to a CDC employee whom I agreed not to identify by name because the staffer is not authorized to talk to the press.

The DOGE website boasts of saving nearly $4 million on the National Immunization Surveys, collectively one of the nation’s key tools for tracking how many Americans, particularly children, are fully vaccinated. What those cuts will ultimately mean for the future of the surveys is unknown. A spokesperson for the research group that runs the surveys, the National Opinion Research Center, declined to comment and directed all questions to the CDC.

And more cuts to the nation’s public-health infrastructure, including EIS, could be around the corner. RFK Jr. has already warned that certain FDA workers should pack their bags. Kennedy has repeatedly claimed that public-health officials inflate the risks of infectious-disease threats to bolster their importance with the public; EIS fellows are the first responders, often on the ground before other officials have even sounded the alarm.

Ironically, the work of the EIS is poised to become especially pressing during Trump’s second term. If measles, bird flu, or any other infectious disease begins spreading through America unabated after we have fired the public-health workforce, undermined vaccines, or halted key research, it will be the job of EIS fellows to figure out what went wrong.

The Return of Snake Oil

The Atlantic

www.theatlantic.com › health › archive › 2025 › 01 › patent-medicine-supplements-rfk-trump › 681515

In a Massachusetts cellar in 1873, Lydia Pinkham first brewed the elixir that would make her famous. The dirt-brown liquid, made from herbs including black cohosh and pleurisy root, contained somewhere between 18 and 22 percent alcohol—meant as a preservative, of course. Within a couple of years, Pinkham was selling her tonic at $1 a bottle to treat “women’s weaknesses.” Got the blues? How about inflammation, falling of the womb, or painful menstruation? Lydia E. Pinkham’s Vegetable Compound was the solution. Pinkham’s matronly smile, printed on labels and advertisements, became as well known as Mona Lisa’s.

Lydia E. Pinkham’s Vegetable Compound was one of thousands of popular and lucrative patent medicines—health concoctions dreamed up by chemists, housewives, and entrepreneurs—that took the United States by storm in the 19th and early 20th centuries. These products promised to treat virtually any ailment and didn’t have to reveal their recipes. Many contained alcohol, cocaine, morphine, or other active ingredients that ranged from dubious to dangerous. Dr. Guild’s Green Mountain Asthmatic Compound was available in cigarette form and included the poisonous plant belladonna. Early versions of Wampole’s Vaginal Cones, sold as a vaginal antiseptic and deodorizer, contained picric acid, a toxic compound used as an explosive during World War I. Patent-medicine advertisements were unavoidable; by the 1870s, 25 percent of all advertising was for patent medicines.

After the Pure Food and Drug Act was passed in 1906, the newly created Food and Drug Administration cracked down on miracle elixirs. But one American industry is still keeping the spirit of patent medicine alive: dietary supplements. In the U.S., vitamins, botanicals, and other supplements are minimally regulated. Some can improve people’s health or address specific conditions, but many, like the medicines of old, contain untested or dangerous ingredients. Nevertheless, three-quarters of Americans take at least one. Some take far more. Robert F. Kennedy Jr., the longtime conspiracy theorist and anti-vaccine activist who’s awaiting Senate confirmation to run the Department of Health and Human Services, has said he takes a “fistful” of vitamins each day. Kennedy has in recent years championed dietary supplements and decried their “suppression” by the FDA—an agency he would oversee as health secretary. Now he’s poised to bring America’s ever-growing supplement enthusiasm to the White House and supercharge the patent-medicine revival.  

The newly created FDA eventually required all pharmaceutical drugs—substances intended for use in the diagnosis, cure, mitigation, treatment, or prevention of disease—to be demonstrably safe and effective before they could be sold. But dietary supplements, as we call them now, were never subject to that degree of scrutiny. Vitamins were sold with little interference until the “megadosing” trend of the late 1970s and ’80s, which began after the chemist Linus Pauling started claiming that large amounts of vitamin C could stave off cancer and other diseases. The FDA announced its intention to regulate vitamins, but the public (and the supplement industry) revolted. Mel Gibson starred in a television ad in which he was arrested at home for having a bottle of Vitamin C, and more than 2.5 million people participated in a “Save Our Supplements” letter-writing campaign. Congress stepped in, passing the 1994 Dietary Supplement Health and Education Act, which officially exempted dietary supplements from the regulations that medications are subject to.

Since then, the FDA has generally not been responsible for any premarket review of dietary supplements, and manufacturers have not usually had to reveal their ingredients. “It’s basically an honor system where manufacturers need to declare that their products are safe,” says S. Bryn Austin, a social epidemiologist and behavioral scientist at the Harvard T. H. Chan School of Public Health. The agency will get involved only if something goes wrong after the supplement starts being sold. As long as they disclose that the FDA hasn’t evaluated their claims, and as long as those claims don’t involve disease, supplement makers can say that their product will do anything to the structure or function of the body. You can say that a supplement improves cognition, for example, but not that it treats ADHD. These claims don’t have to be supported with any evidence in humans, animals, or petri dishes.

In 1994, the dietary-supplement industry was valued at $4 billion. By 2020, it had ballooned to $40 billion. Patent-medicine creators once toured their products in traveling medicine shows and made trading cards that people collected, exchanged, and pasted into scrapbooks; today, supplement companies sponsor popular podcasts, Instagram stories are overrun with supplement ads, and influencers make millions selling their own branded supplements. The combination of modern wellness culture with lax regulations has left Americans with 19th-century-like problems: Pieter Cohen, an associate professor of medicine at Cambridge Health Alliance, has found a methamphetamine analogue in a workout supplement, and omberacetam, a Russian drug for traumatic brain injuries and mood disorders, in a product marketed to help with memory.

Last year, Kennedy accused the FDA of suppressing vitamins and other alternative health products that fall into the dietary-supplement category. But “there is no truth about the FDA being at war on supplements over the last several decades,” Cohen told me. “In fact, they have taken an extremely passive, inactive approach.” Experts have repeatedly argued that the FDA needs more authority to investigate and act on supplements, not less. And yet, Kennedy continues to champion the industry. He told the podcaster Lex Fridman that he takes so many vitamins, “I couldn’t even remember them all.” Kennedy has vocally opposed additives in food and conflicts of interest in the pharmaceutical industry, but has failed to mention the dangerous additives in dietary supplements and the profits to be made in the supplement market. (Neither Kennedy nor a representative from the MAHA PAC responded to a request for comment.)

In an already permissive environment, Kennedy’s confirmation could signal to supplement manufacturers that anything goes, Cohen said. If the little regulation that the FDA is responsible for now—surveilling supplements after they’re on the market—lapses, more adulterated and mislabeled supplements could line store shelves. And Americans might well pour even more of our money into the industry, egged on by the wellness influencer charged with protecting our health and loudly warning that most of our food and drug supply is harmful. Kennedy might even try to get in on the supplement rush himself. Yesterday, The Washington Post reported that, according to documents filed to the U.S. Patent and Trademark Office, Kennedy applied to trademark MAHA last year, which would allow him to sell, among other things, MAHA-branded supplements and vitamins. (He transferred ownership of the application to an LLC in December. Kennedy’s team did not respond to the Post.)

A truly unleashed supplement industry would have plenty of tools at its disposal with which to seduce customers. Austin studies dietary supplements that make claims related to weight loss, muscle building, “cleansing,” and detoxing, many of which are marketed to not just adults, but teenagers too. “Those types of products, in particular, play on people’s insecurities,” she told me. They also purport to ease common forms of bodily or mental distress that can’t be quickly addressed by traditional medical care. Reducing stress is hard, but ordering the latest cortisol-reducing gummy on TikTok Shop is easy. Your doctor can’t force vegetables into your diet, but a monthly subscription of powdered greens can.

Judy Z. Segal, a professor emerita at the University of British Columbia who has analyzed patent-medicine trading cards from the 19th and 20th centuries, told me that supplement-marketing strategies “have not changed that much since the patent-medicine era.” Patent medicines appealed to ambient, relatable complaints; one ad for Burdock’s Blood Bitters asserted that there were “thousands of females in America who suffer untold miseries from chronic diseases common to their sex.” And the makers of patent medicine, like many modern supplement companies, used friendly spokespeople and customer testimonials while positioning their products as preventive care; according to one ad for Hartshorn’s Sarsaparilla, “The first deviation from perfect health should receive attention.”

In 1905, the muckraker Samuel Hopkins Adams lamented that “gullible America” was so eager to “swallow huge quantities of alcohol, an appalling amount of opiates and narcotics, a wide assortment of varied drugs ranging from powerful and dangerous heart depressants to insidious liver stimulants; and, far in excess of all other ingredients, undiluted fraud.” Compounds and elixirs go by different names now—nootropics, detoxes, adaptogens—but if Adams walked down any supplement aisle or browsed Amazon, he’d still find plenty of cure-alls. He could even pick up a bottle of Lydia E. Pinkham’s Herbal Supplement, which is sold as an aid for menstruation and menopause. Pinkham’s face smiles at buyers from the label, though its advertised benefits are now accompanied by a tiny disclaimer: “This statement has not been evaluated by the FDA.”

America Wouldn’t Know the Worst of a Vaccine Decline Until It’s Too Late

The Atlantic

www.theatlantic.com › health › archive › 2025 › 01 › rfk-jr-vaccine-decline › 681489

Becoming a public-health expert means learning how to envision humanity’s worst-case scenarios for infectious disease. For decades, though, no one in the U.S. has had to consider the full danger of some of history’s most devastating pathogens. Widespread vaccination has eliminated several diseases—among them, measles, polio, and rubella—from the country, and kept more than a dozen others under control. But in the past few years, as childhood-vaccination rates have dipped nationwide, some of infectious disease’s ugliest hypotheticals have started to seem once again plausible.

The new Trump administration has only made the outlook more tenuous. Should Robert F. Kennedy Jr., one of the nation’s most prominent anti-vaccine activists, be confirmed as the next secretary of Health and Human Services, for instance, his actions could make a future in which diseases resurge in America that much more likely. His new position would grant him substantial power over the FDA and the CDC, and he is reportedly weighing plans—including one to axe a key vaccine advisory committee—that could prompt health-care providers to offer fewer shots to kids, and inspire states to repeal mandates for immunizations in schools. (Kennedy’s press team did not respond to a request for comment.)

Kennedy’s goal, as he has said, is to offer people more choice, and many Americans likely would still enthusiastically seek out vaccines. Most Americans support childhood vaccination and vaccine requirements for schools, though a KFF poll released today found that, just in the past year, the proportion of parents who say they skipped or delayed shots for their children has risen to one in six. The more individuals who choose to eschew vaccination, the closer those decisions would bring society’s collective defenses to cracking. The most visceral effects might not be obvious right away. For some viruses and bacteria to break through, the country’s immunization rates may need to slip quite a bit. But for others, the gap between no outbreak and outbreak is uncomfortably small. The dozen experts I spoke with for this story were confident in their pessimism about how rapidly epidemics might begin.

[Read: How America’s fire wall against disease starts to fail]

Paul Offit, a pediatrician at Children’s Hospital of Philadelphia and co-inventor of one of the two rotavirus vaccines available in the U.S., needs only to look at his own family to see the potential consequences. His parents were born into the era of the deadly airway disease diphtheria; he himself had measles, mumps, rubella, and chickenpox, and risked contracting polio. Vaccination meant that his own kids didn’t have to deal with any of these diseases. But were immunization rates to fall too far, his children’s children very well could. Unlike past outbreaks, those future epidemics would sweep across a country that, having been free of these diseases for so long, is no longer equipped to fight them.

“Yeah,” Offit said when I asked him to paint a portrait of a less vaccinated United States. “Let’s go into the abyss.”

Should vaccination rates drop across the board, one of the first diseases to be resurrected would almost certainly be measles. Experts widely regard the viral illness, which spreads through the air, as the most infectious known pathogen. Before the measles vaccine became available in 1963, the virus struck an estimated 3 million to 4 million Americans each year, about 1,000 of whom would suffer serious swelling of the brain and roughly 400 to 500 of whom would die. Many survivors had permanent brain damage. Measles can also suppress the immune system for years, leaving people susceptible to other infections.

Vaccination was key to ridding the U.S. of measles, declared eliminated here in 2000. And very high rates of immunity—about 95 percent vaccine coverage, experts estimate—are necessary to keep the virus out. “Just a slight dip in that is enough to start spurring outbreaks,” Boghuma Kabisen Titanji, an infectious-disease physician at Emory University, told me. Which has been exactly the case. Measles outbreaks do still occur in American communities where vaccination rates are particularly low, and as more kids have missed their MMR shots in recent years, the virus has found those openings. The 16 measles outbreaks documented in the U.S. in 2024 made last year one of the country’s worst for measles since the turn of the millennium.

But for all measles’ speed, “I would place a bet on whooping cough being first,” Samuel Scarpino, an infectious-disease modeler at Northeastern University, told me. The bacterial disease can trigger months of coughing fits violent enough to fracture ribs. Its severest consequences include pneumonia, convulsions, and brain damage. Although slower to transmit than measles, it has never been eliminated from the U.S., so it’s poised for rampant spread. Chickenpox poses a similar problem. Although corralled by an effective vaccine in the 1990s, the highly contagious virus still percolates at low levels through the country. Plenty of today’s parents might still remember the itchy blisters it causes as a rite of passage, but the disease’s rarer complications can be as serious as sepsis, uncontrolled bleeding, and bacterial infections known as “flesh-eating disease.” And the disease is much more serious in older adults.

Those are only some of the diseases the U.S. could have to deal with. Kids who get all of the vaccines routinely recommended in childhood are protected against 16 diseases—each of which would have some probability of making a substantial comeback, should uptake keep faltering. Perhaps rubella would return, infecting pregnant women, whose children could be born blind or with heart defects. Maybe meningococcal disease, pneumococcal disease, or Haemophilus influenzae disease, each caused by bacteria commonly found in the airway, would skyrocket, and with them rates of meningitis and pneumonia. The typical ailments of childhood—day-care colds, strep throat, winter norovirus waves—would be joined by less familiar and often far more terrifying problems: the painful, swollen necks of mumps; the parching diarrhea of rotavirus; the convulsions of tetanus. For far too many of these illnesses, “the only protection we have,” Stanley Plotkin, a vaccine expert and one of the developers of the rubella vaccine, told me, “is a vaccine.”

Exactly how and when outbreaks of these various diseases could play out—if they do at all—is impossible to predict. Vaccination rates likely wouldn’t fall uniformly across geographies and demographics. They also wouldn’t decrease linearly, or even quickly. People might more readily refuse vaccines that were developed more recently and have been politicized (think HPV or COVID shots). And existing immunity could, for a time, still buffer against an infectious deluge, especially from pathogens that remain quite rare globally. Polio, for instance, would be harder than measles to reestablish in the United States: It was declared eliminated from the Americas in the 1990s, and remains endemic to only two countries. This could lead to a false impression that declining vaccination rates have little impact.

A drop in vaccination rates, after all, doesn’t guarantee an outbreak—a pathogen must first find a vulnerable population. This type of chance meeting could take years. Then again, infiltrations might not take long in a world interconnected by travel. The population of this country is also more susceptible to disease than it has been in past decades. Americans are, on average, older; obesity rates are at a historical high. The advent of organ transplants and cancer treatments has meant that a substantial sector of the population is immunocompromised; many other Americans are chronically ill. Some of these individuals don’t mount protective responses to vaccinations at all, which leaves them reliant on immunity in others to keep dangerous diseases at bay.

If various viruses and bacteria began to recirculate in earnest, the chance of falling ill would increase even for healthy, vaccinated adults. Vaccines don’t offer comprehensive or permanent protection, and the more of a pathogen that is circulating, the greater its chance of breaking through any one person’s defenses. Immunity against mumps and whooping cough is incomplete, and known to wane in the years after vaccination. And although immunity generated by the measles vaccine is generally thought to be quite durable, experts can’t say for certain how durable, Bill Hanage, an infectious-disease epidemiologist at Harvard’s School of Public Health, told me: The only true measure would be to watch the virus tear through a population that hasn’t dealt with it in decades.

Perhaps the most unsettling feature of a less vaccinated future, though, is how unprepared the U.S. is to confront a resurgence of pathogens. Most health-care providers in the country no longer have the practical knowledge to diagnose and treat diseases such as measles and polio, Kathryn Edwards, a pediatrician at Vanderbilt University, told me: They haven’t needed it. Many pediatricians have never even seen chickenpox outside of a textbook.

To catch up, health-care providers would need to familiarize themselves with signs and symptoms they may have seen only in old textbooks or in photographs. Hospitals would need to use diagnostic tests that haven’t been routine in years. Some of those tools might be woefully out of date, because pathogens have evolved; antibiotic resistance could also make certain bacterial infections more difficult to expunge than in decades prior. And some protocols may feel counterintuitive, Offit said: The ultra-contagiousness of measles could warrant kids with milder cases being kept out of health-care settings, and kids with Haemophilus influenzae might need to be transported to the hospital without an ambulance, to minimize the chances that the stress and cacophony would trigger a potentially lethal spasm.

[Read: Here’s how we know RFK Jr. is wrong about vaccines]

The learning curve would be steep, Titanji said, stymieing care for the sick. The pediatric workforce, already shrinking, might struggle to meet the onslaught, leaving kids—the most likely victims of future outbreaks—particularly susceptible, Sallie Permar, the chief pediatrician at NewYork–Presbyterian/Weill Cornell Medical Center, told me. If already overstretched health-care workers were further burdened, they’d be more likely to miss infections early on, making those cases more difficult to treat. And if epidemiologists had to keep tabs on more pathogens, they’d have less capacity to track any single infectious disease, making it easier for one to silently spread.

The larger outbreaks grow, the more difficult they are to contain. Eventually, measles could once again become endemic in the U.S. Polio could soon follow suit, imperiling the fight to eradicate the disease globally, Virginia Pitzer, an infectious-disease epidemiologist at Yale, told me. In a dire scenario—the deepest depths of the abyss—average lifespans in the U.S. could decline, as older people more often fall sick, and more children under 5 die. Rebottling many of these diseases would be a monumental task. Measles was brought to heel in the U.S. only by decades of near-comprehensive vaccination; re-eliminating it from the country would require the same. But the job this time would be different, and arguably harder—not merely coaxing people into accepting a new vaccine, but persuading them to take one that they’ve opted out of.

That future is by no means guaranteed—especially if Americans recall what is at stake. Many people in this country are too young to remember the cost these diseases exacted. But Edwards, who has been a pediatrician for 50 years, is not. As a young girl, she watched a childhood acquaintance be disabled by polio. She still vividly recalls patients she lost to meningitis decades ago. The later stages of her career have involved fewer spinal taps, fewer amputations. Because of vaccines, the job of caring for children, nowadays, simply involves far less death.