Itemoids

Yale

January 6 Still Happened

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 02 › january-6-trump-history › 681647

A month after the January 6 insurrection, a page appeared on the Justice Department’s website naming the defendants charged for their alleged role in the Capitol riot. The list remained in place over the next four years, ballooning as the department brought charges against hundreds of people. Then, shortly after Donald Trump’s second inauguration, it vanished.

Trump has seized on his reelection as an opportunity to rewrite the story of January 6. Just hours after he assumed the presidency, he granted pardons and commutations to the insurrectionists who broke into the Capitol, calling their prosecutions a “grave national injustice.” The deletion of the Justice Department’s page on January 6 is a triumph for the insurrectionists whose crimes were erased, and for Republicans more generally, many of whom would simply rather not talk about the late unpleasantness.

But in the long term, the truth of what happened will prove difficult to bury. The roughly six hours during which rioters breached the Capitol are some of the most exhaustively documented in recent history, thanks to the many participants who filmed themselves in action and the investigative efforts of both the Justice Department and the House January 6 committee. Even Trump can’t wipe that away.

[Read: Republican leaders once thought January 6 was ‘tragic’]

Trump’s proclamation announcing the January 6 pardons portrays the grant of clemency as the beginning of a “process of national reconciliation,” a parody of the typical language of presidential mercy. He has demanded that the Justice Department drop all ongoing investigations into rioters not yet charged and placed the office that had carried out those prosecutions under the control of Ed Martin, a former “Stop the Steal” organizer who himself tweeted that he was at the Capitol the day of the insurrection. Days into Trump’s second term, when the president attended a rally in Las Vegas, standing behind him was Stewart Rhodes—the leader of the Oath Keepers and a prominent presence on January 6, who had just received a commutation of his 18-year sentence for seditious conspiracy.

In this upside-down version of January 6, the prosecutions were the crime, not the coup attempt. And, despite Trump’s smug assertion of “reconciliation,” his administration is now retaliating against the civil servants who played a role in prosecuting the insurrectionists. The Justice Department has fired 15 low-level prosecutors who worked on the January 6 cases, along with officials assigned to Special Counsel Jack Smith’s investigations of Trump. Thousands of FBI employees who worked on the January 6 investigations—by many metrics, the largest investigative effort in the bureau’s history—are also waiting to discover whether they, too, will be purged.

The disappearance of the Justice Department’s page on the insurrection, which had expanded to include not just information on defendants and charges but also a growing list of convictions and criminal sentences, was a particularly blunt metaphor for this erasure of history. On January 27, the page was replaced with a “Page not found” message. “This is a huge victory for J6ers,” Brandon Straka, a pro-Trump social-media influencer who himself received a pardon for his role on January 6, wrote on X. “This site was one of countless weapons of harassment used by the federal government to make life impossible for its targets from J6.”

[From the November 2023 issue: The patriot]

This is the politics of forgetting, and the United States is no stranger to it. David Blight, an American-history professor at Yale, has argued that January 6 is a novel twist on the “Lost Cause”—the Confederate narrative of noble sacrifice that fueled successful white resistance to multiracial democracy in the years after the Civil War. The original Lost Cause strengthened into a racist political force over decades. When I reached out to Blight to discuss the comparison, he seemed unnerved by how quickly the memory of January 6 had shifted toward revisionism. “We’re in an unusual moment where evidence doesn’t seem to make any difference,” he told me. “It’s in that world that January 6 is being processed as a historical marker.”

But that evidence does still exist. And among the dissenters to this enforced forgetting are the people who have spent the most time reviewing it: the judges. Trump’s actions “will not change the truth of what happened on January 6, 2021,” Judge Colleen Kollar-Kotelly wrote, reluctantly acknowledging that she had no ability to block the Justice Department from tossing out a January 6 defendant’s case. “What occurred that day is preserved for the future through thousands of contemporaneous videos, transcripts of trials, jury verdicts, and judicial opinions analyzing and recounting the evidence through a neutral lens.”

And it will be preserved, because those documents are not under the control of the Trump administration. PACER, the public system used by the federal courts to file legal documents, is run by the judicial branch. There is no mechanism through which a vengeful president can bar access to the thousands upon thousands of pages of court filings docketed in the hundreds and hundreds of charged January 6 cases—including the case against Trump. Days after the Justice Department deleted the database of rioters from its website, one judge in a January 6 case used his order dismissing the charges to memorialize the same resource that the department had scrubbed, attaching the almost 140-page spreadsheet as an appendix. That material is all publicly available, and it is not going away, whatever Trump says about injustice or reconciliation.

The same is true of the House January 6 committee’s work—the hours of hearings convened and broadcast as well as the nearly 900-page final report and its extensive compilation of evidence and depositions. It’s all freely accessible on the website of the Government Publishing Office, a legislative agency, and easily downloaded by anyone who wants to keep ahold of it. Outside the government, ProPublica retains an extensive database of videos posted to the defunct social-media platform Parler by rioters on January 6, documenting the siege of the Capitol minute by minute. NPR hosts a database of defendants that reproduces the information the Justice Department tried to delete.

There are more guerrilla-style efforts at archiving, too. As the NBC reporter Ryan J. Reilly has documented, the January 6 investigation was shaped by the volunteer efforts of ordinary people who mobilized online to sort through video and social-media posts and send tips to the FBI. Now some of that same energy has turned toward preserving the record. I spoke with one person who participated in those early crowdsourcing projects and is now maintaining a network of bare-bones websites where visitors can access a range of January 6 material, including court documents and videos posted on social media. Another collective is working to save video evidence on Archive.org and download the full spread of court documents. At the time I reached out, this group estimated that about 50,000 pages had been preserved this way so far.

[Read: The January 6er who left Trumpism]

Struggles over historical memory are ultimately “about the power of the story and who gets to control it,” Blight told me, rather than the strength of the facts. And for those who were attacked or threatened on January 6, or who have faced attacks since for their efforts to uncover what happened and bring the perpetrators to justice, the sudden revision of the story without regard for facts has done its own damage. “I get so many messages, ‘Harry, you’re a hero.’ I don’t want to be a hero,” Harry Dunn, a Capitol Police officer who protected Congress on January 6, told The New York Times. “I want accountability.”

Still, the existence of a robust historical record can eventually make a difference. Blight pointed to the white-supremacist coup in Wilmington, North Carolina, in 1898, the true violence of which was ignored for nearly a century until scholars began looking through the archive and publishing their findings. Today, it is widely recognized for what it was: a successful assault on multiracial democracy, carried out by a violent mob.

“You can almost predict that with that kind of evidence, as long as it’s not suppressed or destroyed,” historians will one day be able to tell the truth of what happened, Blight said. As the canard goes, history may be written by the victors. But in the long term—perhaps the very long term—it is also written by the people who kept the documents.

What Will Happen If the Trump Administration Defies a Court Order?

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 02 › legal-analysis-trump-ignores-court › 681672


Throughout everything that happened during Donald Trump’s first term in office—the abuses of executive power, the impeachments, the attack on the U.S. Capitol on January 6, 2021—the administration never outright defied an order of the court. Now, less than a month into Trump’s second term, the president and those around him seem to be talking themselves into crossing that line.

The crisis began—where else?—on X, where the administration’s unelected chancellor Elon Musk began spitefully posting about a court order limiting the ability of his aides to rampage through sensitive payment systems at the Treasury Department. Within the locked, echoing room of the X algorithm, Musk’s outrage bounced among far-right influencers and sympathetic members of the legal academy until it found the ear of Vice President J. D. Vance, who posted on Sunday: “Judges aren’t allowed to control the executive’s legitimate power.”

Vance’s post was carefully ambiguous. The vice president didn’t say outright that the administration would defy a court order, but he hinted at it by implicitly raising the question of just who determines what constitutes a legitimate use of executive authority. Is it the executive branch itself, or the courts? Since the Supreme Court handed down Marbury v. Madison in 1803, the answer has emphatically been the latter. But if the Trump administration decides that the president himself—or Elon Musk—gets to choose whether to obey the courts, then the country may cross into dangerous and unknown territory. Legal scholars can’t agree on just what defines a constitutional crisis, but pretty much everyone would recognize intentional executive defiance of a court order as one.

[Read: Trump signals he might ignore the courts]

The good news, such as it is, is that the administration doesn’t yet seem to have taken the plunge. The bad news is that this seems like a live possibility, and nobody really knows what will happen if it does. To some extent, there is a road map—but beyond that, not so much.

Already, the cascade of litigation against Trump’s executive actions has resulted in several instances in which courts have scolded the administration for noncompliance. Most notably, almost two weeks after Judge John McConnell ordered the administration to halt its broad freeze of trillions of dollars in federal funds, 22 Democratic attorneys general filed a motion to enforce compliance with the order, alerting the court that funding for many state programs remained halted. The Justice Department responded that it had abided by its own, narrower reading of the temporary restraining order. Judge McConnell swiftly issued another order declaring the federal government to have violated the terms of his initial ruling, demanding that it comply with the more expansive reading of his order going forward—and hinting at the possibility of legal penalties if the administration defied him.

I will admit to watching these proceedings unfold with a pit in my stomach, waiting for Musk, Vance, and Trump to spin themselves up into outright disobedience. So far, though, that hasn’t happened. Instead, the Justice Department appealed Judge McConnell’s order to the U.S. Court of Appeals for the First Circuit—which, despite some procedural oddities, is the normal rule-of-law process for when the government doesn’t like a court order and wants to change it. (The First Circuit denied the appeal.) An ongoing scuffle over whether a certain stream of FEMA funding could be turned off under Judge McConnell’s order has not so far resulted in J. D. Vance tweeting “Come and take it!” Rather, the Justice Department filed requests for clarification from the court about the scope of the order, which the judge provided. Likewise, rather than just disobeying the temporary restraining order that bothered Musk so greatly concerning access to Treasury systems, the Justice Department requested and received a limited carve-out from the court.

None of this is good, but it’s not outright defiance. As the legal journalist Chris Geidner has written, “DOJ lawyers do appear to be seeking a way to advance Trump’s claims in courts while trying to then implement courts’ orders if and when those claims fail.” It’s important that these cases are being litigated by Justice Department attorneys who don’t want to get in trouble with the courts or legal bar authorities for lying or disobeying an order, and have strong incentives to play by the rules. Elon Musk may not care, but lawyers need to worry about their ability to practice law—under future administrations as well.

But what happens if the Trump team decides to push things further? Take the funding-freeze case again—if the standoff continued, the judge might convene a hearing, or plaintiffs could push for one, to determine why the court shouldn’t hold the government in contempt. What then?

Federal courts have broad powers to hold those who defy their orders in contempt. This can take the form of financial penalties or even incarceration, either to strong-arm the contemner into compliance or to punish them for noncompliance after the fact. Those financial penalties can be steep. In one extreme case, disclosed in 2014, the Foreign Intelligence Surveillance Court hinted at its willingness to impose fines of $250,000 a day on Yahoo for noncompliance with a government surveillance program—an amount that would have doubled every week, quickly bankrupting the company. (Yahoo complied.)

[Peter M. Shane: Presidents may not unilaterally dismantle government agencies]

Yet a broad survey of litigation by Nicholas R. Parrillo, a law professor at Yale, reveals that federal courts have in many cases been reluctant to turn the screws when the federal government itself is the party that might be held in contempt. Instead, Parrillo writes, courts have tended to wield the threat of contempt—relying on the norm that executive officials generally don’t want to be found in violation of a court order. But that norm is exactly what Trump and those around him are now toying with trying to erode.

If a court did try to levy sanctions against a defiant official or agency, that would also bring up the question of who would enforce them. The agency responsible for judicial enforcement is the U.S. Marshals Service, which is under the control of the Justice Department. By statute, marshals are required to carry out court orders. But while we’re spinning out hypotheticals, what would happen if Attorney General Pam Bondi, or Trump himself, ordered them not to comply?

The answer to that question lies outside the courtroom. It is located instead in the halls of Congress, the pages of newspapers, the boardrooms of businesses and civil-society organizations, and finally the streets. It’s not a struggle that can be resolved by law itself, but rather by whether Americans care enough to demonstrate as a polity that the rule of law matters to them and that they will defend it.

Why the COVID Deniers Won

The Atlantic

www.theatlantic.com › magazine › archive › 2025 › 03 › covid-deniers-anti-vax-public-health-politics-polarization › 681435

Five years ago, the coronavirus pandemic struck a bitterly divided society. Americans first diverged over how dangerous the disease was: just a flu (as President Donald Trump repeatedly insisted) or something much deadlier. Then they disputed public-health measures such as lockdowns and masking; a majority complied while a passionate minority fiercely resisted. Finally, they split—and have remained split—over the value and safety of COVID‑19 vaccines. Anti-vaccine beliefs started on the fringe, but they spread to the point where Ron DeSantis, the governor of the country’s third-most-populous state, launched a campaign for president on an appeal to anti-vaccine ideology.

Five years later, one side has seemingly triumphed. The winner is not the side that initially prevailed, the side of public safety. The winner is the side that minimized the disease, then rejected public-health measures to prevent its spread, and finally refused the vaccines designed to protect against its worst effects.

[David A. Graham: The noisy minority]

Ahead of COVID’s fifth anniversary, Trump, as president-elect, nominated the country’s most outspoken vaccination opponent to head the Department of Health and Human Services. He chose a proponent of the debunked and discredited vaccines-cause-autism claim to lead the CDC. He named a strident critic of COVID‑vaccine mandates to lead the FDA. For surgeon general, he picked a believer in hydroxychloroquine, the disproven COVID‑19 remedy. His pick for director of the National Institutes of Health had advocated for letting COVID spread unchecked to encourage herd immunity. Despite having fast-tracked the development of the vaccines as president, Trump has himself trafficked in many forms of COVID‑19 denial, and has expressed his own suspicions that childhood vaccination against measles and mumps is a cause of autism.

The ascendancy of the anti-vaxxers may ultimately prove fleeting. But if the forces of science and health are to stage a comeback, it’s important to understand why those forces have gone into eclipse.

From March 2020 to February 2022, about 1 million Americans died of COVID-19. Many of those deaths occurred after vaccines became available. If every adult in the United States had received two doses of a COVID vaccine by early 2022, rather than just the 64 percent of adults who had, nearly 320,000 lives would have been saved.

[From the January/February 2021 issue: Ed Yong on how science beat the virus]

Why did so many Americans resist vaccines? Perhaps the biggest reason was that the pandemic coincided with a presidential-election year, and Trump instantly recognized the crisis as a threat to his chances for reelection. He responded by denying the seriousness of the pandemic, promising that the disease would rapidly disappear on its own, and promoting quack cures.

The COVID‑19 vaccines were developed while Trump was president. They could have been advertised as a Trump achievement. But by the time they became widely available, Trump was out of office. His supporters had already made up their minds to distrust the public-health authorities that promoted the vaccines. Now they had an additional incentive: Any benefit from vaccination would redound to Trump’s successor, Joe Biden. Vaccine rejection became a badge of group loyalty, one that ultimately cost many lives.

A summer 2023 study by Yale researchers of voters in Florida and Ohio found that during the early phase of the pandemic, self-identified Republicans died at only a slightly higher rate than self-identified Democrats in the same age range. But once vaccines were introduced, Republicans became much more likely to die than Democrats. In the spring of 2021, the excess-death rate among Florida and Ohio Republicans was 43 percent higher than among Florida and Ohio Democrats in the same age range. By the late winter of 2023, the 300-odd most pro-Trump counties in the country had a COVID‑19 death rate more than two and a half times higher than the 300 or so most anti-Trump counties.

In 2016, Trump had boasted that he could shoot a man on Fifth Avenue and not lose any votes. In 2021 and 2022, his most fervent supporters risked death to prove their loyalty to Trump and his cause.

Why did political fidelity express itself in such self-harming ways?

The onset of the pandemic was an unusually confusing and disorienting event. Some people who got COVID died. Others lived. Some suffered only mild symptoms. Others spent weeks on ventilators, or emerged with long COVID and never fully recovered. Some lost businesses built over a lifetime. Others refinanced their homes with 2 percent interest rates and banked the savings.

We live in an impersonal universe, indifferent to our hopes and wishes, subject to extreme randomness. We don’t like this at all. We crave satisfying explanations. We want to believe that somebody is in control, even if it’s somebody we don’t like. At least that way, we can blame bad events on bad people. This is the eternal appeal of conspiracy theories. How did this happen? Somebody must have done it—but who? And why?

Compounding the disorientation, the coronavirus outbreak was a rapidly changing story. The scientists who researched COVID‑19 knew more in April 2020 than they did in February; more in August than in April; more in 2021 than in 2020; more in 2022 than in 2021. The official advice kept changing: Stay inside—no, go outside. Wash your hands—no, mask your face. Some Americans appreciated and accepted that knowledge improves over time, that more will be known about a new disease in month two than in month one. But not all Americans saw the world that way. They mistrusted the idea of knowledge as a developing process. Such Americans wondered: Were they lying before? Or are they lying now?

In a different era, Americans might have deferred more to medical authority. The internet has upended old ideas of what should count as authority and who possesses it.

The pandemic reduced normal human interactions. Severed from one another, Americans deepened their parasocial attachment to social-media platforms, which foment alienation and rage. Hundreds of thousands of people plunged into an alternate mental universe during COVID‑19 lockdowns. When their doors reopened, the mania did not recede. Conspiracies and mistrust of the establishment—never strangers to the American mind—had been nourished, and they grew.

The experts themselves contributed to this loss of trust.

It’s now agreed that we had little to fear from going outside in dispersed groups. But that was not the state of knowledge in the spring of 2020. At the time, medical experts insisted that any kind of mass outdoor event must be sacrificed to the imperatives of the emergency. In mid-March 2020, state and local authorities shut down some of Florida’s beaches. In California, surfers faced heavy fines for venturing into the ocean. Even the COVID‑skeptical Trump White House reluctantly canceled the April 2020 Easter-egg roll.

And then the experts abruptly reversed themselves. When George Floyd was murdered by a Minneapolis police officer on May 25, 2020, hundreds of thousands of Americans left their homes to protest, defying three months of urgings to avoid large gatherings of all kinds, outdoor as well as indoor.

On May 29, the American Public Health Association issued a statement that proclaimed racism a public-health crisis while conspicuously refusing to condemn the sudden defiance of public-safety rules.

The next few weeks saw the largest mass protests in recent U.S. history. Approximately 15 million to 26 million people attended outdoor Black Lives Matter events in June 2020, according to a series of reputable polls. Few, if any, scientists or doctors scolded the attendees—and many politicians joined the protests, including future Vice President Kamala Harris. It all raised a suspicion: Maybe the authorities were making the rules based on politics, not science.

The politicization of health advice became even more consequential as the summer of 2020 ended. Most American public schools had closed in March. “At their peak,” Education Week reported, “the closures affected at least 55.1 million students in 124,000 U.S. public and private schools.” By September, it was already apparent that COVID‑19 posed relatively little risk to children and teenagers, and that remote learning did not work. At the same time, returning to the classroom before vaccines were available could pose some risk to teachers’ health—and possibly also to the health of the adults to whom the children returned after school.

[David Frum: I moved to Canada during the pandemic]

How to balance these concerns given the imperfect information? Liberal states decided in favor of the teachers. In California, the majority of students did not return to in-person learning until the fall of 2021. New Jersey kept many of its public schools closed until then as well. Similar things happened in many other states: Illinois, Maryland, New York, and so on, through the states that voted Democratic in November 2020.

Florida, by contrast, reopened most schools in the fall of 2020. Texas soon followed, as did most other Republican-governed states. The COVID risk for students, it turned out, was minimal: According to a 2021 CDC study, less than 1 percent of Florida students contracted COVID-19 in school settings from August to December 2020 after their state restarted in-person learning. Over the 2020–21 school year, students in states that voted for Trump in the 2020 election got an average of almost twice as much in-person instruction as students in states that voted for Biden.

Any risks to teachers and school staff could have been mitigated by the universal vaccination of those groups. But deep into the fall of 2021, thousands of blue-state teachers and staff resisted vaccine mandates—including more than 5,000 in Chicago alone. By then, another school year had been interrupted by closures.

By disparaging public-health methods and discrediting vaccines, the COVID‑19 minimizers cost hundreds of thousands of people their lives. By keeping schools closed longer than absolutely necessary, the COVID maximizers hazarded the futures of young Americans.

Students from poor and troubled families, in particular, will continue to pay the cost of these learning losses for years to come. Even in liberal states, many private schools reopened for in-person instruction in the fall of 2020. The affluent and the connected could buy their children a continuing education unavailable to those who depended on public schools. Many lower-income students did not return to the classroom: Throughout the 2022–23 school year, poorer school districts reported much higher absenteeism rates than were seen before the pandemic.

Teens absent from school typically get into trouble in ways that are even more damaging than the loss of math or reading skills. New York City arrested 25 percent more minors for serious crimes in 2024 than in 2018. The national trend was similar, if less stark. The FBI reports that although crime in general declined in 2023 compared with 2022, crimes by minors rose by nearly 10 percent.

People who finish schooling during a recession tend to do worse even into middle age than those who finish in times of prosperity. They are less likely to marry, less likely to have children, and more likely to die early. The disparity between those who finish in lucky years and those who finish in unlucky years is greatest for people with the least formal education.

Will the harms of COVID prove equally enduring? We won’t know for some time. But if past experience holds, the COVID‑19 years will mark their most vulnerable victims for decades.

The story of COVID can be told as one of shocks and disturbances that wrecked two presidencies. In 2020 and 2024, incumbent administrations lost elections back-to-back, something that hadn’t happened since the deep economic depression of the late 1880s and early 1890s. The pandemic caused a recession as steep as any in U.S. history. The aftermath saw the worst inflation in half a century.

In the three years from January 2020 through December 2022, Trump and Biden both signed a series of major bills to revive and rebuild the U.S. economy. Altogether, they swelled the gross public debt from about $20 trillion in January 2017 to nearly $36 trillion today. The weight of that debt helped drive interest rates and mortgage rates higher. The burden of the pandemic debt, like learning losses, is likely to be with us for quite a long time.

Yet even while acknowledging all that went wrong, respecting all the lives lost or ruined, reckoning with all the lasting harms of the crisis, we do a dangerous injustice if we remember the story of COVID solely as a story of American failure. In truth, the story is one of strength and resilience.

Scientists did deliver vaccines to prevent the disease and treatments to recover from it. Economic policy did avert a global depression and did rapidly restore economic growth. Government assistance kept households afloat when the world shut down—and new remote-work practices enabled new patterns of freedom and happiness after the pandemic ended.

The virus was first detected in December 2019. Its genome was sequenced within days by scientists collaborating across international borders. Clinical trials for the Pfizer-BioNTech vaccine began in April 2020, and the vaccine was authorized for emergency use by the FDA in December. Additional vaccines rapidly followed, and were universally available by the spring of 2021. The weekly death toll fell by more than 90 percent from January 2021 to midsummer of that year.

The U.S. economy roared back with a strength and power that stunned the world. The initial spike of inflation has subsided. Wages are again rising faster than prices. Growth in the United States in 2023 and 2024 was faster and broader than in any peer economy.

Even more startling, the U.S. recovery outpaced China’s. That nation’s bounceback from COVID‑19 has been slow and faltering. America’s economic lead over China, once thought to be narrowing, has suddenly widened; the gap between the two countries’ GDPs grew from $5 trillion in 2021 to nearly $10 trillion in 2023. The U.S. share of world economic output is now slightly higher than it was in 1980, before China began any of its economic reforms. As he did in 2016, Trump inherits a strong and healthy economy, to which his own reckless policies—notably, his trade protectionism—are the only visible threat.

In public affairs, our bias is usually to pay most attention to disappointments and mistakes. In the pandemic, there were many errors: the partisan dogma of the COVID minimizers; the capitulation of states and municipalities to favored interest groups; the hypochondria and neuroticism of some COVID maximizers. Errors need to be studied and the lessons heeded if we are to do better next time. But if we fail to acknowledge America’s successes—even partial and imperfect successes—we not only do an injustice to the American people. We also defeat in advance their confidence to collectively meet the crises of tomorrow.

Perhaps it’s time for some national self-forgiveness here. Perhaps it’s time to accept that despite all that went wrong, despite how much there was to learn about the disease and how little time there was to learn it, and despite polarized politics and an unruly national character—despite all of that—Americans collectively met the COVID‑19 emergency about as well as could reasonably have been hoped.

The wrong people have profited from the immediate aftermath. But if we remember the pandemic accurately, the future will belong to those who rose to the crisis when their country needed them.

This article appears in the March 2025 print edition with the headline “Why the COVID Deniers Won.”

How the Tariff Whiplash Could Haunt Pricing

The Atlantic

www.theatlantic.com › newsletters › archive › 2025 › 02 › how-the-tariff-whiplash-could-haunt-pricing › 681617

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

When it comes to tariffs for Canada and Mexico, America is ending the week pretty much as it started. Over the course of just a few days, Donald Trump—following up on a November promise—announced 25 percent tariffs on the country’s North American neighbors, caused a panic in the stock market, eked out minor concessions from foreign leaders, and called the whole thing off (for 30 days, at least). But the residue of this week’s blink-and-you-missed-it trade war will stick.

The consensus among economists is that the now-paused tariffs on Canada and Mexico would have caused significant, perhaps even immediate, cost hikes and inflation for Americans. Tariffs on Mexico could have raised produce prices within days, because about a third of America’s fresh fruits and vegetables are imported from Mexico, Ernie Tedeschi, the director of economics at Yale’s Budget Lab, told me in an email. But “uncertainty about tariffs poses a strong risk of fueling inflation, even if tariffs don’t end up going into effect,” he argued. Tedeschi noted that “one of the cornerstone findings of economics over the past 50 years is the importance of expectations” when it comes to inflation. Consumers, nervous about inflation, may change their behavior—shifting their spending, trying to find higher-paying jobs, or asking for more raises—which can ultimately push up prices in what Tedeschi calls a “self-fulfilling prophecy.”

The drama of recent days may also make foreign companies balk at the idea of entering the American market. During Trump’s first term, domestic industrial production decreased after tariffs were imposed. Although Felix Tintelnot, an economics professor at Duke, was not as confident as Tedeschi is about the possibility of unimposed tariffs driving inflation, he suggested that the threats could have ripple effects on American business: “Uncertainty by itself is discouraging to investments that incur big onetime costs,” he told me. In sectors such as the auto industry, whose continental supply chains rely on border crossing, companies might avoid new domestic projects until all threats of a trade war are gone (which, given the persistence of Trump’s threats, may be never). That lack of investment could affect quality and availability, translating to higher costs down the line for American buyers. Some carmakers and manufacturers are already rethinking their operations, just in case.

And the 10 percent tariffs on China (although far smaller than the 60 percent Trump threatened during his campaign) are not nothing, either. These will hit an estimated $450 billion of imports—for context, last year, the United States imported about $4 trillion in foreign goods—and China has already hit back with new tariffs of its own. Yale’s Budget Lab found that the current China tariffs will raise overall average prices by 0.1 to 0.2 percent. Tariffs, Tedeschi added, are regressive, meaning they hurt lower-earning households more than high-income ones.
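To make the scale of that estimate concrete, here is a back-of-the-envelope sketch in Python of how a 10 percent tariff on roughly $450 billion of imports maps onto overall consumer prices. The import figure comes from the paragraph above; the $20 trillion consumer-spending denominator and the pass-through rates are rough assumptions for illustration only, not the Budget Lab's actual methodology.

# Rough illustration: price-level effect of a 10% tariff on ~$450B of imports.
# The ~$20 trillion consumer-spending figure and the pass-through rates are
# assumptions for this sketch, not Budget Lab inputs.
tariff_rate = 0.10               # 10 percent tariff on Chinese goods
affected_imports = 450e9         # ~$450 billion of affected imports (from the article)
consumer_spending = 20e12        # assumed annual U.S. consumer spending, ~$20 trillion

for pass_through in (0.5, 1.0):  # share of the tariff passed on to consumers
    extra_cost = tariff_rate * affected_imports * pass_through
    bump = extra_cost / consumer_spending
    print(f"pass-through {pass_through:.0%}: ~{bump:.2%} rise in overall prices")

Run as written, the sketch lands at roughly a tenth to a quarter of a percentage point, in the neighborhood of the Budget Lab's published 0.1-to-0.2-percent range.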

Even the most attentive companies and shoppers might have trouble anticipating how Trump will handle future tariffs. Last month, he threatened and then dropped a tariff on Colombia; this week, he hinted at a similar threat against the European Union. There is a case to be made that Trump was never serious about tariffs at all—they were merely a way for him to appear tough on trade and flex his power on the international stage. And although many of the concessions that Mexico and Canada offered were either symbolic or had been in the works before the tariff threats, Trump managed to appear like the winner to some of his supporters.

Still, the longest-lasting damage of the week in trade wars may be the solidification of America’s reputation as a fickle ally. As my colleague David Frum wrote on Wednesday, the whole episode leaves the world with the lesson that “countries such as Canada, Mexico, and Denmark that commit to the United States risk their security and dignity in the age of Trump.”

Related:

The tariffs were never real.
How Trump lost his trade war

Here are three new stories from The Atlantic:

The government’s computing experts say they are terrified.
Trump takes over the Kennedy Center.
Gary Shteyngart: The man in the midnight-blue six-ply Italian-milled wool suit

Today’s News

A federal judge said he would issue a temporary restraining order that would pause parts of the Trump administration’s plan to slash the USAID workforce and withdraw employees from their overseas posts.
Donald Trump met with Japanese Prime Minister Shigeru Ishiba at the White House, where they discussed reducing the U.S.’s trade deficit with Japan.
A plane carrying 10 people went missing in western Alaska while en route from Unalakleet to Nome.

Dispatches

The Books Briefing: Boris Kachka examines a new, unbearably honest kind of writing.
Atlantic Intelligence: For a time, it took immense wealth—not to mention energy—to train powerful new AI models, Damon Beres writes. “That may no longer be the case.”


Evening Read


The Rise of the Selfish Plutocrats

By Brian Klaas

The role of the ultra-wealthy has morphed from one of shared social responsibility and patronage to the freewheeling celebration of selfish opulence. Rather than investing in their society—say, by giving alms to the poor, or funding Caravaggios and cathedrals—many of today’s plutocrats use their wealth to escape to private islands, private Beyoncé concerts, and, above all, extremely private superyachts. One top Miami-based “yacht consultant” has dubbed itself Medici Yachts. The namesake recalls public patronage and social responsibility, but the consultant’s motto is more fitting for an era of indulgent billionaires: “Let us manage your boat. For you is only to smile and make memories.”


More From The Atlantic

Paranoia is winning.
Americans are trapped in an algorithmic cage.
A Greenland plot more cynical than fiction
Civil servants are not America’s enemies.
The challenges the U.S. would face in Gaza

Culture Break


Stay in the loop. Here are 10 indie movies you should watch for in 2025.

Discover. David Lynch’s work was often described as “mysterious” or “surreal”—but the emotions it provoked were just as fundamental, K. Austin Collins writes.

Play our daily crossword.

Stephanie Bai contributed to this newsletter.


The Race-Blind College-Admissions Era Is Off to a Weird Start

The Atlantic

www.theatlantic.com › ideas › archive › 2025 › 02 › affirmative-action-yale-admissions › 681541


When colleges began announcing the makeup of their incoming freshman classes last year—the first admissions cycle since the Supreme Court outlawed affirmative action—there seemed to have been some kind of mistake. The Court’s ruling in Students for Fair Admissions v. Harvard had been almost universally expected to produce big changes. Elite universities warned of a return to diversity levels not seen since the early 1960s, when their college classes had only a handful of Black students.

And yet, when the numbers came in, several of the most selective colleges in the country reported the opposite results. Yale, Dartmouth, Northwestern, the University of Virginia, Wesleyan, Williams, and Bowdoin all ended up enrolling more Black or Latino students, or both. Princeton and Duke appear to have kept their demographics basically stable.

These surprising results raise two competing possibilities. One is that top universities can preserve racial diversity without taking race directly into account in admissions. The other, favored by the coalition that successfully challenged affirmative action in court, is that at least some of the schools are simply ignoring the Supreme Court’s ruling—that they are, in other words, cheating. Finding out the truth will likely require litigation that could drag on for years. Although affirmative action was outlawed in 2023, the war over the use of race in college admissions is far from over.

History strongly suggested that the end of affirmative action would be disastrous for diversity in elite higher education. (Most American colleges accept most applicants and therefore didn’t use affirmative action in the first place.) In the states that had already banned the practice for public universities, the share of Black and Latino students enrolled at the most selective flagship campuses immediately plummeted. At UC Berkeley, for example, underrepresented minorities made up 22 percent of the freshman class in 1997. In 1998, after California passed its affirmative-action ban, that number fell to 11 percent. Many of these schools eventually saw a partial rebound, but not enough to restore their previous demographic balance.

Something similar happened at many selective schools in the aftermath of the Supreme Court’s 2023 ruling. At Harvard and MIT, for example, Black enrollment fell by more than 28 and 60 percent, respectively, compared with the average of the two years prior to the Court’s decision. But quite a few institutions defied expectations. At Yale, Black and Latino enrollment increased, while Asian American enrollment fell by 16 percent compared with recent years. Northwestern similarly saw its Black and Latino populations increase by more than 10 percent, while Asian and white enrollment declined. (In Students for Fair Admissions, the Court had found that Harvard’s race-conscious admissions policies discriminated against Asian applicants.)

[Rose Horowitch: The perverse consequences of tuition-free medical school]

Figuring out how this happened is not easy. Universities have always been cagey about how they choose to admit students; the secrecy ostensibly prevents students from trying to game the process. (It also prevents embarrassment: When details have come out, usually through litigation, they have typically not been flattering.) Now, with elite-college admissions under more scrutiny than usual, they’re even more wary of saying too much. When I asked universities for further details about their response to the ruling, Dartmouth, Bowdoin, and Williams declined to comment, Yale and Northwestern pointed me toward their vague public statements, and a Princeton spokesperson said that “now race plays no role in admissions decisions.” Duke did not reply to requests for comment.

The information gap has led outside observers to piece together theories with the data they do have. One possibility is that universities such as Yale and Princeton are taking advantage of some wiggle room in the Supreme Court’s ruling. “Nothing in this opinion should be construed as prohibiting universities from considering an applicant’s discussion of how race affected his or her life, be it through discrimination, inspiration, or otherwise,” Chief Justice John Roberts wrote in his majority opinion. This seemed to provide an indirect way to preserve race-consciousness in admissions. “It’s still legal to pursue diversity,” Sonja Starr, a law professor at the University of Chicago, told me. Her research shows that 43 of the 65 top-ranked universities have essay prompts that ask applicants about their identity or adversity; eight made the addition after the Court’s decision.

Another theory is that universities have figured out how to indirectly preserve racial diversity by focusing on socioeconomic status rather than race itself. In 2024, Yale’s admissions department began factoring in data from the Opportunity Atlas, a project run by researchers at Harvard and the U.S. Census Bureau that measures the upward mobility of children who grew up in a given neighborhood. It also increased recruitment and outreach in low-income areas. Similarly, Princeton announced that it would aim to increase its share of students who are eligible for financial aid. “In the changed legal environment, the University’s greatest opportunity to attract diverse talent pertains to socioeconomic diversity,” a committee designed to review race-neutral admissions policies at the college wrote.

Some evidence supports the “socioeconomics, not race” theory. Dartmouth announced that it had increased its share of low-income students eligible for federal Pell grants by five percentage points. Yale has said that last year’s incoming freshman class would have the greatest share of first-generation and low-income students in the university’s history. Richard Kahlenberg, a longtime proponent of class-based affirmative action who testified on behalf of the plaintiffs challenging Harvard’s admissions policies, told me that, by increasing economic diversity as a proxy for race, elite colleges have brought in the low-income students of color whom purely race-based affirmative action had long allowed them to overlook. (In recent years, almost three-quarters of the Black and Hispanic students at Harvard came from the wealthiest 20 percent of those populations nationally.) “While universities had been claiming that racial preferences were the only way they could create racial diversity, in fact, if we assume good faith on the part of the universities, they have found ways to achieve racial diversity without racial preferences,” Kahlenberg said.

[Richard Kahlenberg: The affirmative action that colleges really need]

If we assume good faith—that’s a big caveat. Not everyone is prepared to give universities the benefit of the doubt. Edward Blum, the president of Students for Fair Admissions, the plaintiff in the case that ended affirmative action, has already accused Yale, Princeton, and Duke of cheating. And Richard Sander, a law professor at UCLA and a critic of affirmative action, said that if a university’s Black enrollment numbers are still above 10 percent, “then I don’t think there’s any question that they’re engaged in illegal use of preferences.”

The skeptics’ best evidence is the fact that the universities accused of breaking the rules haven’t fully explained how they got their results. Yale, for example, has touted its use of the Opportunity Atlas, but hasn’t shared how it factors information from the tool into admissions decisions. Before the Court’s ruling, a Black student was four times more likely to get into Harvard than a white student with comparable scores, and a Latino applicant about twice as likely.

To keep numbers stable, race-neutral alternatives would have to provide a comparable boost. According to simulations presented to the Supreme Court, universities would have to eliminate legacy and donor preferences and slightly lower their average SAT scores to keep demographics constant without considering race. (In oral arguments, one lawyer compared the change in test scores to moving “from Harvard to Dartmouth.”) With minor exceptions, selective universities have given no indication that they’ve made either of those changes.

Even the data that exist are not totally straightforward to interpret. Some universities have reported an uptick in the percentage of students who chose not to report their race in their application. If that group skews white and Asian, as research suggests it does, then the reported share of Black and Latino students could be artificially inflated. And then there’s the question of how many students choose to accept a university’s offer of admission, which schools have little control over. Wesleyan, for example, accepted fewer Black applicants than it had in prior years, Michael Roth, the university’s president, told me. But a larger share chose to matriculate—possibly, Roth said, because even-more-selective schools had rejected them. The University of Virginia similarly had an unusually high yield among Black students, according to Greg Roberts, its dean of admissions. He couldn’t tell whether this was thanks to the school’s outreach efforts or just a coincidence. “I think what we’re doing is important, but to the extent it will consistently impact what the class looks like, I have no idea,” he told me. (Both Roth and Roberts, the only university administrators who agreed to be interviewed for this article, assured me that their institutions had obeyed the Court’s ruling.)

None of those alternative explanations is likely to sway the people who are convinced the schools cheated. With Donald Trump back in office, colleges that don’t see a meaningful uptick in Asian enrollees will likely face civil-rights investigations, says Josh Dunn, a law professor at the University of Tennessee at Knoxville. “If everything ends up looking exactly like it did prior to SFFA,” he told me, then the courts will “probably think that the schools were not trying to comply in good faith.”

Blum, the head of Students for Fair Admissions, has already threatened to sue Yale, Princeton, and Duke if they don’t release numbers proving to his satisfaction that they have complied with the law. (Blum declined to be interviewed for this article.) A new lawsuit could force universities to turn over their admissions data, which should reveal what’s really going on. It could also invite the Court to weigh in on new questions, including the legality of race-neutral alternatives to affirmative action that are adopted with racial diversity in mind. A resolution to any of these issues would take years to arrive.

In many ways, the endless fight over affirmative action is a proxy for the battle over what uber-selective universities are for. Institutions such as Harvard and Yale have long been torn between conflicting aims: on the one hand, creating the next generation of leaders out of the most accomplished applicants; on the other, serving as engines of social mobility for promising students with few opportunities. It will take much more than the legal demise of affirmative action to put that debate to rest.

America Wouldn’t Know the Worst of a Vaccine Decline Until It’s Too Late

The Atlantic

www.theatlantic.com › health › archive › 2025 › 01 › rfk-jr-vaccine-decline › 681489

Becoming a public-health expert means learning how to envision humanity’s worst-case scenarios for infectious disease. For decades, though, no one in the U.S. has had to consider the full danger of some of history’s most devastating pathogens. Widespread vaccination has eliminated several diseases—among them, measles, polio, and rubella—from the country, and kept more than a dozen others under control. But in the past few years, as childhood-vaccination rates have dipped nationwide, some of infectious disease’s ugliest hypotheticals have started to seem once again plausible.

The new Trump administration has only made the outlook more tenuous. Should Robert F. Kennedy Jr., one of the nation’s most prominent anti-vaccine activists, be confirmed as the next secretary of Health and Human Services, for instance, his actions could make a future in which diseases resurge in America that much more likely. His new position would grant him substantial power over the FDA and the CDC, and he is reportedly weighing plans—including one to axe a key vaccine advisory committee—that could prompt health-care providers to offer fewer shots to kids, and inspire states to repeal mandates for immunizations in schools. (Kennedy’s press team did not respond to a request for comment.)

Kennedy’s goal, as he has said, is to offer people more choice, and many Americans likely would still enthusiastically seek out vaccines. Most Americans support childhood vaccination and vaccine requirements for schools; a KFF poll released today found, though, that even in the past year the proportion of parents who say they skipped or delayed shots for their children has risen, to one in six. The more individuals who choose to eschew vaccination, the closer those decisions would bring society’s collective defenses to cracking. The most visceral effects might not be obvious right away. For some viruses and bacteria to break through, the country’s immunization rates may need to slip quite a bit. But for others, the gap between no outbreak and outbreak is uncomfortably small. The dozen experts I spoke with for this story were confident in their pessimism about how rapidly epidemics might begin.

[Read: How America’s fire wall against disease starts to fail]

Paul Offit, a pediatrician at Children’s Hospital of Philadelphia and co-inventor of one of the two rotavirus vaccines available in the U.S., needs only to look at his own family to see the potential consequences. His parents were born into the era of the deadly airway disease diphtheria; he himself had measles, mumps, rubella, and chickenpox, and risked contracting polio. Vaccination meant that his own kids didn’t have to deal with any of these diseases. But were immunization rates to fall too far, his children’s children very well could. Unlike past outbreaks, those future epidemics would sweep across a country that, having been free of these diseases for so long, is no longer equipped to fight them.

“Yeah,” Offit said when I asked him to paint a portrait of a less vaccinated United States. “Let’s go into the abyss.”

Should vaccination rates drop across the board, one of the first diseases to be resurrected would almost certainly be measles. Experts widely regard the viral illness, which spreads through the air, as the most infectious known pathogen. Before the measles vaccine became available in 1963, the virus struck an estimated 3 million to 4 million Americans each year, about 1,000 of whom would suffer serious swelling of the brain and roughly 400 to 500 of whom would die. Many survivors had permanent brain damage. Measles can also suppress the immune system for years, leaving people susceptible to other infections.

Vaccination was key to ridding the U.S. of measles, declared eliminated here in 2000. And very high rates of immunity—about 95 percent vaccine coverage, experts estimate—are necessary to keep the virus out. “Just a slight dip in that is enough to start spurring outbreaks,” Boghuma Kabisen Titanji, an infectious-disease physician at Emory University, told me. Which has been exactly the case. Measles outbreaks do still occur in American communities where vaccination rates are particularly low, and as more kids have missed their MMR shots in recent years, the virus has found those openings. The 16 measles outbreaks documented in the U.S. in 2024 made last year one of the country’s worst for measles since the turn of the millennium.

But for all measles’ speed, “I would place a bet on whooping cough being first,” Samuel Scarpino, an infectious-disease modeler at Northeastern University, told me. The bacterial disease can trigger months of coughing fits violent enough to fracture ribs. Its severest consequences include pneumonia, convulsions, and brain damage. Although slower to transmit than measles, it has never been eliminated from the U.S., so it’s poised for rampant spread. Chickenpox poses a similar problem. Although corralled by an effective vaccine in the 1990s, the highly contagious virus still percolates at low levels through the country. Plenty of today’s parents might still remember the itchy blisters it causes as a rite of passage, but the disease’s rarer complications can be as serious as sepsis, uncontrolled bleeding, and bacterial infections known as “flesh-eating disease.” And the disease is much more serious in older adults.

Those are only some of the diseases the U.S. could have to deal with. Kids who get all of the vaccines routinely recommended in childhood are protected against 16 diseases—each of which would have some probability of making a substantial comeback, should uptake keep faltering. Perhaps rubella would return, infecting pregnant women, whose children could be born blind or with heart defects. Maybe meningococcal disease, pneumococcal disease, or Haemophilus influenzae disease, each caused by bacteria commonly found in the airway, would skyrocket, and with them rates of meningitis and pneumonia. The typical ailments of childhood—day-care colds, strep throat, winter norovirus waves—would be joined by less familiar and often far more terrifying problems: the painful, swollen necks of mumps; the parching diarrhea of rotavirus; the convulsions of tetanus. For far too many of these illnesses, “the only protection we have,” Stanley Plotkin, a vaccine expert and one of the developers of the rubella vaccine, told me, “is a vaccine.”

Exactly how and when outbreaks of these various diseases could play out—if they do at all—is impossible to predict. Vaccination rates likely wouldn’t fall uniformly across geographies and demographics. They also wouldn’t decrease linearly, or even quickly. People might more readily refuse vaccines that were developed more recently and have been politicized (think HPV or COVID shots). And existing immunity could, for a time, still buffer against an infectious deluge, especially from pathogens that remain quite rare globally. Polio, for instance, would be harder than measles to reestablish in the United States: It was declared eliminated from the Americas in the 1990s, and remains endemic to only two countries. This could lead to a false impression that declining vaccination rates have little impact.

A drop in vaccination rates, after all, doesn’t guarantee an outbreak—a pathogen must first find a vulnerable population. This type of chance meeting could take years. Then again, infiltrations might not take long in a world interconnected by travel. The population of this country is also more susceptible to disease than it has been in past decades. Americans are, on average, older; obesity rates are at a historical high. The advent of organ transplants and cancer treatments has meant that a substantial sector of the population is immunocompromised; many other Americans are chronically ill. Some of these individuals don’t mount protective responses to vaccinations at all, which leaves them reliant on immunity in others to keep dangerous diseases at bay.

If various viruses and bacteria began to recirculate in earnest, the chance of falling ill would increase even for healthy, vaccinated adults. Vaccines don't offer comprehensive or permanent protection, and the more a pathogen circulates, the greater its chance of breaking through any one person's defenses. Immunity against mumps and whooping cough is incomplete, and known to wane in the years after vaccination. And although immunity generated by the measles vaccine is generally thought to be quite durable, experts can't say for certain how durable, Bill Hanage, an infectious-disease epidemiologist at Harvard's School of Public Health, told me: The only true measure would be to watch the virus tear through a population that hasn't dealt with it in decades.

Perhaps the most unsettling feature of a less vaccinated future, though, is how unprepared the U.S. is to confront a resurgence of pathogens. Most health-care providers in the country no longer have the practical knowledge to diagnose and treat diseases such as measles and polio, Kathryn Edwards, a pediatrician at Vanderbilt University, told me: They haven’t needed it. Many pediatricians have never even seen chickenpox outside of a textbook.

To catch up, health-care providers would need to familiarize themselves with signs and symptoms they may have seen only in old textbooks or in photographs. Hospitals would need to use diagnostic tests that haven’t been routine in years. Some of those tools might be woefully out of date, because pathogens have evolved; antibiotic resistance could also make certain bacterial infections more difficult to expunge than in decades prior. And some protocols may feel counterintuitive, Offit said: The ultra-contagiousness of measles could warrant kids with milder cases being kept out of health-care settings, and kids with Haemophilus influenzae might need to be transported to the hospital without an ambulance, to minimize the chances that the stress and cacophony would trigger a potentially lethal spasm.

[Read: Here’s how we know RFK Jr. is wrong about vaccines]

The learning curve would be steep, Titanji said, stymieing care for the sick. The pediatric workforce, already shrinking, might struggle to meet the onslaught, leaving kids—the most likely victims of future outbreaks—particularly susceptible, Sallie Permar, the chief pediatrician at NewYork–Presbyterian/Weill Cornell Medical Center, told me. If already overstretched health-care workers were further burdened, they’d be more likely to miss infections early on, making those cases more difficult to treat. And if epidemiologists had to keep tabs on more pathogens, they’d have less capacity to track any single infectious disease, making it easier for one to silently spread.

The larger outbreaks grow, the more difficult they are to contain. Eventually, measles could once again become endemic in the U.S. Polio could soon follow suit, imperiling the fight to eradicate the disease globally, Virginia Pitzer, an infectious-disease epidemiologist at Yale, told me. In a dire scenario—the deepest depths of the abyss—average lifespans in the U.S. could decline, as older people more often fall sick, and more children under 5 die. Rebottling many of these diseases would be a monumental task. Measles was brought to heel in the U.S. only by decades of near-comprehensive vaccination; re-eliminating it from the country would require the same. But the job this time would be different, and arguably harder—not merely coaxing people into accepting a new vaccine, but persuading them to take one that they’ve opted out of.

That future is by no means guaranteed—especially if Americans recall what is at stake. Many people in this country are too young to remember the cost these diseases exacted. But Edwards, who has been a pediatrician for 50 years, is not. As a young girl, she watched a childhood acquaintance be disabled by polio. She still vividly recalls patients she lost to meningitis decades ago. The later stages of her career have involved fewer spinal taps, fewer amputations. Because of vaccines, the job of caring for children, nowadays, simply involves far less death.

How America’s Fire Wall Against Disease Starts to Fail

The Atlantic

www.theatlantic.com › health › archive › 2025 › 01 › rfk-vaccine-acip › 681405

For more than 60 years, vaccination in the United States has been largely shaped by an obscure committee tasked with advising the federal government. In almost every case, the nation’s leaders have accepted in full the group’s advice on who should get vaccines and when. Experts I asked could recall only two exceptions. Following 9/11, the Bush administration expanded the group who’d be given smallpox vaccinations in preparation for the possibility of a bioterrorism attack, and at the height of the coronavirus pandemic, in 2021, the Biden administration added high-risk workers to the groups urged to receive a booster shot. Otherwise, what the Advisory Committee on Immunization Practices (ACIP) has recommended has effectively become the country’s unified vaccination policy.

This might soon change. Robert F. Kennedy Jr., one of the nation’s most prominent anti-vaccine activists and the likely next secretary of Health and Human Services, has said that he would not “take away” any vaccines. But Kennedy, if confirmed, would have the power to entirely remake ACIP, and he has made clear that he wants to reshape how America approaches immunity. Gregory Poland, the president of the Atria Academy of Science and Medicine and a former ACIP member, told me that if he were out to do just that, one of the first things he’d do is “get rid of or substantially change” the committee.

Over the years, the anti-vaccine movement has vehemently criticized ACIP's recommendations and accused its members of conflicts of interest. NBC News has reported that, in a 2017 address, Kennedy himself said, "The people who are on ACIP are not public-health advocates … They work for the vaccine industry." Kennedy has not publicly laid out explicit plans to reshuffle the makeup or charter of ACIP, and his press team did not return a request for comment. But should he repopulate ACIP with members whose views hew closer to his own, those alterations would be a bellwether for this country's future preparedness—or lack thereof—against the world's greatest infectious threats.

[Read: ‘Make America Healthy Again’ sounds good until you start asking questions]

Before ACIP existed, the task of urging the public to get vaccinated was largely left to professional organizations, such as the American Academy of Pediatrics, or ad hoc groups that evaluated one immunization at a time. By the 1960s, though, so many new vaccines had become available that the federal government saw the benefit of establishing a permanent advisory group. Today, the committee includes up to 19 voting members who are experts drawn from fields such as vaccinology, pediatrics, virology, and public health, serving four-year terms. The CDC solicits nominations for new members, but the HHS secretary, who oversees the CDC and numerous other health-related agencies, ultimately selects the committee; the secretary can also remove members at their discretion. The committee “is intended to be a scientific body, not a political body,” Grace Lee, who chaired ACIP through the end of 2023, told me. ACIP’s charter explicitly states that committee members cannot be employed by vaccine manufacturers, and must disclose real and perceived conflicts of interest.

HHS Secretaries typically do not meddle extensively with ACIP membership or its necessarily nerdy deliberations, Jason Schwartz, a vaccine-policy expert at Yale, told me. The committee’s job is to rigorously evaluate vaccine performance and safety, in public view, then use that information to help the CDC make recommendations for how those immunizations should be used. Functionally, that means meeting for hours at a time to pore over bar graphs and pie charts and debate the minutiae of immunization efficacy. Those decisions, though, have major implications for the country’s defense against disease. ACIP is the primary reason the United States has, since the 1990s, had an immunization schedule that physicians across the country treat as a playbook for maintaining the health of both adults and kids, and that states use to guide school vaccine mandates.

The committee’s decisions have, over the years, turned the tide against a slew of diseases. ACIP steered the U.S. toward giving a second dose of the MMR vaccine to children before elementary school, rather than delaying it until early adolescence, in order to optimally protect kids from a trifecta of debilitating viruses. (Measles was declared eliminated in the U.S. in 2000.) The committee spurred the CDC’s recommendation for a Tdap booster during the third trimester of pregnancy, which has guarded newborn babies against whooping cough. It pushed the country to switch to an inactivated polio vaccine at the turn of the millennium, helping to prevent the virus from reestablishing itself in the country.

[Read: We’re about to find out how much Americans like vaccines]

I reached out to both current ACIP members and the Department of Health and Human Services to ask about Kennedy's pending influence over the committee. ACIP Chair Helen K. Talbot and other current ACIP members emphasized the group's importance to keeping the U.S. vaccinated, but declined to comment about politically motivated changes to its membership. The Department of Health and Human Services did not return a request for comment.

Should ACIP end up stacked with experts whose views mirror Kennedy’s, “it’s hard not to imagine our vaccination schedules looking different over the next few years,” Schwartz told me. Altered recommendations might make health-care providers more willing to administer shots to children on a delayed schedule, or hesitate to offer certain shots to families at all. Changes to ACIP could also have consequences for vaccine availability. Pharmaceutical companies might be less motivated to manufacture new shots for diseases that jurisdictions or health-care providers are no longer as eager to vaccinate against. Children on Medicaid receive free vaccines based on an ACIP-generated list, and taking a particular shot off that roster might mean that those kids will no longer receive that immunization at all.

At one extreme, the new administration could, in theory, simply disband the committee altogether, Schwartz told me, and have the government unilaterally lay down the country’s vaccination policies. At another, the CDC director, who has never been beholden to the committee’s advice, could begin ignoring it more often. (Trump’s choice to lead the CDC, the physician and former Florida congressman Dave Weldon, has been a critic of the agency and its vaccine program.) Most likely, though, the nation’s new health leaders will choose to reshape the committee into one whose viewpoints would seem to legitimize their own. The effects of these choices might not be obvious at first, but a committee that has less academic expertise, spends less time digging into scientific data, and is less inclined to recommend any vaccines could, over time, erode America’s defenses—inviting more disease, and more death, all of it preventable.

The Saint America Needs Now

The Atlantic

www.theatlantic.com › magazine › archive › 2025 › 02 › saint-francis-counterculture-charity-kindness › 681095

It’s a peculiar symptom of where we’re at—caught between phases of consciousness, between the ruins of one world and the unknown shape of the next—to be seeing two things at the same time. Or to be seeing the same thing in two ways simultaneously. Stuck in the transition, we’re condemned to a species of double vision: cross-eyed, as it were, in the cross-fade. And sometimes, sometimes, this can be quite useful. When you meet a guy, for example, like Francis of Assisi.

Genius or crackpot? Both. Sensuous embracer of life or self-mortifying freak? Both. Exhibitionist or recluse? Anarchist or company man? Runaway rich kid or true voice of the rejected? Both, both, and both. And when God spoke to him in 1206, his voice issuing from a crucifix and saying, “Francis, do you not see that my house is falling into ruin? Go, therefore, and repair the house,” did God mean the dilapidated, bat-flitted, holes-in-the-roof church in which Francis, at that moment, happened to be kneeling? Or did he mean the whole of medieval Christendom? He meant, of course—are you getting the idea?—both.

Volker Leppin’s Francis of Assisi, newly translated from the German by Rhys S. Bezzant, is subtitled The Life of a Restless Saint, and the restlessness of the subject is shared by the author. His book, Leppin writes, “does not present itself as a biography in the classic sense.” Which is not to suggest that Leppin, a professor of historical theology at Yale, has written some kind of jazzy meta-book. But Francis of Assisi does have double vision, maneuvering constantly between hagiography and history, legend and fact, heaven and Earth, miracles and—what’s the opposite of miracles? Leppin comes not to debunk but rather to discover in what fashion those early, physics-defying accounts of Francis, the tales told within the blast radius of his actual presence, might be understood as true.

Francis was born around 1181, in Assisi in central Italy, the son of a well-to-do merchant named Pietro di Bernardone. After that, the story gets hazy. Some versions would have him quite a nicely behaved youth; in others, the more fun ones, he’s a profligate, a sybarite, a tearaway. Seeking honors on the battlefield, he signs up for one of the endless local town-on-town skirmishes, only to be swiftly captured and imprisoned. When he gets out, a year or so later, the changes begin: conversion.

[From the August 2000 issue: Being St. Francis]

Francis tears off his fancy clothes; he kisses lepers; he starts begging. It’s all a bit unbalanced. He turns his back on privilege and plunges madly downward. (Perhaps this is the point in the story at which Francis—were he trying something similar today, here in America—would find himself scooped up by psychiatry and institutionalized, or at the very least heavily medicated, at the behest of his family maybe, or he’d go rattling unattended into the tunnels of the justice system.)

Desperate to impoverish himself, he tries to donate a large amount of his father’s money to a local church; the priest, afraid of Bernardone Sr., refuses it, whereupon Francis—the anti-alchemist, King Midas in reverse, turning gold back to base metal—casts the money scornfully aside, “valuing it,” as Saint Bonaventure wrote in his 13th-century Life of Saint Francis, “no more than dust that is trodden under foot.”

But gradually, via great humiliations, a stint in a cave, and a complete rupture with his father, these lungings and impetuosities resolve themselves into the properly achieved Franciscan humor, a kind of continual outrageous sanctity. Francis becomes Francis, and he begins to attract followers. What he’s doing is pretty straightforward. He’s living—actually living—by the words of Jesus: Love your neighbor, give it all away, praise God, and don’t worry about tomorrow.

[From the June 2022 issue: How politics poisoned the evangelical Church]

Pretty straightforward, and a head-on challenge to the world. It is no longer enough, for example, to give alms to the lepers and walk off feeling pious: Now, like Francis and his brothers, you have to accompany the lepers. You have to stand with them in what Leppin calls “the world of the excluded,” of the lowest in society, which in the cosmic reversal effected by the Gospels turns out to be the highest place on Earth.

To get in touch with the miraculous Francis, the folkloric Francis, read the Fioretti, or The Little Flowers of St. Francis, a 14th-century collection of tales about the saint and his friars. It’s a beautiful book. Here we find Francis “raising his face to heaven” like a solar panel, taming wolves and preaching to the birds and subsisting for weeks on half a loaf of bread to “cast the venom of vainglory from him.” We see him healing a leper, and then, when that leper dies (“of another infirmity”) a couple of weeks later, encountering the man’s heaven-bound soul whooshing past him in a wood.

We see him—in a typically self-condemning mood, regarding himself as the vilest of sinners and the basest of men—earnestly instruct Brother Leo to tell him, “Truly thou dost merit the deepest hell.” And Leo tries to say it—he tries his best—but when he opens his mouth, what comes bulbing out instead, Jim Carrey–style, is, “God will perform so many good works through thee that thou shalt go to paradise.” Francis, peeved, renews the effort, enjoining Leo this time to tell him, “Verily thou art worthy of being numbered among the accursed.” Again Leo assents, but the words that come through him, rebelliously, are, “O Friar Francis, God will do in such wise that among the blessed thou shalt be singularly blessed.” And repeat. It has the rhythm of an SNL sketch. We also meet the amazing, more-Francis-than-Francis Brother Juniper, a figure of such affronting innocence that Francis himself, when he’s wrangling a particularly tenacious demon, simply has to mention Juniper’s name to make the demon flee.

G. K. Chesterton wrote very beautifully about Francis. For him, the saint’s jangling polarities resolve themselves quite naturally if we imagine him as a lover: Francis was in love with God, so he did all the crazy zigzag things that lovers do. The feats, the ecstasies, the prostrations and abnegations. And he loved the Church too. “Francis,” Leppin notes, “certainly did not engage in any polemic against the clergy.” It never occurred to him to question directly the institutions and practices of Catholicism: The polemic, so to speak, was himself. The story goes that when he went to Rome to get Pope Innocent III’s blessing, and Innocent said something waspish about him looking like a swineherd, Francis left the papal court, found a couple of pigs in the street, rolled around companionably in their pig-mess, and then came back.

Did that really happen? Does it matter? A story like that, we need it to be true. And right now we need Saint Francis. Now that kindness is countercultural, we need his extremes of wild charity to pull us back toward it. And we need his asceticism: His self-denial, his merry disdain of health and comfort and security, is a rebuke to our self-care. There are no safe spaces, and no guarantees—the only stability is the bottomlessness of divine love. The trapdoor held open by grace. So we take the hand of Francis, and down we go.

This article appears in the February 2025 print edition with the headline “The Wild Charity of Saint Francis.”

The Coalition Collapse That Doomed Biden’s Presidency

The Atlantic

www.theatlantic.com › politics › archive › 2025 › 01 › coalition-collapse-biden-carter › 681254

Presidents whom most voters view as failures, justifiably or not, have frequently shaped American politics long after they leave office—notably, by paving the way for presidencies considered much more successful and consequential. As President Joe Biden nears his final days in office, his uneasy term presents Democrats with some uncomfortable parallels to their experience with Jimmy Carter, whose state funeral takes place this week in Washington, D.C.

The former Georgia governor’s victory in 1976 initially offered the promise of revitalizing the formidable electoral coalition that had delivered the White House to Democrats in seven of the nine presidential elections from 1932 (won by Franklin D. Roosevelt) to 1964 (won by Lyndon B. Johnson), and had enabled the party to enact progressive social policies for two generations. But the collapse of his support over his four years in office, culminating in his landslide defeat by Ronald Reagan in 1980, showed that Carter’s electoral victory was instead that coalition’s dying breath. Carter’s troubled term in the White House proved the indispensable precondition to Reagan’s landmark presidency, which reshaped the competition between the two major parties and enabled the epoch-defining ascendancy of the new right.

The specter of such a turnabout now haunts Biden and his legacy. Despite his many accomplishments in the White House, the November election’s outcome demonstrated that his failures—particularly on the public priorities of inflation and the border—eclipsed his successes for most voters. As post-election surveys made clear, disapproval of the Biden administration’s record was a liability that Vice President Kamala Harris could not escape.

Biden’s unpopularity helped Donald Trump make major inroads among traditionally Democratic voting blocs, just as the widespread discontent over Carter’s performance helped Reagan peel away millions of formerly Democratic voters in 1980. If Trump can cement in office the gains he made on Election Day—particularly among Latino, Asian American, and Black voters—historians may come to view Biden as the Carter to Trump’s Reagan.

In his landmark 1993 book, The Politics Presidents Make, the Yale political scientist Stephen Skowronek persuasively argued that presidents succeed or fail according to not only their innate talents but also the timing of their election in the long-term cycle of political competition and electoral realignment between the major parties.

Most of the presidents who are remembered as the most successful and influential, Skowronek showed, came into office after decisive elections in which voters sweepingly rejected the party that had governed the country for years. The leaders Skowronek places in this category include Thomas Jefferson after his election in 1800, Andrew Jackson in 1828, Abraham Lincoln in 1860, Roosevelt in 1932, and Reagan in 1980.

These dominating figures, whom Skowronek identifies as men who "stood apart from the previously established parties," typically rose to prominence with a promise "to retrieve from a far distant, even mythic, past fundamental values that they claimed had been lost." Trump fits this template with his promises to "make America great again," and he also displays the twin traits that Skowronek describes as characteristic of the predecessors Trump hopes to emulate: repudiating the existing terms of political competition and becoming a reconstructive leader of a new coalition.

The great repudiators, in Skowronek’s telling, were all preceded by ill-fated leaders who’d gained the presidency representing a once-dominant coalition that was palpably diminished by the time of their election. Skowronek placed in this club John Adams, John Quincy Adams, Franklin Pierce, James Buchanan, Herbert Hoover, and Carter. Each of their presidencies represented a last gasp for the party that had won most of the general elections in the years prior. None of these “late regime” presidents, as Skowronek called them, could generate enough success in office to reverse their party’s declining support; instead, they accelerated it.

The most recent such late-regime president, Carter, was elected in 1976 after Richard Nixon’s victories in 1968 and 1972 had already exposed cracks in the Democrats’ New Deal coalition of southerners, Black voters, and the white working class. Like many of his predecessors in the dubious fraternity of late-regime presidents, Carter recognized that his party needed to recalibrate its message and agenda to repair its eroding support. But the attempt to set a new, generally more centrist direction for the party foundered.

Thanks to rampant inflation, energy shortages, and the Iranian hostage crisis, Carter was whipsawed between a rebellion from the left (culminating in Senator Edward Kennedy’s primary challenge) and an uprising on the right led by Reagan. As Carter limped through his 1980 reelection campaign, Skowronek wrote, he had become “a caricature of the old regime’s political bankruptcy, the perfect foil for a repudiation of liberalism itself as the true source of all the nation’s problems.”

Carter’s failures enabled Reagan to entrench the electoral realignment that Nixon had started. In Reagan’s emphatic 1980 win, millions of southern white conservatives, including many evangelical Christians, as well as northern working-class white voters renounced the Democratic affiliation of their parents and flocked to Reagan’s Republican Party. Most of those voters never looked back.

The issue now is whether Biden will one day be seen as another late-regime president whose perceived failures hastened his party’s eclipse among key voting blocs. Pointing to his record of accomplishments, Biden advocates would consider the question absurd: Look, they say, at the big legislative wins, enormous job growth, soaring stock market, historic steps to combat climate change, skilled diplomacy that united allies against Russia’s invasion of Ukraine, and boom in manufacturing investment, particularly in clean-energy technologies.

In electoral terms, however, Biden’s legacy is more clouded. His 2020 victory appeared to revive the coalition of college-educated whites, growing minority populations, young people, and just enough working-class white voters that had allowed Bill Clinton and Barack Obama to win the White House in four of the six elections from 1992 through 2012. (In a fifth race over that span, Al Gore won the popular vote even though he lost the Electoral College.) But the public discontent with Biden frayed almost every strand of that coalition.

Biden made rebuilding his party’s support among working-class voters a priority and, in fact, delivered huge gains in manufacturing and construction jobs that were tied to the big three bills he passed (on clean energy, infrastructure, and semiconductors). But public anger at the rising cost of living contributed to Biden’s job-approval rating falling below 50 percent in the late summer of 2021 (around the time of the chaotic Afghanistan withdrawal), and it never climbed back to that crucial threshold. On Election Day, public disappointment with Biden’s overall record helped Trump maintain a crushing lead over Harris among white voters without a college degree, as well as make unprecedented inroads among nonwhite voters without a college degree, especially Latinos.