The Tesla Revolt

The Atlantic

www.theatlantic.com/technology/archive/2025/02/tesla-elon-doge/681666

Donald Trump may be pleased enough with Elon Musk, but even as the Tesla CEO is exercising his newfound power to essentially undo whole functions of the federal government, he still has to reassure his investors. Lately, Musk has delivered for them in one way: The value of the company’s shares has skyrocketed since Trump was reelected to the presidency of the United States. But Musk had much to answer for on his recent fourth-quarter earnings call—not least that in 2024, Tesla’s car sales had sunk for the first time in a decade. Profits were down sharply too. Usually, when this happens at a car company, the CEO issues a mea culpa, vows to cut costs, and hypes vehicles coming to market soon.

Instead, Musk beamed about robotics, artificial intelligence, and Tesla’s path to being “worth more than the next top five companies combined.” This is the vision he has been selling investors for years: Making cars—a volatile, hypercompetitive business with infamously low profit margins—was only the start for Tesla. Its future business will be making fleets of self-driving taxis and humanoid robots trained for thankless manual labor. Whether his vision has any connection to reality is hotly debated by many AI and robotics experts, but most Wall Street analysts put their faith in Musk. And he has, at times, delivered on wildly ambitious goals. Shares jumped again after the call. (Tesla did not respond to a request for comment; a DOGE official did not respond to an email seeking comment.)

Musk gets the benefit of the doubt from investors because—despite undelivered promises, half-baked ideas, and forgotten plans—he has made Tesla worth, on paper at least, more than essentially the rest of the auto industry combined. His funders are asked to buy Musk’s picture of the future, and the recent enthusiasm for Tesla stock suggests they believe that his political influence will help him get there.

Musk needs that belief to hold. Tesla’s stock price is the largest source of his enormous wealth and, by extension, his influence; if his plans succeed, that stock is also his clearest shot at achieving trillionaire status. Right now, though, Tesla’s primary business is still selling cars that people drive, and Musk himself may be the biggest reason that faith in Tesla could falter.

For all of Musk’s ire toward the former president, Tesla did very well in the Joe Biden years. The Model Y is the world’s best-selling electric vehicle and its best-selling car, period. The company has comfortably been out of its “money-losing start-up” phase for years. Although the competition among EV makers is heating up, the only individual company close to eating into Tesla’s market share is China’s BYD, which for the first time last year produced more EVs than Tesla did.

Yet that competition can’t entirely account for Tesla’s latest, abysmal numbers. Last year, Tesla sales were down nearly 12 percent in the EV stronghold of California. And in Europe, where Musk is helping supercharge far-right politics, Tesla’s sales were down 63 percent last month in France and 59 percent in Germany. This is happening even as the rest of the worldwide electric market is growing fast; nearly every car company that makes EVs saw sales gains in 2024, some of them huge.

Musk’s activism does seem to be turning off the affluent or middle-income progressive crowd that was traditionally Tesla’s bread and butter. Look no further than how the company’s new, updated Model Y has been received. Musk’s army of fanboys on X was as effusive as ever, but outside the hard-core Tesla bubble, the SUV was met with a flood of Nazi jokes following Musk’s Sieg heil–ish arm gesture at Trump’s inauguration. This type of reaction goes beyond that one car; the Cybertruck has a unique penchant for being the target of vandalism, and people appear to be making a killing selling anti-Musk bumper stickers to disgusted Tesla owners. In covering the auto industry, I can’t go a week without fielding emails from people asking for advice on the best EV alternatives to Tesla—many from longtime Tesla owners who say they’re ready to move on.

In theory, Musk’s rightward turn could help him swap out traditionally liberal buyers for more conservative ones, who tend to be more skeptical of EVs. And EVs will likely become less polarizing along partisan lines over time, as electrification becomes more common among new cars. Right now, however, even the deep-red-coded Cybertruck doesn’t seem to be changing many minds about the concept of battery-powered cars. Take a recent report from the EV Politics Project, a nonprofit group that studies the partisan divide over electric cars. Its study indicates that although Musk himself is now viewed much more favorably by Trump voters and Republicans, he’s not leading some seismic shift in how they view EVs.

Nor is he obviously trying to get MAGA voters to buy the new Model Y. In fact, those who follow the auto industry closely wonder if Musk is still interested in running a car company at all. On that January earnings call, he offered only a boilerplate response about “more affordable” new Teslas coming soon; the word Cybertruck was not uttered once. He is, however, clearly focused on the company’s “unsupervised” robotaxi service. (Imagine Uber, except with Model Ys and Model 3 sedans, and with no humans behind the wheel.) He claims that Tesla will launch the taxis in Austin in June, the first step toward turning the company into the AI powerhouse that Musk thinks will make it so valuable. Right now, his AI ventures are separate, but he’s started mingling them with his ambitions for Tesla. Ultimately, his thinking—which he’s articulated in earnings and public appearances over a number of years—is that his roving network of autonomous vehicles can use their cameras to capture huge quantities of data, and those data can be used to train AI networks.

In the immediate future, Musk wants Tesla robotaxis everywhere, as soon as possible, and sleeping next door to the White House could help advance that part of his vision. Critics have expressed concerns that his newfound influence could also help stymie federal investigations into Tesla, which are probing the crash record of the company’s “Full Self-Driving” technology and its claims about the technology’s safety. And his current political position could help eliminate one of his oldest foes: regulations.

Tesla has long clashed with environmental rules (last year, a California judge ordered Tesla to pay $1.5 million over allegations that it mishandled hazardous waste), labor laws (employees at a Tesla plant have said that the company failed to pay overtime, among other alleged violations), and safety ordinances (the company was recently fined for violating California’s workplace-heat-safety rules at one of its plants). But the greatest roadblock to Musk’s vision of robotaxis everywhere is arguably America’s current patchwork of state-by-state rules and regulations for autonomous vehicles, which may allow self-driving cars in some places but not others. No federal standards currently exist, but creating rules favorable to the industry would speed things up—especially if those rules were tailored to benefit Tesla in particular.

Musk’s approach to Tesla’s future has more than a few problems that the rest of the self-driving-car industry does not face. Tesla relies solely on cameras and AI for its automated-driving systems, rather than lidar and other more advanced sensors used by competitors. Plenty of experts think Tesla’s strategy can never power true autonomous driving. And Tesla is already years behind robotaxi companies operating in many cities right now, most notably Google’s Waymo. Musk’s Tesla faces the rising threat of China’s advancements in AI too: Investors are noticing BYD’s autonomy-focused deal with tech upstart DeepSeek. And they’re noticing where Musk’s attention lies; as he holds court in the Oval Office, Tesla’s stock has begun losing all those postelection gains.

The most daunting problem, though, may be the same problem Musk has always had: people. Even if he does succeed in tearing down the regulatory state for the sake of his own companies, who’s to say anyone will buy what they’re selling? Any power Musk has in the future depends on turning millions of people into Tesla customers. If he can’t do that—or at least keep convincing investors that he’ll be able to—he’s just another guy screaming online.

The Wrong Case for Foreign Aid

The Atlantic

www.theatlantic.com/international/archive/2025/02/foreign-aid-trump-usaid/681652

As Elon Musk and President Donald Trump attempt to unlawfully obliterate USAID, its advocates have focused on the many ways that shutting off foreign aid damages U.S. interests. They argue that it exposes Americans to a greater risk of outbreaks such as Ebola and bird flu, stifles future markets for domestic producers, and cedes the great-power competition to China. These arguments are accurate and important, but they have overtaken a more fundamental—and ultimately more persuasive—reason for the U.S. to invest in foreign aid: It’s essential to America’s identity.

Following World War II, every U.S. president until Trump used his inaugural address to champion foreign aid and invoke the country’s long-held ideals of decency and generosity. They maintained that Americans had a moral duty to help the deprived. Once Trump was elected in 2016, however, U.S. leaders and aid advocates grew reluctant to talk about altruism. President Joe Biden made no mention of the world’s needy in his inaugural address.

I’m as much to blame for this shift as anyone. I served as USAID’s head speechwriter for six years under the past two Democratic administrations. In that role, I prioritized tactical arguments about America’s safety and well-being in order to persuade the shrinking segment of Republicans who were sympathetic to foreign aid. For a time, it worked. During the Biden administration, Congress spared USAID’s budget from the most drastic proposed cuts, and the agency received unprecedented emergency funding to deal with a series of humanitarian disasters, conflicts, and climate catastrophes.

[Read: The cruel attack on USAID]

Today, however, that line of reasoning is failing. Trump, Musk, and their allies are convinced that administering foreign aid weakens America, rather than enriching or securing it. Marco Rubio used to be one of the agency’s biggest supporters; now, as secretary of state, he’s maligning its staff and abetting its demolition.

A more compelling message lies in the fact that Trump and Musk’s foreign-aid freeze could be one of the cruelest acts that a democracy has ever undertaken. In 2011, when Republican members of Congress proposed a 16 percent cut in annual foreign aid, then–USAID Administrator Rajiv Shah conservatively estimated that it would lead to the deaths of 70,000 children. That is more children than died in Hiroshima and Nagasaki. Depending on how thoroughly Trump and Musk are allowed to dismantle USAID, the casualties this time could be worse. (A federal judge has temporarily blocked their plan to put staffers on leave.)

By assaulting the foreign-aid system, Rubio, Musk, and Trump are redefining what it means to be American: small-hearted rather than generous; unexceptional in our selfishness. To respond by arguing that foreign aid simply benefits Americans is to accede to their view, not combat it.

Instead, advocates of foreign aid should appeal to a higher principle: To be American is to care about those in need. The country is already primed for this message. Americans are an exceptionally charitable people, donating more than $500 billion each year. And although polling shows that a narrow majority of Americans want to cut foreign aid in the abstract, they strongly support the specific programs it funds, including disaster relief, food and medicine, women’s education, and promoting democracy.

[Read: Trump’s assault on USAID makes Project 2025 look like child’s play]

That support derives above all from a moral belief. According to a poll by KFF, only 25 percent of respondents cited economic or national-security interests as the most important reason for America to invest in the public health of developing countries. Nearly double—46 percent—said that it’s the right thing to do.

A modern blueprint exists for tapping into Americans’ concern for the world’s poor. During the George W. Bush and Obama administrations, proponents of foreign aid emphasized America’s values ahead of its interests, inspiring communities of faith and galvanizing a nationwide youth movement. Rock stars and celebrities echoed the message, which penetrated pop culture. When an earthquake struck Haiti in 2010, a telethon featuring performances by Beyoncé and Taylor Swift raised $61 million; stars including Leonardo DiCaprio and Julia Roberts staffed the phones. No one mentioned security or prosperity. Empathy was enough.

Today, the political and cultural coalitions that championed foreign aid are severely diminished. The Republicans whom USAID once counted on have gone silent. Few faith leaders or celebrities are calling for foreign aid to resume. No widespread youth movement is demanding that we end poverty now. Proponents, myself included, stopped focusing on inspiring the American people, so it’s no surprise that they are uninspired. But we can motivate them again. We just need to appeal to their hearts as much as their minds.

The Game That Shows We’re Thinking About History All Wrong

The Atlantic

www.theatlantic.com/culture/archive/2025/02/civilization-7-review/681656

This is an era of talking about eras. Donald Trump says we’ve just begun a “golden age.” Pundits—responding to the rise of streaming, AI, climate change, and Trump himself—have announced the dawn of post-literacy, post-humanism, and post-neoliberalism. Even Taylor Swift’s tour name tapped into the au courant way of depicting time: not as a river, but as a chapter book. A recent n+1 essay asked, “What does it mean to live in an era whose only good feelings come from coining names for the era (and its feelings)?”

Oddly enough, the new edition of Civilization, Sid Meier’s beloved video-game franchise, suggests an answer to that question. In the six previous Civ installments released since 1991, players guide a culture—such as the Aztecs, the Americans, or the French—from prehistory to modernity. Tribes wielding spears and scrolls grow into global empires equipped with nukes and blue jeans. But Civilization VII, out this month, makes a radical change by firmly segmenting the experience into—here’s that word—eras. At times, the resulting gameplay mirrors the pervasive mood of our present age-between-ages: tedious, janky, stranded on the way to somewhere else.

In many ways, the game plays like a thoughtful cosmetic update. You select a civilization and a leader, with options that aren’t only the obvious ones (all hail Empress Harriet Tubman!). The world map looks ever so slightly fantastical, with postcard-perfect coastlines and mountains resembling tall sandcastles. Then, in addictive turn after turn, you befriend or conquer neighboring tribes (using sleek new systems for war and diplomacy), discover technologies such as the wheel and bronze-working, and cultivate cities filled with art and industry. The big twist is that all the while, an icon on-screen accumulates percentage points. When it gets somewhere above 70 percent, a so-called crisis erupts: Maybe your citizens rebel; maybe waves of outsiders attack. At 100 percent, the game pauses to announce that the “Antiquity Age” is over. Time isn’t just marching on—your civilization is about to molt, caterpillar-style.

[Read: Easy mode is actually for adults]

In each of the two subsequent ages—Exploration, Modern—players pick a new society to transform into. In my first go, my ancient Romans became the Spanish, who sent galleons to distant lands. Then I founded modern America and got to work laying down a railroad network. Over time, my conquistadors retired, and my pagan temples got demolished to make way for grocery stores. Yet certain attributes persisted. For example, the Roman tradition of efficiently constructing civic works made building the Statue of Liberty easier. As I played, the word civilization came to feel newly expansive. I wasn’t running a country; I was tending to a lineage of peoples who had gone by a few names but shared a past, a homeland, self-interest, and that hazy thing called culture.

In the run-up to the game’s release, Civilization’s developers argued that the eras system is realistic. No nation-state has continuously spanned the thousands of years that a typical Civ game simulates; the closest counterexample might be China, which is playable as three different dynastic forms (plus Mongolia) in this game. Although Civ’s remix of history is always a bit wacky, in my head, I could maintain a plausible-ish narrative to explain why my America’s cities featured millennia-old colonnades (to quote a colleague: Are We Rome?). Each era-ending crisis created a credible kind of drama: In real life, revolutions, reformations, migration, invasion, disasters, and so much else can reshape societies in fundamental ways. The game succeeds at making the case that, as its creators like to say, “history is built in layers.”

Unfortunately, in the most recent version of the game, history also feels overdetermined. Winning in previous Civs meant accomplishing one self-evidently climactic feat—conquering Earth, say, or mastering spaceflight. During the many hours it took to get to that goal, you enjoyed immense freedom to improvise your own path. Civ VII, however, adds a menu of goals for each era. To succeed in the Antiquity Age, for example, you might build seven Wonders of the World; in modernity, you could mass-produce a certain number of factory goods and then form a world bank. The micro objectives lend each era a sense of narrative cohesion—but a limiting and predictable kind, less epic novel than completed checklist. Playing Civilization used to feel like living through an endless dawn of possibility. But this time, you’re not in command of history; history is in command of you, and it’s assigning you busywork.

[Read: What will become of American civilization?]

Making matters worse, the complexity of the eras mechanism seems to have encouraged the game’s designers to simplify other features—or, less charitably, to simply give those features less care. I played on what should have been a challenging level of difficulty—four on a six-point scale—but I still smoked the computer-controlled opponents, who seemed programmed to act meekly and unambitiously. Picking your form of government used to feel like an existential choice, but now despotism and oligarchy are hardly differentiated. Complicated ideas have been reduced to childish mini-games: Achieving cultural hegemony in Civ VI meant fostering soft power through a variety of options—curating art museums, building iconic monuments, shipping rock bands off on global tours—but in Civ VII, it’s mostly a matter of sending explorers to random places to dig up artifacts. Luckily, many of these problems seem fixable, and later downloadable updates may make the game richer and more satisfying.

Still, I worry that the dull anxiety that can creep in over a session of Civ VII results from a deeper flaw: the strictly defined ages. I like that the game wants to honor how societies really can change in sweeping, sudden ways. But in gaming and in life, fixating on an episodic view of time—prophecies of rise and fall, cycles of malaise and renewal—can have a diminishing effect on the present. Civilization VII suggests why the what’s-next anxieties of our times, stuck between mourning yesterday and anticipating tomorrow, can be so draining. Time actually doesn’t move in chunks. At best, eras are an imprecise tool to make sense of the messy past, and at worst, they rob us of our sense of agency. It’s healthiest to buy into the old Civilization fantasy, the dream that’s always propelled humans forward: We’re going to last.

Why the COVID Deniers Won

The Atlantic

www.theatlantic.com/magazine/archive/2025/03/covid-deniers-anti-vax-public-health-politics-polarization/681435

Five years ago, the coronavirus pandemic struck a bitterly divided society.

Americans first diverged over how dangerous the disease was: just a flu (as President Donald Trump repeatedly insisted) or something much deadlier.

Then they disputed public-health measures such as lockdowns and masking; a majority complied while a passionate minority fiercely resisted.

Finally, they split—and have remained split—over the value and safety of COVID‑19 vaccines. Anti-vaccine beliefs started on the fringe, but they spread to the point where Ron DeSantis, the governor of the country’s third-most-populous state, launched a campaign for president on an appeal to anti-vaccine ideology.

Five years later, one side has seemingly triumphed. The winner is not the side that initially prevailed, the side of public safety. The winner is the side that minimized the disease, then rejected public-health measures to prevent its spread, and finally refused the vaccines designed to protect against its worst effects.

[David A. Graham: The noisy minority]

Ahead of COVID’s fifth anniversary, Trump, as president-elect, nominated the country’s most outspoken vaccination opponent to head the Department of Health and Human Services. He chose a proponent of the debunked and discredited vaccines-cause-autism claim to lead the CDC. He named a strident critic of COVID‑vaccine mandates to lead the FDA. For surgeon general, he picked a believer in hydroxychloroquine, the disproven COVID‑19 remedy. His pick for director of the National Institutes of Health had advocated for letting COVID spread unchecked to encourage herd immunity. Despite having fast-tracked the development of the vaccines as president, Trump has himself trafficked in many forms of COVID‑19 denial, and has expressed his own suspicions that childhood vaccination against measles and mumps is a cause of autism.

The ascendancy of the anti-vaxxers may ultimately prove fleeting. But if the forces of science and health are to stage a comeback, it’s important to understand why those forces have gone into eclipse.

From March 2020 to February 2022, about 1 million Americans died of COVID-19. Many of those deaths occurred after vaccines became available. If every adult in the United States had received two doses of a COVID vaccine by early 2022, rather than just the 64 percent of adults who had, nearly 320,000 lives would have been saved.

[From the January/February 2021 issue: Ed Yong on how science beat the virus]

Why did so many Americans resist vaccines? Perhaps the biggest reason was that the pandemic coincided with a presidential-election year, and Trump instantly recognized the crisis as a threat to his chances for reelection. He responded by denying the seriousness of the pandemic, promising that the disease would rapidly disappear on its own, and promoting quack cures.

The COVID‑19 vaccines were developed while Trump was president. They could have been advertised as a Trump achievement. But by the time they became widely available, Trump was out of office. His supporters had already made up their minds to distrust the public-health authorities that promoted the vaccines. Now they had an additional incentive: Any benefit from vaccination would redound to Trump’s successor, Joe Biden. Vaccine rejection became a badge of group loyalty, one that ultimately cost many lives.

A summer 2023 study by Yale researchers of voters in Florida and Ohio found that during the early phase of the pandemic, self-identified Republicans died at only a slightly higher rate than self-identified Democrats in the same age range. But once vaccines were introduced, Republicans became much more likely to die than Democrats. In the spring of 2021, the excess-death rate among Florida and Ohio Republicans was 43 percent higher than among Florida and Ohio Democrats in the same age range. By the late winter of 2023, the 300-odd most pro-Trump counties in the country had a COVID‑19 death rate more than two and a half times higher than that of the 300 or so most anti-Trump counties.

In 2016, Trump had boasted that he could shoot a man on Fifth Avenue and not lose any votes. In 2021 and 2022, his most fervent supporters risked death to prove their loyalty to Trump and his cause.

Why did political fidelity express itself in such self-harming ways?

The onset of the pandemic was an unusually confusing and disorienting event. Some people who got COVID died. Others lived. Some suffered only mild symptoms. Others spent weeks on ventilators, or emerged with long COVID and never fully recovered. Some lost businesses built over a lifetime. Others refinanced their homes with 2 percent interest rates and banked the savings.

We live in an impersonal universe, indifferent to our hopes and wishes, subject to extreme randomness. We don’t like this at all. We crave satisfying explanations. We want to believe that somebody is in control, even if it’s somebody we don’t like. At least that way, we can blame bad events on bad people. This is the eternal appeal of conspiracy theories. How did this happen? Somebody must have done it—but who? And why?

Compounding the disorientation, the coronavirus outbreak was a rapidly changing story. The scientists who researched COVID‑19 knew more in April 2020 than they did in February; more in August than in April; more in 2021 than in 2020; more in 2022 than in 2021. The official advice kept changing: Stay inside—no, go outside. Wash your hands—no, mask your face. Some Americans appreciated and accepted that knowledge improves over time, that more will be known about a new disease in month two than in month one. But not all Americans saw the world that way. They mistrusted the idea of knowledge as a developing process. Such Americans wondered: Were they lying before? Or are they lying now?

In a different era, Americans might have deferred more to medical authority. The internet has upended old ideas of what should count as authority and who possesses it.

The pandemic reduced normal human interactions. Severed from one another, Americans deepened their parasocial attachment to social-media platforms, which foment alienation and rage. Hundreds of thousands of people plunged into an alternate mental universe during COVID‑19 lockdowns. When their doors reopened, the mania did not recede. Conspiracies and mistrust of the establishment—never strangers to the American mind—had been nourished, and they grew.

The experts themselves contributed to this loss of trust.

It’s now agreed that we had little to fear from going outside in dispersed groups. But that was not the state of knowledge in the spring of 2020. At the time, medical experts insisted that any kind of mass outdoor event must be sacrificed to the imperatives of the emergency. In mid-March 2020, state and local authorities shut down some of Florida’s beaches. In California, surfers faced heavy fines for venturing into the ocean. Even the COVID‑skeptical Trump White House reluctantly canceled the April 2020 Easter-egg roll.

And then the experts abruptly reversed themselves. When George Floyd was choked to death by a Minneapolis police officer on May 25, 2020, hundreds of thousands of Americans left their homes to protest, defying three months of urgings to avoid large gatherings of all kinds, outdoor as well as indoor.

On May 29, the American Public Health Association issued a statement that proclaimed racism a public-health crisis while conspicuously refusing to condemn the sudden defiance of public-safety rules.

The next few weeks saw the largest mass protests in recent U.S. history. Approximately 15 million to 26 million people attended outdoor Black Lives Matter events in June 2020, according to a series of reputable polls. Few, if any, scientists or doctors scolded the attendees—and many politicians joined the protests, including future Vice President Kamala Harris. It all raised a suspicion: Maybe the authorities were making the rules based on politics, not science.

The politicization of health advice became even more consequential as the summer of 2020 ended. Most American public schools had closed in March. “At their peak,” Education Week reported, “the closures affected at least 55.1 million students in 124,000 U.S. public and private schools.” By September, it was already apparent that COVID‑19 posed relatively little risk to children and teenagers, and that remote learning did not work. At the same time, returning to the classroom before vaccines were available could pose some risk to teachers’ health—and possibly also to the health of the adults to whom the children returned after school.

[David Frum: I moved to Canada during the pandemic]

How to balance these concerns given the imperfect information? Liberal states decided in favor of the teachers. In California, the majority of students did not return to in-person learning until the fall of 2021. New Jersey kept many of its public schools closed until then as well. Similar things happened in many other states: Illinois, Maryland, New York, and so on, through the states that voted Democratic in November 2020.

Florida, by contrast, reopened most schools in the fall of 2020. Texas soon followed, as did most other Republican-governed states. The COVID risk for students, it turned out, was minimal: According to a 2021 CDC study, less than 1 percent of Florida students contracted COVID-19 in school settings from August to December 2020 after their state restarted in-person learning. Over the 2020–21 school year, students in states that voted for Trump in the 2020 election got an average of almost twice as much in-person instruction as students in states that voted for Biden.

Any risks to teachers and school staff could have been mitigated by the universal vaccination of those groups. But deep into the fall of 2021, thousands of blue-state teachers and staff resisted vaccine mandates—including more than 5,000 in Chicago alone. By then, another school year had been interrupted by closures.

By disparaging public-health methods and discrediting vaccines, the COVID‑19 minimizers cost hundreds of thousands of people their lives. By keeping schools closed longer than absolutely necessary, the COVID maximizers hazarded the futures of young Americans.

Students from poor and troubled families, in particular, will continue to pay the cost of these learning losses for years to come. Even in liberal states, many private schools reopened for in-person instruction in the fall of 2020. The affluent and the connected could buy their children a continuing education unavailable to those who depended on public schools. Many lower-income students did not return to the classroom: Throughout the 2022–23 school year, poorer school districts reported much higher absenteeism rates than were seen before the pandemic.

Teens absent from school typically get into trouble in ways that are even more damaging than the loss of math or reading skills. New York City arrested 25 percent more minors for serious crimes in 2024 than in 2018. The national trend was similar, if less stark. The FBI reports that although crime in general declined in 2023 compared with 2022, crimes by minors rose by nearly 10 percent.

People who finish schooling during a recession tend to do worse even into middle age than those who finish in times of prosperity. They are less likely to marry, less likely to have children, and more likely to die early. The disparity between those who finish in lucky years and those who finish in unlucky years is greatest for people with the least formal education.

Will the harms of COVID prove equally enduring? We won’t know for some time. But if past experience holds, the COVID‑19 years will mark their most vulnerable victims for decades.

The story of COVID can be told as one of shocks and disturbances that wrecked two presidencies. In 2020 and 2024, incumbent administrations lost elections back-to-back, something that hadn’t happened since the deep economic depression of the late 1880s and early 1890s. The pandemic caused a recession as steep as any in U.S. history. The aftermath saw the worst inflation in half a century.

In the three years from January 2020 through December 2022, Trump and Biden both signed a series of major bills to revive and rebuild the U.S. economy. Altogether, those measures helped swell the gross public debt, which has grown from about $20 trillion in January 2017 to nearly $36 trillion today. The weight of that debt helped drive interest rates and mortgage rates higher. The burden of the pandemic debt, like learning losses, is likely to be with us for quite a long time.

Yet even while acknowledging all that went wrong, respecting all the lives lost or ruined, reckoning with all the lasting harms of the crisis, we do a dangerous injustice if we remember the story of COVID solely as a story of American failure. In truth, the story is one of strength and resilience.

Scientists did deliver vaccines to prevent the disease and treatments to recover from it. Economic policy did avert a global depression and did rapidly restore economic growth. Government assistance kept households afloat when the world shut down—and new remote-work practices enabled new patterns of freedom and happiness after the pandemic ended.

The virus was first detected in December 2019. Its genome was sequenced within days by scientists collaborating across international borders. Clinical trials for the Pfizer-BioNTech vaccine began in April 2020, and the vaccine was authorized for emergency use by the FDA in December. Additional vaccines rapidly followed, and were universally available by the spring of 2021. The weekly death toll fell by more than 90 percent from January 2021 to midsummer of that year.

The U.S. economy roared back with a strength and power that stunned the world. The initial spike of inflation has subsided. Wages are again rising faster than prices. Growth in the United States in 2023 and 2024 was faster and broader than in any peer economy.

Even more startling, the U.S. recovery outpaced China’s. That nation’s bounceback from COVID‑19 has been slow and faltering. America’s economic lead over China, once thought to be narrowing, has suddenly widened; the gap between the two countries’ GDPs grew from $5 trillion in 2021 to nearly $10 trillion in 2023. The U.S. share of world economic output is now slightly higher than it was in 1980, before China began any of its economic reforms. As he did in 2016, Trump inherits a strong and healthy economy, to which his own reckless policies—notably, his trade protectionism—are the only visible threat.

In public affairs, our bias is usually to pay most attention to disappointments and mistakes. In the pandemic, there were many errors: the partisan dogma of the COVID minimizers; the capitulation of states and municipalities to favored interest groups; the hypochondria and neuroticism of some COVID maximizers. Errors need to be studied and the lessons heeded if we are to do better next time. But if we fail to acknowledge America’s successes—even partial and imperfect successes—we not only do an injustice to the American people. We also defeat in advance their confidence to collectively meet the crises of tomorrow.

Perhaps it’s time for some national self-forgiveness here. Perhaps it’s time to accept that despite all that went wrong, despite how much there was to learn about the disease and how little time there was to learn it, and despite polarized politics and an unruly national character—despite all of that—Americans collectively met the COVID‑19 emergency about as well as could reasonably have been hoped.

The wrong people have profited from the immediate aftermath. But if we remember the pandemic accurately, the future will belong to those who rose to the crisis when their country needed them.

This article appears in the March 2025 print edition with the headline “Why the COVID Deniers Won.”