Philadelphia

America Wouldn’t Know the Worst of a Vaccine Decline Until It’s Too Late

The Atlantic

www.theatlantic.com/health/archive/2025/01/rfk-jr-vaccine-decline/681489

Becoming a public-health expert means learning how to envision humanity’s worst-case scenarios for infectious disease. For decades, though, no one in the U.S. has had to consider the full danger of some of history’s most devastating pathogens. Widespread vaccination has eliminated several diseases—among them, measles, polio, and rubella—from the country, and kept more than a dozen others under control. But in the past few years, as childhood-vaccination rates have dipped nationwide, some of infectious disease’s ugliest hypotheticals have started to seem once again plausible.

The new Trump administration has only made the outlook more tenuous. Should Robert F. Kennedy Jr., one of the nation’s most prominent anti-vaccine activists, be confirmed as the next secretary of Health and Human Services, for instance, his actions could make a future in which diseases resurge in America that much more likely. His new position would grant him substantial power over the FDA and the CDC, and he is reportedly weighing plans—including one to axe a key vaccine advisory committee—that could prompt health-care providers to offer fewer shots to kids, and inspire states to repeal mandates for immunizations in schools. (Kennedy’s press team did not respond to a request for comment.)

Kennedy’s goal, as he has said, is to offer people more choice, and many Americans likely would still enthusiastically seek out vaccines. Most Americans support childhood vaccination and vaccine requirements for schools; a KFF poll released today found, though, that even in the past year the proportion of parents who say they skipped or delayed shots for their children has risen, to one in six. The more individuals who choose to eschew vaccination, the closer those decisions would bring society’s collective defenses to cracking. The most visceral effects might not be obvious right away. For some viruses and bacteria to break through, the country’s immunization rates may need to slip quite a bit. But for others, the gap between no outbreak and outbreak is uncomfortably small. The dozen experts I spoke with for this story were confident in their pessimism about how rapidly epidemics might begin.

[Read: How America’s fire wall against disease starts to fail]

Paul Offit, a pediatrician at Children’s Hospital of Philadelphia and co-inventor of one of the two rotavirus vaccines available in the U.S., needs only to look at his own family to see the potential consequences. His parents were born into the era of the deadly airway disease diphtheria; he himself had measles, mumps, rubella, and chickenpox, and risked contracting polio. Vaccination meant that his own kids didn’t have to deal with any of these diseases. But were immunization rates to fall too far, his children’s children very well could. Unlike past outbreaks, those future epidemics would sweep across a country that, having been free of these diseases for so long, is no longer equipped to fight them.

“Yeah,” Offit said when I asked him to paint a portrait of a less vaccinated United States. “Let’s go into the abyss.”

Should vaccination rates drop across the board, one of the first diseases to be resurrected would almost certainly be measles. Experts widely regard the viral illness, which spreads through the air, as the most infectious known pathogen. Before the measles vaccine became available in 1963, the virus struck an estimated 3 million to 4 million Americans each year, about 1,000 of whom would suffer serious swelling of the brain and roughly 400 to 500 of whom would die. Many survivors had permanent brain damage. Measles can also suppress the immune system for years, leaving people susceptible to other infections.

Vaccination was key to ridding the U.S. of measles, declared eliminated here in 2000. And very high rates of immunity—about 95 percent vaccine coverage, experts estimate—are necessary to keep the virus out. “Just a slight dip in that is enough to start spurring outbreaks,” Boghuma Kabisen Titanji, an infectious-disease physician at Emory University, told me. Which has been exactly the case. Measles outbreaks do still occur in American communities where vaccination rates are particularly low, and as more kids have missed their MMR shots in recent years, the virus has found those openings. The 16 measles outbreaks documented in the U.S. in 2024 made last year one of the country’s worst for measles since the turn of the millennium.

But for all measles’ speed, “I would place a bet on whooping cough being first,” Samuel Scarpino, an infectious-disease modeler at Northeastern University, told me. The bacterial disease can trigger months of coughing fits violent enough to fracture ribs. Its severest consequences include pneumonia, convulsions, and brain damage. Although slower to transmit than measles, it has never been eliminated from the U.S., so it’s poised for rampant spread. Chickenpox poses a similar problem. Although corralled by an effective vaccine in the 1990s, the highly contagious virus still percolates at low levels through the country. Plenty of today’s parents might still remember the itchy blisters it causes as a rite of passage, but the disease’s rarer complications can be as serious as sepsis, uncontrolled bleeding, and bacterial infections known as “flesh-eating disease.” And the disease is much more serious in older adults.

Those are only some of the diseases the U.S. could have to deal with. Kids who get all of the vaccines routinely recommended in childhood are protected against 16 diseases—each of which would have some probability of making a substantial comeback, should uptake keep faltering. Perhaps rubella would return, infecting pregnant women, whose children could be born blind or with heart defects. Maybe meningococcal disease, pneumococcal disease, or Haemophilus influenzae disease, each caused by bacteria commonly found in the airway, would skyrocket, and with them rates of meningitis and pneumonia. The typical ailments of childhood—day-care colds, strep throat, winter norovirus waves—would be joined by less familiar and often far more terrifying problems: the painful, swollen necks of mumps; the parching diarrhea of rotavirus; the convulsions of tetanus. For far too many of these illnesses, “the only protection we have,” Stanley Plotkin, a vaccine expert and one of the developers of the rubella vaccine, told me, “is a vaccine.”

Exactly how and when outbreaks of these various diseases could play out—if they do at all—is impossible to predict. Vaccination rates likely wouldn’t fall uniformly across geographies and demographics. They also wouldn’t decrease linearly, or even quickly. People might more readily refuse vaccines that were developed more recently and have been politicized (think HPV or COVID shots). And existing immunity could, for a time, still buffer against an infectious deluge, especially from pathogens that remain quite rare globally. Polio, for instance, would be harder than measles to reestablish in the United States: It was declared eliminated from the Americas in the 1990s, and remains endemic to only two countries. This could lead to a false impression that declining vaccination rates have little impact.

A drop in vaccination rates, after all, doesn’t guarantee an outbreak—a pathogen must first find a vulnerable population. This type of chance meeting could take years. Then again, infiltrations might not take long in a world interconnected by travel. The population of this country is also more susceptible to disease than it has been in past decades. Americans are, on average, older; obesity rates are at a historical high. The advent of organ transplants and cancer treatments has meant that a substantial sector of the population is immunocompromised; many other Americans are chronically ill. Some of these individuals don’t mount protective responses to vaccinations at all, which leaves them reliant on immunity in others to keep dangerous diseases at bay.

If various viruses and bacteria began to recirculate in earnest, the chance of falling ill would increase even for healthy, vaccinated adults. Vaccines don’t offer comprehensive or permanent protection, and the more pathogen around, the greater its chance of breaking through any one person’s defenses. Immunity against mumps and whooping cough is incomplete, and known to wane in the years after vaccination. And although immunity generated by the measles vaccine is generally thought to be quite durable, experts can’t say for certain how durable, Bill Hanage, an infectious-disease epidemiologist at Harvard’s School of Public Health, told me: The only true measure would be to watch the virus tear through a population that hasn’t dealt with it in decades.

Perhaps the most unsettling feature of a less vaccinated future, though, is how unprepared the U.S. is to confront a resurgence of pathogens. Most health-care providers in the country no longer have the practical knowledge to diagnose and treat diseases such as measles and polio, Kathryn Edwards, a pediatrician at Vanderbilt University, told me: They haven’t needed it. Many pediatricians have never even seen chickenpox outside of a textbook.

To catch up, health-care providers would need to familiarize themselves with signs and symptoms they may have seen only in old textbooks or in photographs. Hospitals would need to use diagnostic tests that haven’t been routine in years. Some of those tools might be woefully out of date, because pathogens have evolved; antibiotic resistance could also make certain bacterial infections more difficult to expunge than in decades prior. And some protocols may feel counterintuitive, Offit said: The ultra-contagiousness of measles could warrant kids with milder cases being kept out of health-care settings, and kids with Haemophilus influenzae might need to be transported to the hospital without an ambulance, to minimize the chances that the stress and cacophony would trigger a potentially lethal spasm.

[Read: Here’s how we know RFK Jr. is wrong about vaccines]

The learning curve would be steep, Titanji said, stymieing care for the sick. The pediatric workforce, already shrinking, might struggle to meet the onslaught, leaving kids—the most likely victims of future outbreaks—particularly susceptible, Sallie Permar, the chief pediatrician at NewYork–Presbyterian/Weill Cornell Medical Center, told me. If already overstretched health-care workers were further burdened, they’d be more likely to miss infections early on, making those cases more difficult to treat. And if epidemiologists had to keep tabs on more pathogens, they’d have less capacity to track any single infectious disease, making it easier for one to silently spread.

The larger outbreaks grow, the more difficult they are to contain. Eventually, measles could once again become endemic in the U.S. Polio could soon follow suit, imperiling the fight to eradicate the disease globally, Virginia Pitzer, an infectious-disease epidemiologist at Yale, told me. In a dire scenario—the deepest depths of the abyss—average lifespans in the U.S. could decline, as older people more often fall sick, and more children under 5 die. Rebottling many of these diseases would be a monumental task. Measles was brought to heel in the U.S. only by decades of near-comprehensive vaccination; re-eliminating it from the country would require the same. But the job this time would be different, and arguably harder—not merely coaxing people into accepting a new vaccine, but persuading them to take one that they’ve opted out of.

That future is by no means guaranteed—especially if Americans recall what is at stake. Many people in this country are too young to remember the cost these diseases exacted. But Edwards, who has been a pediatrician for 50 years, is not. As a young girl, she watched a childhood acquaintance be disabled by polio. She still vividly recalls patients she lost to meningitis decades ago. The later stages of her career have involved fewer spinal taps, fewer amputations. Because of vaccines, the job of caring for children, nowadays, simply involves far less death.

Whole Foods workers make history by voting to unionize

Quartz

qz.com/whole-foods-workers-union-amazon-philadelphia-1851749148

This story incorporates reporting from The HR Digest, New York Post, The New York Times and Bicycle Retailer And Industry News.

Workers at a major Whole Foods store in Philadelphia have voted in favor of unionizing. This victory makes them the first within the Amazon-owned grocery chain to successfully form a union.…

Barkley stuns Commanders with 60-yard touchdown

BBC News

www.bbc.com/sport/american-football/videos/c334r7p7k50o

Philadelphia Eagles running back Saquon Barkley stuns the Washington Commanders with a touchdown on his first drive as the Eagles advance to the Super Bowl for the second time in three years.

David Lynch Was America’s Cinematic Poet

The Atlantic

www.theatlantic.com/culture/archive/2025/01/david-lynch-death-career/681347

David Lynch died yesterday at the age of 78, after a career that made him perhaps the most consequential American art filmmaker in the history of the medium. But his singular voice extended far beyond cinema, into television, music, internet fame, coffee making, furniture design, transcendental meditation, and practically any other creative endeavor you can imagine. He was a brand, though a fiercely independent one: Beginning with his debut movie, Eraserhead, in 1977, Lynch became the rare kind of artist whose last name seemed to describe an entire genre. He established a style that offered an otherworldly reckoning with our way of life, incorporating classic Hollywood storytelling, pulpy romanticism, and abstract surrealism all at once.

Lynch’s canon was so tremendous that each of his many fans and acolytes likely had different entry points into it. There was the aggressive midnight-screening oddness of Eraserhead in the 1970s; the frightening mix of throwback folksiness and depraved sexuality in Blue Velvet in the 1980s; and the bizarre-but-incredible TV phenomenon that was Twin Peaks in the early 1990s. Others found him through 2001’s Mulholland Drive, a staggering collision of Hollywood dreamscapes, or 2017’s inimitable Twin Peaks: The Return, which exploded the form of “prestige television” that its predecessor had helped plant the seeds for. These are just a few of Lynch’s achievements in a body of work that spanned big-budget and micro-budget, highbrow and low. His output was also defined by his personal celebrity—a folksy, chain-smoking former Eagle Scout who produced art of high complexity while also rhapsodizing about the simple pleasures of eating a donut with a cup of coffee.

The first Lynch film I saw in a theater was Mulholland Drive, at the age of 15. A budding cinephile, I was only somewhat aware of the director’s titanic reputation and of the movie’s circuitous journey to the screen. (It was initially intended as a television pilot, a Twin Peaks successor that ABC ultimately rejected.) Mulholland Drive was an artistic thunderbolt like no other for me, and watching it for the first time is still probably the most transformative experience I’ve ever had in a cinema. I can palpably recall my terror during the early sequence at Winkie’s Diner, in which two men discuss a dream one had involving some ineffable monster out back, and the transfixing mystery of Club Silencio, one of Lynch’s many on-screen environments that seemed to have a foot in multiple realities. The film was at certain times a chilling representation of fear, trauma, and death, but at others hauntingly lovely and funny. It opened my eyes to what movies could be, beyond just the entertaining product they usually were.

[Read: How Twin Peaks invented modern television]

Mulholland Drive resisted easy explanation, as did all of the director’s stories. But, boiled down, many had a sweet purity to them, involving battles of good and evil and harsh realities endured by pure spirits. The director had a charmed and normal childhood, by all accounts; he was born in Montana but moved all over the country as a kid, living in Washington, North Carolina, Idaho, and Virginia at various points. Still, he would later recall moments that punctured that idyll. “When I was little, my brother and I were outdoors late one night, and we saw a naked woman come walking down the street toward us in a dazed state, crying. I have never forgotten that moment,” he once told Roger Ebert, evoking an image that would serve as Blue Velvet’s centerpiece many years after the fact.

More adult life events inspired his first feature, however. A quiet, eccentric, ink-black comedy about a peculiar young man who works at a factory in an industrial dystopia, Eraserhead is plainly Lynch’s way of processing his life as an early parent in Philadelphia. Its protagonist struggles to raise a mutant creature while also dealing with nattering in-laws and a mundane job. Most theatergoers were likely to find the film off-putting—what with its clanking, abrasive soundtrack, beautifully cloying interludes of simple songs, and unabashedly nonnarrative strangeness. Eraserhead could have died in obscurity, but it became a cult-movie sensation instead, the kind that circulates among artsy gatherings, comic-book shops, and other underground scenes, as much of Lynch’s filmography now does.

The veteran comedian and filmmaker Mel Brooks saw the movie and, somehow, it resonated with him. He then hired Lynch—over far more objectively qualified, well-known names—to direct a project that Brooks had been nurturing, The Elephant Man. It was a critical smash that landed several Oscar nominations, and Lynch’s industry ascension seemed set. His follow-up was the sci-fi epic Dune, an adaptation of the blockbuster Frank Herbert novel, for which Lynch claimed he had passed on Return of the Jedi. But it was an artistically compromised box-office failure; the director never made a big-budget film again. He instead found greater success once he’d swerved back to his more personal fascinations: His next film was the alternately astonishing and repellant Blue Velvet, a nasty noir fairytale of gangsters and abuse in a picture-perfect suburban town.

[Read: David Lynch's unfathomable masterpiece]

Lynch took many, many creative risks over the years, but Blue Velvet is the movie that perhaps best melded grim violence and white-picket-fence cheerfulness—a vision that came to characterize him in the public eye. The director continued to dig beneath idealism’s rot for the remainder of his career, and the 1990 premiere of Twin Peaks brought his worldview to a broader swath of viewers. Co-created by the writer Mark Frost, the ABC show was an uncanny soap opera, powered by a murder mystery that briefly captured the country’s imagination. Twin Peaks ran out of ratings steam quickly over the course of its initial, two-season run, but it’s since emerged as Lynch’s quintessential work. The series’ legacy was powered by both its empathy—the stark and sincere emotion the director could deploy so beautifully—and the way it transformed between various media over time. Twin Peaks evolved into a larger, decades-spanning project, encompassing the aggressively tragic and beautiful prequel film, Fire Walk With Me, in 1992, and the confounding, hilarious, and formally defiant sequel show, The Return, which premiered 25 years later.

In his later life, Lynch charged into the digital frontier in his typically singular fashion. He used grainy digital video cameras to shoot the bizarre California epic Inland Empire mostly on his own dime; he uploaded original, offbeat episodic projects and crudely animated cartoons exclusively for subscribers to his website. The director was an excellent marketer of himself, despite his preference for alienating themes and aesthetic choices: His trademark non-sequitur-filled humor and rambling sincerity connected both him and his oeuvre to generation after generation. Lynch, more than many of his peers, could expose audiences to the harshest, most discomforting imagery while also balefully commanding them to “fix their hearts or die.” If the American experience had a cinematic poet, it was him. The news that Lynch had left us was shocking only because it seemed that he’d be here with us forever.

Biden’s Tarnished Legacy

The Atlantic

www.theatlantic.com/newsletters/archive/2025/01/bidens-tarnished-legacy/681267

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

President Joe Biden still imagines that he could have won. Asked by USA Today’s Susan Page whether he could have beaten Donald Trump if he had stayed in the race, Biden responded: “It’s presumptuous to say that, but I think yes.”

Reality thinks not.

Of course, we’ll never know for sure, but the evidence (including polling) suggests that he would have been crushed by an even larger margin than Kamala Harris was. Biden’s answer is a reminder that his legacy will be tarnished by his fundamental misreading of the moment and his own role in it.

To be sure, Biden can point to some impressive successes. He leaves behind a healthy and growing economy, a record of legislative accomplishment, and more than 230 judicial appointments, including a Supreme Court justice. And then there were the failures: the chaotic exit from Afghanistan; a massive surge of migrants at the border in 2023. Although Biden was not solely to blame for inflation—factors included the Federal Reserve’s low-interest-rate policy and Russia’s invasion of Ukraine—his spending policies contributed to the problem. And even though he rallied Europe to the defense of Ukraine, critics suggest that he also misread that moment—Phillips Payson O’Brien argued in The Atlantic in November that the Biden administration “treated the conflict like a crisis to be managed, not a war to be won.” Ukraine’s uncertain fate is now left to Biden’s successor.

A charismatic and energetic president might have been able to overcome these failures and win reelection. Some presidents seize the public’s imagination; Biden barely even got its attention. He presumed that he could return to a Before Times style of politics, where the president was a backroom bipartisan dealmaker. Whereas Trump dominated the news, Biden seemed to fade into the background almost from the beginning, seldom using his bully pulpit to rally public support or explain his vision for the country. Trump was always in our faces, but it often felt like Biden was … elsewhere.

Biden also misread the trajectory of Trumpism. Like so many others, he thought that the problem of Trump had taken care of itself and that his election meant a return to normalcy. So he chose as his attorney general Merrick Garland, who seems to have seen his role as restoring the Department of Justice rather than pursuing accountability for the man who’d tried to overturn the election. Eventually, Garland turned the cases over to Special Counsel Jack Smith, who brought indictments. But it was too late. With time running out and a Supreme Court ruling in favor of broad presidential immunity, Trump emerged unscathed. And then came the sad final chapter of Biden’s presidency, which may well overshadow everything else.

When he ran for president in 2020, Biden described himself as a “transition candidate” and a “bridge” to a new generation of leaders. But instead of stepping aside for those younger leaders, Biden chose to seek another term, despite the growing evidence of his decline. With the future of democracy at stake, Biden’s inner circle appeared to shield the octogenarian president. His team didn’t just insist that voters ignore what was in front of their eyes; it also maintained that the aging president could serve out another four-year term. Some Democrats clung to denial—and shouted down internal critics—until Biden’s disastrous debate performance put an end to the charade.

Even then, Biden stubbornly tried to hang on, before intense pressure from his own party forced him to drop out of the race in July. Now he is shuffling to the end of his presidency, already shunted aside by his successor and still in denial.

As the passing of Jimmy Carter reminds us, presidential legacies are complicated matters, and it is difficult to predict the verdict of history. But as Biden leaves office, he is less a transformational figure than a historical parenthesis. He failed to grasp both the political moment and the essential mission of his presidency.

Other presidents have misunderstood their mandate. But in Biden’s case, the consequences were existential: By his own logic, the Prime Directive of his presidency was to preserve democracy by preventing Donald Trump’s return to power. His failure to do so will likely be the lasting legacy of his four years in office.

Related:

Biden’s unpardonable hypocrisy
How Biden made a mess of Ukraine

Here are three new stories from The Atlantic:

The army of God comes out of the shadows.
“The Palisades Fire is destroying places that I’ve loved.”
Why “late regime” presidencies fail

Today’s News

Former President Jimmy Carter’s state funeral took place in Washington, D.C. Carter’s casket was flown to Georgia afterward; he will be buried in his hometown of Plains.
At least five people are dead in the wildfires that have spread across parts of the Los Angeles area. More than 2,000 structures have been damaged or destroyed.
New York’s highest court denied Donald Trump’s request to halt the sentencing hearing in his criminal hush-money case.

Dispatches

Time-Travel Thursdays: Early-career poetry often poses a tantalizing question: How did this poet start off so terrible—and end up so good? But a writer’s final works are compelling for a different reason, Walt Hunter writes.

Explore all of our newsletters here.

Evening Read

Illustration by Jan Buchczik

You’re Going to Die. That’s a Good Thing.

By Arthur C. Brooks

Death is inevitable, of course; the most ordinary aspect of life is that it ends. And yet, the prospect of that ending feels so foreign and frightening to us. The American anthropologist Ernest Becker explored this strangeness in his 1973 book, The Denial of Death, which led to the development by other scholars of “terror management theory.” This theory argues that we fill our lives with pastimes and distractions precisely to avoid dealing with death …

If we could resolve this dissonance and accept reality, wouldn’t life be better? The answer is most definitely yes.

Read the full article.

More From The Atlantic

When the flames come for you
Trump is poised to turn the DOJ into his personal law firm.
The Solzhenitsyn test
Public health can’t stop making the same nutrition mistake.
A virtual cell is a “holy grail” of science. It’s getting closer.

Culture Break

Gilles Mingasson / Disney

Watch. Abbott Elementary and It’s Always Sunny in Philadelphia don’t have much common ground. That’s why their first crossover episode (available on Hulu) felt so fresh, Hannah Giorgis writes.

Explore. Why do so many people hate winter? Research suggests that there are two kinds of people who tolerate the cold very well, Olga Khazan wrote in 2018.

Play our daily crossword.

Stephanie Bai contributed to this newsletter.

When you buy a book using a link in this newsletter, we receive a commission. Thank you for supporting The Atlantic.

The Payoff of TV’s Most Awaited Crossover

The Atlantic

www.theatlantic.com/culture/archive/2025/01/abbott-elementary-its-always-sunny-in-philadelphia-crossover-review/681249

On Abbott Elementary, celebrity sightings are as common as a back-to-school flu outbreak or drama with the PTA. The show’s Season 2 premiere kicked off with the spunky second-grade teacher Janine Teagues (played by Quinta Brunson) trying to surprise Abbott students with an appearance from “the only celebrity that matters”: Gritty, the internet-famous mascot for the Philadelphia Flyers. In Season 3, Bradley Cooper joined a class for show-and-tell, the Philadelphia Eagles star Jalen Hurts tried to help a teacher’s boyfriend propose, and Questlove DJed a party in the school gym.

As on many a network sitcom, Abbott’s celebrity cameos tend to involve the stars playing themselves, with some embellished biographical details to sweeten their stories. (Questlove, for example, claimed that he and Allen Iverson both credit their illustrious careers to Abbott’s principal, who happens to be one of their closest friends.) Now, midway through its fourth season, Abbott has found a clever way to continue celebrating that hometown pride—and expand the show’s comedic arsenal. The latest episode taps some of Philly’s most well-known fictional personalities, using their outlandish antics to draw out a bit more edge from Abbott’s plucky educators.

In tonight’s episode, the main characters of It’s Always Sunny in Philadelphia saunter into the public school and invigorate the mockumentary by stirring up chaos. Anyone familiar with the long-running FX sitcom about a group of bartenders knows that the Sunny protagonists don’t belong anywhere near an elementary-school campus. Throughout its 16 seasons, the most of any live-action American comedy series, It’s Always Sunny has been a riotous, foul-mouthed chronicle of escalating misbehavior from a gang of total miscreants. The loosely plotted sitcom has followed the Paddy’s Pub slackers through outrageous, ill-conceived schemes that almost always reveal just how craven they are: They’ve smoked crack in an attempt to exploit the welfare system, siphoned gas to sell door-to-door, and outlined some deeply concerning strategies for picking up women.

Suffice it to say, none of them is getting invited to speak at a commencement ceremony or Career Day. By contrast, most of the strangers who’ve popped up at Abbott over the years, whether they’re district bureaucrats or local businesspeople, at least pretend to have altruistic motives. When these visitors cause issues for the school, it’s usually due to incompetence, negligence, or an easily resolved misunderstanding. And of course, there’s generally a moral at the end of the story—the kind of humorous, heartfelt fare that makes Abbott so beloved as family viewing.

[Read: Abbott Elementary lets Black kids be kids]

But things go awry almost immediately after the Sunny squad shows up in “Volunteers,” the first of two planned crossover episodes. The gang arrives at Abbott under the guise of offering the overworked educators some much needed help from the local school district. Instead, Mac (Rob McElhenney), Charlie (Charlie Day), Dennis (Glenn Howerton), Frank (Danny DeVito), and Deandra (Kaitlin Olson) quickly discover that there are documentary cameras rolling at Abbott, prompting the superlatively toxic Dennis to excuse himself because he knows “quite a bit about filming and consent.” The others stick around, acting slightly more buttoned-up than usual because they know they’re being recorded, but they’re still too abrasive to fit in. They admit that they’re there only to satisfy the community-service requirements of a court order, and in response to one teacher calling them criminals, ask whether it’s really a “crime” to dump 100 gallons of baby oil, 500 Paddy’s Pub T-shirts, and a Cybertruck in the Schuylkill River.

These kinds of ludicrous scenarios are par for the course on Sunny, but they strain the boundaries of the malfeasance we usually see from Abbott characters. For the educators, that creates an amusing challenge: The Sunny gang isn’t a pack of wayward teenagers waiting for an understanding mentor to show them the light, and their moral failures can’t be rehabilitated with a pep talk. No earnest, well-articulated argument for the importance of early-childhood education will make characters like these abandon their selfishness, and the unexpected dose of cynicism gives Abbott’s formula an intriguing mid-season shake-up—a nice wrinkle, considering how many network sitcoms begin to feel repetitive the longer they stay on the air.

Take the drama caused by Deandra, or “Sweet Dee.” This episode finds the lone woman in the main Sunny crew initially bonding with Janine while volunteering in her classroom: Dee praises Janine in front of the second graders after the two women realize they both attended the University of Pennsylvania. But their camaraderie takes a hit when Dee starts lusting after Gregory (Tyler James Williams), Janine’s fellow teacher—and, after a lengthy will-they-won’t-they storyline, also her boyfriend. When Janine tells Dee that she’s in a relationship with Gregory, the Sunny transplant is undeterred: “You’re good if I take a spin though, yeah?” It’s the first time Janine’s encountered a real romantic foil on the series, and as the conflict plays out, Dee’s brash flirting style forces Janine to acknowledge her fears about the relationship. These scenes offer Janine, easily the most childlike of the teachers, an opportunity to grow by facing the tension head-on—a feat made easier by her having a farcical villain in Dee.

Abbott will never be the kind of show where the main cast routinely has to fend off mean-spirited romantic sabotage or keep tabs on a man who gives off serious Andrew Tate vibes. After the volunteers slink back to Paddy’s, the most shiftless person on campus will once again be Principal Coleman (Janelle James), whose ineptitude and vanity don’t prevent her from advocating for the students from time to time. Still, the Sunny crossover episode marks a compelling chapter in Abbott’s evolution. The series has stayed family-friendly thanks to its educational setting, showcasing the comic talents of both its students and teachers. But Abbott is now proving itself adept at something different too: comedy with a real bite, even if it’s not in service of teaching a lesson.

The Anti-Social Century

The Atlantic

www.theatlantic.com/magazine/archive/2025/02/american-loneliness-personality-politics/681091

Illustrations by Max Guther

This article was featured in the One Story to Read Today newsletter. Sign up for it here.

The Bar Is Closed

A short drive from my home in North Carolina is a small Mexican restaurant, with several tables and four stools at a bar facing the kitchen. On a sweltering afternoon last summer, I walked in with my wife and daughter. The place was empty. But looking closer, I realized that business was booming. The bar was covered with to-go food: nine large brown bags.

As we ate our meal, I watched half a dozen people enter the restaurant without sitting down to eat. Each one pushed open the door, walked to the counter, picked up a bag from the bar, and left. In the delicate choreography between kitchen and customer, not a word was exchanged. The space once reserved for that most garrulous social encounter, the bar hangout, had been reconfigured into a silent depot for customers to grab food to eat at home.

Until the pandemic, the bar was bustling and popular with regulars. “It’s just a few seats, but it was a pretty happening place,” Rae Mosher, the restaurant’s general manager, told me. “I can’t tell you how sad I’ve been about it,” she went on. “I know it hinders communications between customers and staff to have to-go bags taking up the whole bar. But there’s nowhere else for the food to go.” She put up a sign: BAR SEATING CLOSED.

The sign on the bar is a sign of the times for the restaurant business. In the past few decades, the sector has shifted from tables to takeaway, a process that accelerated through the pandemic and continued even as the health emergency abated. In 2023, 74 percent of all restaurant traffic came from “off premises” customers—that is, from takeout and delivery—up from 61 percent before COVID, according to the National Restaurant Association.

The flip side of less dining out is more eating alone. The share of U.S. adults having dinner or drinks with friends on any given night has declined by more than 30 percent in the past 20 years. “There’s an isolationist dynamic that’s taking place in the restaurant business,” the Washington, D.C., restaurateur Steve Salis told me. “I think people feel uncomfortable in the world today. They’ve decided that their home is their sanctuary. It’s not easy to get them to leave.” Even when Americans eat at restaurants, they are much more likely to do so by themselves. According to data gathered by the online reservations platform OpenTable, solo dining has increased by 29 percent in just the past two years. The No. 1 reason is the need for more “me time.”

The evolution of restaurants is retracing the trajectory of another American industry: Hollywood. In the 1930s, video entertainment existed only in theaters, and the typical American went to the movies several times a month. Film was a necessarily collective experience, something enjoyed with friends and in the company of strangers. But technology has turned film into a home delivery system. Today, the typical American adult buys about three movie tickets a year—and watches almost 19 hours of television, the equivalent of roughly eight movies, on a weekly basis. In entertainment, as in dining, modernity has transformed a ritual of togetherness into an experience of homebound reclusion and even solitude.

The privatization of American leisure is one part of a much bigger story. Americans are spending less time with other people than in any other period for which we have trustworthy data, going back to 1965. Between that year and the end of the 20th century, in-person socializing slowly declined. From 2003 to 2023, it plunged by more than 20 percent, according to the American Time Use Survey, an annual study conducted by the Bureau of Labor Statistics. Among unmarried men and people younger than 25, the decline was more than 35 percent. Alone time predictably spiked during the pandemic. But the trend had started long before most people had ever heard of a novel coronavirus and continued after the pandemic was declared over. According to Enghin Atalay, an economist at the Federal Reserve Bank of Philadelphia, Americans spent even more time alone in 2023 than they did in 2021. (He categorized a person as “alone,” as I will throughout this article, if they are “the only person in the room, even if they are on the phone” or in front of a computer.)

Eroding companionship can be seen in numerous odd and depressing facts of American life today. Men who watch television now spend seven hours in front of the TV for every hour they spend hanging out with somebody outside their home. The typical female pet owner spends more time actively engaged with her pet than she spends in face-to-face contact with friends of her own species. Since the early 2000s, the amount of time that Americans say they spend helping or caring for people outside their nuclear family has declined by more than a third.

[Derek Thompson: Why Americans suddenly stopped hanging out]

Self-imposed solitude might just be the most important social fact of the 21st century in America. Perhaps unsurprisingly, many observers have reduced this phenomenon to the topic of loneliness. In 2023, Vivek Murthy, Joe Biden’s surgeon general, published an 81-page warning about America’s “epidemic of loneliness,” claiming that its negative health effects were on par with those of tobacco use and obesity. A growing number of public-health officials seem to regard loneliness as the developed world’s next critical public-health issue. The United Kingdom now has a minister for loneliness. So does Japan.

But solitude and loneliness are not one and the same. “It is actually a very healthy emotional response to feel some loneliness,” the NYU sociologist Eric Klinenberg told me. “That cue is the thing that pushes you off the couch and into face-to-face interaction.” The real problem here, the nature of America’s social crisis, is that most Americans don’t seem to be reacting to the biological cue to spend more time with other people. Their solitude levels are surging while many measures of loneliness are actually flat or dropping. A 2021 study of the widely used UCLA Loneliness Scale concluded that “the frequently used term ‘loneliness epidemic’ seems exaggerated.” Although young people are lonelier than they once were, there is little evidence that loneliness is rising more broadly today. A 2023 Gallup survey found that the share of Americans who said they experienced loneliness “a lot of the day yesterday” declined by roughly one-third from 2021 to 2023, even as alone time, by Atalay’s calculation, rose slightly.

Day to day, hour to hour, we are choosing this way of life—its comforts, its ready entertainments. But convenience can be a curse. Our habits are creating what Atalay has called a “century of solitude.” This is the anti-social century.

Over the past few months, I’ve spoken with psychologists, political scientists, sociologists, and technologists about America’s anti-social streak. Although the particulars of these conversations differed, a theme emerged: The individual preference for solitude, scaled up across society and exercised repeatedly over time, is rewiring America’s civic and psychic identity. And the consequences are far-reaching—for our happiness, our communities, our politics, and even our understanding of reality.

The End of the Social Century

The first half of the 20th century was extraordinarily social. From 1900 to 1960, church membership surged, as did labor-union participation. Marriage rates reached a record high after World War II, and the birth rate enjoyed a famous “boom.” Associations of all sorts thrived, including book clubs and volunteer groups. The New Deal made America’s branch-library system the envy of the world; communities and developers across the country built theaters, music venues, playgrounds, and all kinds of gathering places.

But in the 1970s, the U.S. entered an era of withdrawal, as the political scientist Robert D. Putnam famously documented in his 2000 book, Bowling Alone. Some institutions of togetherness, such as marriage, eroded slowly. Others fell away swiftly. From 1985 to 1994, active involvement in community organizations fell by nearly half. The decline was astonishingly broad, affecting just about every social activity and every demographic group that Putnam tracked.

What happened in the 1970s? Klinenberg, the sociologist, notes a shift in political priorities: The government dramatically slowed its construction of public spaces. “Places that used to anchor community life, like libraries and school gyms and union halls, have become less accessible or shuttered altogether,” he told me. Putnam points, among other things, to new moral values, such as the embrace of unbridled individualism. But he found that two of the most important factors were by then ubiquitous technologies: the automobile and the television set.

Starting in the second half of the century, Americans used their cars to move farther and farther away from one another, enabling the growth of the suburbs and, with it, a retreat into private backyard patios, private pools, a more private life. Once Americans got out of the car, they planted themselves in front of the television. From 1965 to 1995, the typical adult gained six hours a week in leisure time. They could have devoted that time—300 hours a year!—to community service, or pickup basketball, or reading, or knitting, or all four. Instead, they funneled almost all of this extra time into watching more TV.

Television transformed Americans’ interior decorating, our relationships, and our communities. In 1970, just 6 percent of sixth graders had a TV set in their bedroom; in 1999, that proportion had grown to 77 percent. Time diaries in the 1990s showed that husbands and wives spent almost four times as many hours watching TV together as they spent talking to each other in a given week. People who said TV was their “primary form of entertainment” were less likely to engage in practically every social activity that Putnam counted: volunteering, churchgoing, attending dinner parties, picnicking, giving blood, even sending greeting cards. Like a murder in Clue, the death of social connections in America had any number of suspects. But in the end, I believe the likeliest culprit is obvious. It was Mr. Farnsworth, in the living room, with the tube.

Phonebound

If two of the 20th century’s iconic technologies, the automobile and the television, initiated the rise of American aloneness, the 21st century’s most notorious piece of hardware has continued to fuel, and has indeed accelerated, our national anti-social streak. Countless books, articles, and cable-news segments have warned Americans that smartphones can negatively affect mental health and may be especially harmful to adolescents. But the fretful coverage is, if anything, restrained given how greatly these devices have changed our conscious experience. The typical person is awake for about 900 minutes a day. American kids and teenagers spend, on average, about 270 minutes on weekdays and 380 minutes on weekends gazing into their screens, according to the Digital Parenthood Initiative. By this account, screens occupy more than 30 percent of their waking life.

Some of this screen time is social, after a fashion. But sharing videos or texting friends is a pale imitation of face-to-face interaction. More worrisome than what young people do on their phone is what they aren’t doing. Young people are less likely than in previous decades to get their driver’s license, or to go on a date, or to have more than one close friend, or even to hang out with their friends at all. The share of boys and girls who say they meet up with friends almost daily outside school hours has declined by nearly 50 percent since the early 1990s, with the sharpest downturn occurring in the 2010s.

The decline of hanging out can’t be shrugged off as a benign generational change, something akin to a preference for bell-bottoms over skinny jeans. Human childhood—including adolescence—is a uniquely sensitive period in the whole of the animal kingdom, the psychologist Jonathan Haidt writes in The Anxious Generation. Although the human brain grows to 90 percent of its full size by age 5, its neural circuitry takes a long time to mature. Our lengthy childhood might be evolution’s way of scheduling an extended apprenticeship in social learning through play. The best kind of play is physical, outdoors, with other kids, and unsupervised, allowing children to press the limits of their abilities while figuring out how to manage conflict and tolerate pain. But now young people’s attention is funneled into devices that take them out of their body, denying them the physical-world education they need.

[Read: Jonathan Haidt on the terrible costs of a phone-based childhood]

Teen anxiety and depression are at near-record highs: The latest government survey of high schoolers, conducted in 2023, found that more than half of teen girls said they felt “persistently sad or hopeless.” These data are alarming, but shouldn’t be surprising. Young rats and monkeys deprived of play come away socially and emotionally impaired. It would be odd if we, the self-named “social animal,” were different.

Socially underdeveloped childhood leads, almost inexorably, to socially stunted adulthood. A popular trend on TikTok involves 20‑somethings celebrating in creative ways when a friend cancels plans, often because they’re too tired or anxious to leave the house. These clips can be goofy and even quite funny. Surely, sympathy is due; we all know the feeling of relief when we claw back free time in an overscheduled week. But the sheer number of videos is a bit unsettling. If anybody should feel lonely and desperate for physical-world contact, you’d think it would be 20-somethings, who are still recovering from years of pandemic cabin fever. But many nights, it seems, members of America’s most isolated generation aren’t trying to leave the house at all. They’re turning on their cameras to advertise to the world the joy of not hanging out.

If young adults feel overwhelmed by the emotional costs of physical-world togetherness—and prone to keeping even close friends at a physical distance—that suggests that phones aren’t just rewiring adolescence; they’re upending the psychology of friendship as well.

[From the September 2017 issue: Have smartphones destroyed a generation?]

In the 1960s, Irwin Altman, a psychologist at the Naval Medical Research Institute, in Bethesda, Maryland, co-developed a friendship formula characterized by increasing intimacy. In the early stages of friendship, people engage in small talk by sharing trivial details. As they develop trust, their conversations deepen to include more private information until disclosure becomes habitual and easy. Altman later added an important wrinkle: Friends require boundaries as much as they require closeness. Time alone to recharge is essential for maintaining healthy relationships.

Phones mean that solitude is more crowded than it used to be, and crowds are more solitary. “Bright lines once separated being alone and being in a crowd,” Nicholas Carr, the author of the new book Superbloom: How Technologies of Connection Tear Us Apart, told me. “Boundaries helped us. You could be present with your friends and reflective in your downtime.” Now our social time is haunted by the possibility that something more interesting is happening somewhere else, and our downtime is contaminated by the streams and posts and texts of dozens of friends, colleagues, frenemies, strangers.

[From the July/August 2008 issue: Nicholas Carr on whether Google is making us stupid]

If Carr is right, modern technology’s always-open window to the outside world makes recharging much harder, leaving many people chronically depleted, a walking battery that is always stuck in the red zone. In a healthy world, people who spend lots of time alone would feel that ancient biological cue: I’m alone and sad; I should make some plans. But we live in a sideways world, where easy home entertainment, oversharing online, and stunted social skills spark a strangely popular response: I’m alone, anxious, and exhausted; thank God my plans were canceled.

Homebound

Last year, the Princeton University sociologist Patrick Sharkey was working on a book about how places shape American lives and economic fortunes. He had a feeling that the rise of remote work might have accelerated a longer-term trend: a shift in the amount of time that people spend inside their home. He ran the numbers and discovered “an astounding change” in our daily habits, much more extreme than he would have guessed. In 2022—notably, after the pandemic had abated—adults spent an additional 99 minutes at home on any given day compared with 2003.

This finding formed the basis of a 2024 paper, “Homebound,” in which Sharkey calculated that, compared with 2003, Americans are more likely to take meetings from home, to shop from home, to be entertained at home, to eat at home, and even to worship at home. Practically the entire economy has reoriented itself to allow Americans to stay within their four walls. This phenomenon cannot be reduced to remote work. It is something far more totalizing—something more like “remote life.”

One might ask: Why wouldn’t Americans with means want to spend more time at home? In the past few decades, the typical American home has become bigger, more comfortable, and more entertaining. From 1973 to 2023, the size of the average new single-family house increased by 50 percent, and the share of new single-family houses that have air-conditioning doubled, to 98 percent. Streaming services, video-game consoles, and flatscreen TVs make the living room more diverting than any 20th-century theater or arcade. Yet conveniences can indeed be a curse. By Sharkey’s calculations, activities at home were associated with a “strong reduction” in self-reported happiness.

A homebound life doesn’t have to be a solitary life. In the 1970s, the typical household entertained more than once a month. But from the late 1970s to the late 1990s, the frequency of hosting friends for parties, games, dinners, and so on declined by 45 percent, according to data that Robert Putnam gathered. In the 20 years after Bowling Alone was published, the average amount of time that Americans spent hosting or attending social events declined another 32 percent.

As our homes have become less social, residential architecture has become more anti-social. Clifton Harness is a co-founder of TestFit, a firm that makes software to design layouts for new housing developments. He told me that the cardinal rule of contemporary apartment design is that every room is built to accommodate maximal screen time. “In design meetings with developers and architects, you have to assure everybody that there will be space for a wall-mounted flatscreen television in every room,” he said. “It used to be ‘Let’s make sure our rooms have great light.’ But now, when the question is ‘How do we give the most comfort to the most people?,’ the answer is to feed their screen addiction.” Bobby Fijan, a real-estate developer, said last year that “for the most part, apartments are built for Netflix and chill.” From studying floor plans, he noticed that bedrooms, walk-in closets, and other private spaces are growing. “I think we’re building for aloneness,” Fijan told me.

“Secular Monks”

In 2020, the philosopher and writer Andrew Taggart observed in an essay published in the religious journal First Things that a new flavor of masculinity seemed to be emerging: strong, obsessed with personal optimization, and proudly alone. Men and women alike have been delaying family formation; the median age at first marriage for men recently surpassed 30 for the first time in history. Taggart wrote that the men he knew seemed to be forgoing marriage and fatherhood with gusto. Instead of focusing their 30s and 40s on wedding bands and diapers, they were committed to working on their body, their bank account, and their meditation-sharpened minds. Taggart called these men “secular monks” for their combination of old-fashioned austerity and modern solipsism. “Practitioners submit themselves to ever more rigorous, monitored forms of ascetic self-control,” he wrote, “among them, cold showers, intermittent fasting, data-driven health optimization, and meditation boot camps.”

When I read Taggart’s essay last year, I felt a shock of recognition. In the previous months, I’d been captivated by a particular genre of social media: the viral “morning routine” video. If the protagonist is a man, he is typically handsome and rich. We see him wake up. We see him meditate. We see him write in his journal. We see him exercise, take supplements, take a cold plunge. What is most striking about these videos, however, is the element they typically lack: other people. In these little movies of a life well spent, the protagonists generally wake up alone and stay that way. We usually see no friends, no spouse, no children. These videos are advertisements for a luxurious form of modern monasticism that treats the presence of other people as, at best, an unwelcome distraction and, at worst, an unhealthy indulgence that is ideally avoided—like porn, perhaps, or Pop-Tarts.

[Read: The agony of texting with men]

Drawing major conclusions about modern masculinity from a handful of TikToks would be unwise. But the solitary man is not just a social-media phenomenon. Men spend more time alone than women, and young men are increasing their alone time faster than any other group, according to the American Time Use Survey.

Where is this alone time coming from? Liana C. Sayer, a sociologist at the University of Maryland, shared with me her analysis of how leisure time in the 21st century has changed for men and women. Sayer divided leisure into two broad categories: “engaged leisure,” which includes socializing, going to concerts, and playing sports; and “sedentary leisure,” which includes watching TV and playing video games. Compared with engaged leisure, which is more likely to be done with other people, sedentary leisure is more commonly done alone.

The most dramatic tendency that Sayer uncovered is that single men without kids—who have the most leisure time—are overwhelmingly likely to spend these hours by themselves. And the time they spend in solo sedentary leisure has increased, since 2003, more than that of any other group Sayer tracked. This is unfortunate because, as Sayer wrote, “well-being is higher among adults who spend larger shares of leisure with others.” Sedentary leisure, by contrast, was “associated with negative physical and mental health.”

Richard V. Reeves, the president of the American Institute for Boys and Men, told me that for men, as for women, something hard to define is lost when we pursue a life of isolationist comforts. He calls it “neededness”—the way we make ourselves essential to our families and community. “I think at some level, we all need to feel like we’re a jigsaw piece that’s going to fit into a jigsaw somewhere,” he said. This neededness can come in several forms: social, economic, or communitarian. Our children and partners can depend on us for care or income. Our colleagues can rely on us to finish a project, or to commiserate about an annoying boss. Our religious congregations and weekend poker parties can count on us to fill a pew or bring the dip.

But building these bridges to community takes energy, and today’s young men do not seem to be constructing these relationships in the same way that they used to. In place of neededness, despair is creeping in. Men who are un- or underemployed are especially vulnerable. Feeling unneeded “is actually, in some cases, literally fatal,” Reeves said. “If you look at the words that men use to describe themselves before they take their own lives, they are ‘worthless’ and ‘useless.’” Since 2001, hundreds of thousands of men have died of drug overdoses, mostly from opioids and synthetics such as fentanyl. “If the level of drug-poisoning deaths had remained flat since 2001, we’d have had 400,000 fewer men die,” Reeves said. These drugs, he emphasized, are defined by their solitary nature: Opioids are not party drugs, but rather the opposite.

This Is Your Politics on Solitude

All of this time alone, at home, on the phone, is not just affecting us as individuals. It’s making society weaker, meaner, and more delusional. Marc J. Dunkelman, an author and a research fellow at Brown University, says that to see how chosen solitude is warping society at large, we must first acknowledge something a little counterintuitive: Today, many of our bonds are actually getting stronger.

Parents are spending more time with their children than they did several decades ago, and many couples and families maintain an unbroken flow of communication. “My wife and I have texted 10 times since we said goodbye today,” Dunkelman told me when I reached him at noon on a weekday. “When my 10-year-old daughter buys a Butterfinger at CVS, I get a phone notification about it.”

At the same time, messaging apps, TikTok streams, and subreddits keep us plugged into the thoughts and opinions of the global crowd that shares our interests. “When I watch a Cincinnati Bengals football game, I’m on a group text with beat reporters to whom I can ask questions, and they’ll respond,” Dunkelman said. “I can follow the live thoughts of football analysts on X.com, so that I’m practically watching the game over their shoulder. I live in Rhode Island, and those are connections that could have never existed 30 years ago.”

Home-based, phone-based culture has arguably solidified our closest and most distant connections, the inner ring of family and best friends (bound by blood and intimacy) and the outer ring of tribe (linked by shared affinities). But it’s wreaking havoc on the middle ring of “familiar but not intimate” relationships with the people who live around us, which Dunkelman calls the village. “These are your neighbors, the people in your town,” he said. We used to know them well; now we don’t.

The middle ring is key to social cohesion, Dunkelman said. Families teach us love, and tribes teach us loyalty. The village teaches us tolerance. Imagine that a local parent disagrees with you about affirmative action at a PTA meeting. Online, you might write him off as a political opponent who deserves your scorn. But in a school gym full of neighbors, you bite your tongue. As the year rolls on, you discover that your daughters are in the same dance class. At pickup, you swap stories about caring for aging relatives. Although your differences don’t disappear, they’re folded into a peaceful coexistence. And when the two of you sign up for a committee to draft a diversity statement for the school, you find that you can accommodate each other’s opposing views. “It’s politically moderating to meet thoughtful people in the real world who disagree with you,” Dunkelman said. But if PTA meetings are still frequently held in person, many other opportunities to meet and understand one’s neighbors are becoming a thing of the past. “An important implication of the death of the middle ring is that if you have no appreciation for why the other side has their narrative, you’ll want your own side to fight them without compromise.”

The village is our best arena for practicing productive disagreement and compromise—in other words, democracy. So it’s no surprise that the erosion of the village has coincided with the emergence of a grotesque style of politics, in which every election feels like an existential quest to vanquish an intramural enemy. For the past five decades, the American National Election Studies surveys have asked Democrats and Republicans to rate the opposing party on a “Feeling Thermometer” that ranges from zero (very cold/unfavorable) to 100 (very warm/favorable). In 2000, just 8 percent of partisans gave the other party a zero. By 2020, that figure had shot up to 40 percent. In a 2021 poll by Generation Lab/Axios, nearly a third of college students who identify as Republican said they wouldn’t even go on a date with a Democrat, and more than two-thirds of Democratic students said the same of members of the GOP.

Donald Trump’s victory in the 2024 presidential election had many causes, including inflation and frustration with Joe Biden’s leadership. But one source of Trump’s success may be that he is an avatar of the all-tribe, no-village style of performative confrontation. He stokes out-group animosity, and speaks to voters who are furiously intolerant of political difference. To cite just a few examples from the campaign, Trump called Democrats “enemies of the democracy” and the news media “enemies of the people,” and promised to “root out” the “radical-left thugs that live like vermin within the confines of our country, that lie and steal and cheat on elections.”

Social disconnection also helps explain progressives’ stubborn inability to understand Trump’s appeal. In the fall, one popular Democratic lawn sign read Harris Walz: Obviously. That sentiment, rejected by a majority of voters, indicates a failure to engage with the world as it really is. Dunkelman emailed me after the election to lament Democratic cluelessness. “How did those of us who live in elite circles not see how Trump was gaining popularity even among our literal neighbors?” he wrote. Too many progressives were mainlining left-wing media in the privacy of their home, oblivious that families down the street were drifting right. Even in the highly progressive borough of Brooklyn, New York, three in 10 voters chose Trump. If progressives still consider MAGA an alien movement, it is in part because they have made themselves strangers in their own land.

Practicing politics alone, on the internet, rather than in community isn’t only making us more likely to demonize and alienate our opponents, though that would be bad enough. It may also be encouraging deep nihilism. In 2018, a group of researchers led by Michael Bang Petersen, a Danish political scientist, began asking Americans to evaluate false rumors about Democratic and Republican politicians, including Trump and Hillary Clinton. “We were expecting a clear pattern of polarization,” Petersen told me, with people on the left sharing conspiracies about the right and vice versa. But some participants seemed drawn to any conspiracy theory so long as it was intended to destroy the established order. Members of this cohort commonly harbored racial or economic grievances. Perhaps more important, Petersen said, they tended to feel socially isolated. These aggravated loners agreed with many dark pronouncements, such as “I need chaos around me” and “When I think about our political and social institutions, I cannot help thinking ‘just let them all burn.’ ” Petersen and his colleagues coined a term to describe this cohort’s motivation: the need for chaos.

[Read: Derek Thompson on the Americans who need chaos]

Although chaotically inclined individuals score high on a popular measure of loneliness, they don’t seem to seek the obvious remedy. “What they’re reaching out to get isn’t friendship at all but rather recognition and status,” Petersen said. For many socially isolated men in particular, for whom reality consists primarily of glowing screens in empty rooms, a vote for destruction is a politics of last resort—a way to leave one’s mark on a world where collective progress, or collective support of any kind, feels impossible.

The Introversion Delusion

Let us be fair to solitude, for a moment. As the father of a young child, I know well that a quiet night alone can be a balm. I have spent evenings alone at a bar, watching a baseball game, that felt ecstatically close to heaven. People cope with stress and grief and mundane disappointment in complex ways, and sometimes isolation is the best way to restore inner equilibrium.

But the dosage matters. A night alone away from a crying baby is one thing. A decade or more of chronic social disconnection is something else entirely. And people who spend more time alone, year after year, become meaningfully less happy. In his 2023 paper on the rise of 21st-century solitude, Atalay, at the Philadelphia Fed, calculated that by one measure, sociability means considerably more for happiness than money does: A five-percentage-point increase in alone time was associated with about the same decline in life satisfaction as was a 10 percent lower household income.

Nonetheless, many people keep choosing to spend free time alone, in their home, away from other people. Perhaps, one might think, they are making the right choice; after all, they must know themselves best. But a consistent finding of modern psychology is that people often don’t know what they want, or what will make them happy. The saying that “predictions are hard, especially about the future” applies with special weight to predictions about our own life. Time and again, what we expect to bring us peace—a bigger house, a luxury car, a job with twice the pay but half the leisure—only creates more anxiety. And at the top of this pile of things we mistakenly believe we want, there is aloneness.

[From the May 2012 issue: Is Facebook making us lonely?]

Several years ago, Nick Epley, a psychologist at the University of Chicago’s Booth School of Business, asked commuter-train passengers to make a prediction: How would they feel if asked to spend the ride talking with a stranger? Most participants predicted that quiet solitude would make for a better commute than having a long chat with someone they didn’t know. Then Epley’s team created an experiment in which some people were asked to keep to themselves, while others were instructed to talk with a stranger (“The longer the conversation, the better,” participants were told). Afterward, people filled out a questionnaire. How did they feel? Despite the broad assumption that the best commute is a silent one, the people instructed to talk with strangers actually reported feeling significantly more positive than those who’d kept to themselves. “A fundamental paradox at the core of human life is that we are highly social and made better in every way by being around people,” Epley said. “And yet over and over, we have opportunities to connect that we don’t take, or even actively reject, and it is a terrible mistake.”

Researchers have repeatedly validated Epley’s discovery. In 2020, the psychologists Seth Margolis and Sonja Lyubomirsky, at UC Riverside, asked people to behave like an extrovert for one week and like an introvert for another. Subjects received several reminders to act “assertive” and “spontaneous” or “quiet” and “reserved” depending on the week’s theme. Participants said they felt more positive emotions at the end of the extroversion week and more negative emotions at the end of the introversion week. Our modern economy, with its home-delivery conveniences, manipulates people into behaving like agoraphobes. But it turns out that we can be manipulated in the opposite direction. And we might be happier for it.

Our “mistaken” preference for solitude could emerge from a misplaced anxiety that other people aren’t that interested in talking with us, or that they would find our company bothersome. “But in reality,” Epley told me, “social interaction is not very uncertain, because of the principle of reciprocity. If you say hello to someone, they’ll typically say hello back to you. If you give somebody a compliment, they’ll typically say thank you.” Many people, it seems, are not social enough for their own good. They too often seek comfort in solitude, when they would actually find joy in connection.

Despite a consumer economy that seems optimized for introverted behavior, we would have happier days, years, and lives if we resisted the undertow of the convenience curse—if we talked with more strangers, belonged to more groups, and left the house for more activities.

The AI Century

The anti-social century has been bad enough: more anxiety and depression; more “need for chaos” in our politics. But I’m sorry to say that our collective detachment could still get worse. Or, to be more precise, weirder.

In May of last year, three employees of OpenAI, the artificial-intelligence company, sat onstage to introduce ChatGPT’s new real-time conversational-speech feature. A research scientist named Mark Chen held up a phone and, smiling, started speaking to it.

“Hey, ChatGPT, I’m Mark. How are you?” Mark said.

“Hello, Mark!” a cheery female voice responded.

“Hey, so I’m onstage right now,” Mark said. “I’m doing a live demo, and frankly I’m feeling a little bit nervous. Can you help me calm my nerves a little bit?”

“Oh, you’re doing a live demo right now?” the voice replied, projecting astonishment with eerie verisimilitude. “That’s awesome! Just take a deep breath and remember: You’re the expert here.”

Mark asked for feedback on his breathing, before panting loudly, like someone who’d just finished a marathon.

“Whoa, slow!” the voice responded. “Mark, you’re not a vacuum cleaner!” Out of frame, the audience laughed. Mark tried breathing audibly again, this time more slowly and deliberately.

“That’s it,” the AI responded. “How do you feel?”

“I feel a lot better,” Mark said. “Thank you so much.”

AI’s ability to speak naturally might seem like an incremental update, as subtle as a camera-lens refinement on a new iPhone. But according to Nick Epley, fluent speech represents a radical advancement in the technology’s ability to encroach on human relationships.

“Once an AI can speak to you, it’ll feel extremely real,” he said, because people process the spoken word more intimately and emotionally than they process text. For a study published in 2020, Epley and Amit Kumar, a psychologist at the University of Texas at Austin, randomly assigned participants to contact an old friend via phone or email. Most people said they preferred to send a written message. But those instructed to talk on the phone reported feeling “a significantly stronger bond” with their friend, and a stronger sense that they’d “really connected,” than those who used email.

Speech is rich with what are known as “paralinguistic cues,” such as emphasis and intonation, which can build sympathy and trust in the minds of listeners. In another study, Epley and the behavioral scientist Juliana Schroeder found that employers and potential recruiters were more likely to rate candidates as “more competent, thoughtful, and intelligent” when they heard a why-I’m-right-for-this-job pitch rather than read it.

Even now, before AI has mastered fluent speech, millions of people are already forming intimate relationships with machines, according to Jason Fagone, a journalist who is writing a book about the emergence of AI companions. Character.ai, the most popular platform for AI companions, has tens of millions of monthly users, who spend an average of 93 minutes a day chatting with their AI friend. “No one is getting duped into thinking they’re actually talking to humans,” Fagone told me. “People are freely choosing to enter relationships with artificial partners, and they’re getting deeply attached anyway, because of the emotional capabilities of these systems.” One subject in his book is a young man who, after his fiancée’s death, engineers an AI chatbot to resemble his deceased partner. Another is a bisexual mother who supplements her marriage to a man with an AI that identifies as a woman.

If you find the notion of emotional intercourse with an immaterial entity creepy, consider the many friends and family members who exist in your life mainly as words on a screen. Digital communication has already prepared us for AI companionship, Fagone said, by transforming many of our physical-world relationships into a sequence of text chimes and blue bubbles. “I think part of why AI-companion apps have proven so seductive so quickly is that most of our relationships already happen exclusively through the phone,” he said.

Epley sees the exponential growth of AI companions as a real possibility. “You can set them up to never criticize you, never cheat on you, never have a bad day and insult you, and to always be interested in you.” Unlike the most patient spouses, they could tell us that we’re always right. Unlike the world’s best friend, they could instantly respond to our needs without the all-too-human distraction of having to lead their own life.

“The horrifying part, of course, is that learning how to interact with real human beings who can disagree with you and disappoint you” is essential to living in the world, Epley said. I think he’s right. But Epley was born in the 1970s. I was born in the 1980s. People born in the 2010s, or the 2020s, might not agree with us about the irreplaceability of “real human” friends. These generations may discover that what they want most from their relationships is not a set of people, who might challenge them, but rather a set of feelings—sympathy, humor, validation—that can be more reliably drawn out from silicon than from carbon-based life forms. Long before technologists build a superintelligent machine that can do the work of so many Einsteins, they may build an emotionally sophisticated one that can do the work of so many friends.

The Next 15 Minutes

The anti-social century is as much a result of what’s happened to the exterior world of concrete and steel as of the advances inside our phones. The decline of government investments in what Eric Klinenberg calls “social infrastructure”—public spaces that shape our relationship to the world—may have begun in the latter part of the 20th century, but it has continued in the 21st. That has arguably affected nearly everyone, but less advantaged Americans most of all.

“I can’t tell you how many times I’ve gone to poor neighborhoods in big cities, and the community leaders tell me the real crisis for poor teenagers is that there’s just not much for them to do anymore, and nowhere to go,” Klinenberg told me. “I’d like to see the government build social infrastructure for teenagers with the creativity and generosity with which video-game companies build the toys that keep them inside. I’m thinking of athletic fields, and public swimming pools, and libraries with beautiful social areas for young people to hang out together.”

Improved public social infrastructure would not solve all the problems of the anti-social century. But degraded public spaces—and degraded public life—are in some ways the other side of all our investments in video games and phones and bigger, better private space. Just as we needed time to see the invisible emissions of the Industrial Revolution, we are only now coming to grips with the negative externalities of a phonebound and homebound world. The media theorist Marshall McLuhan once said of technology that every augmentation is also an amputation. We chose our digitally enhanced world. We did not realize the significance of what was being amputated.

But we can choose differently. In his 2015 novel, Seveneves, Neal Stephenson coined the term Amistics to describe the practice of carefully selecting which technologies to accept. The word is a reference to the Amish, who generally shun many modern innovations, including cars and television. Although they are sometimes considered strictly anti-modern, many Amish communities have refrigerators and washing machines, and some use solar power. Instead of dismissing all technology, the Amish adopt only those innovations that support their religious and communal values. In his 1998 dissertation on one Amish community, Tay Keong Tan, then a Ph.D. candidate at Harvard, quoted a community member as saying that they didn’t want to adopt TV or radio, because those products “would destroy our visiting practices. We would stay at home with the television or radio rather than meet with other people.”

If the Amish approach to technology is radical in its application, it recognizes something plain and true: Although technology does not have values of its own, its adoption can create values, even in the absence of a coordinated effort. For decades, we’ve adopted whatever technologies removed friction or increased dopamine, embracing what makes life feel easy and good in the moment. But dopamine is a chemical, not a virtue. And what’s easy is not always what’s best for us. We should ask ourselves: What would it mean to select technology based on long-term health rather than instant gratification? And if technology is hurting our community, what can we do to heal it?

A seemingly straightforward prescription is that teenagers should choose to spend less time on their phone, and their parents should choose to invite more friends over for dinner. But in a way, these are collective-action problems. A teenager is more likely to get out of the house if his classmates have already made a habit of hanging out. That teen’s parents are more likely to host if their neighbors have also made a habit of weekly gatherings. There is a word for such deeply etched communal habits: rituals. And one reason, perhaps, that the decline of socializing has synchronized with the decline of religion is that nothing has proved as adept at inscribing ritual into our calendars as faith.

“I have a view that is uncommon among social scientists, which is that moral revolutions are real and they change our culture,” Robert Putnam told me. In the early 20th century, a group of liberal Christians, including the pastor Walter Rauschenbusch, urged other Christians to expand their faith from a narrow concern for personal salvation to a public concern for justice. Their movement, which became known as the Social Gospel, was instrumental in passing major political reforms, such as the abolition of child labor. It also encouraged a more communitarian approach to American life, which manifested in an array of entirely secular congregations that met in union halls and community centers and dining rooms. All of this came out of a particular alchemy of writing and thinking and organizing. No one can say precisely how to change a nation’s moral-emotional atmosphere, but what’s certain is that atmospheres do change. Our smallest actions create norms. Our norms create values. Our values drive behavior. And our behaviors cascade.

The anti-social century is the result of one such cascade, of chosen solitude, accelerated by digital-world progress and physical-world regress. But if one cascade brought us into an anti-social century, another can bring about a social century. New norms are possible; they’re being created all the time. Independent bookstores are booming—the American Booksellers Association has reported more than 50 percent growth since 2009—and in cities such as New York City and Washington, D.C., many of them have become miniature theaters, with regular standing-room-only crowds gathered for author readings. More districts and states are banning smartphones in schools, a national experiment that could, optimistically, improve children’s focus and their physical-world relationships. In the past few years, board-game cafés have flowered across the country, and their business is expected to nearly double by 2030. These cafés buck an 80-year trend. Instead of turning a previously social form of entertainment into a private one, they turn a living-room pastime into a destination activity. As sweeping as the social revolution I’ve described might seem, it’s built from the ground up by institutions and decisions that are profoundly within our control: as humble as a café, as small as a new phone locker at school.

When Epley and his lab asked Chicagoans to overcome their preference for solitude and talk with strangers on a train, the experiment probably didn’t change anyone’s life. All it did was marginally improve the experience of one 15-minute block of time. But life is just a long set of 15-minute blocks, one after another. The way we spend our minutes is the way we spend our decades. “No amount of research that I’ve done has changed my life more than this,” Epley told me. “It’s not that I’m never lonely. It’s that my moment-to-moment experience of life is better, because I’ve learned to take the dead space of life and make friends in it.”

This article appears in the February 2025 print edition with the headline “The Anti-Social Century.”