The Anti-Social Century

The Atlantic

Illustrations by Max Guther

The Bar Is Closed

A short drive from my home in North Carolina is a small Mexican restaurant, with several tables and four stools at a bar facing the kitchen. On a sweltering afternoon last summer, I walked in with my wife and daughter. The place was empty. But looking closer, I realized that business was booming. The bar was covered with to-go food: nine large brown bags.

As we ate our meal, I watched half a dozen people enter the restaurant without sitting down to eat. Each one pushed open the door, walked to the counter, picked up a bag from the bar, and left. In the delicate choreography between kitchen and customer, not a word was exchanged. The space once reserved for that most garrulous social encounter, the bar hangout, had been reconfigured into a silent depot for customers to grab food to eat at home.

Until the pandemic, the bar was bustling and popular with regulars. “It’s just a few seats, but it was a pretty happening place,” Rae Mosher, the restaurant’s general manager, told me. “I can’t tell you how sad I’ve been about it,” she went on. “I know it hinders communications between customers and staff to have to-go bags taking up the whole bar. But there’s nowhere else for the food to go.” She put up a sign: BAR SEATING CLOSED.

The sign on the bar is a sign of the times for the restaurant business. In the past few decades, the sector has shifted from tables to takeaway, a process that accelerated through the pandemic and continued even as the health emergency abated. In 2023, 74 percent of all restaurant traffic came from “off premises” customers—that is, from takeout and delivery—up from 61 percent before COVID, according to the National Restaurant Association.

The flip side of less dining out is more eating alone. The share of U.S. adults having dinner or drinks with friends on any given night has declined by more than 30 percent in the past 20 years. “There’s an isolationist dynamic that’s taking place in the restaurant business,” the Washington, D.C., restaurateur Steve Salis told me. “I think people feel uncomfortable in the world today. They’ve decided that their home is their sanctuary. It’s not easy to get them to leave.” Even when Americans eat at restaurants, they are much more likely to do so by themselves. According to data gathered by the online reservations platform OpenTable, solo dining has increased by 29 percent in just the past two years. The No. 1 reason is the need for more “me time.”

The evolution of restaurants is retracing the trajectory of another American industry: Hollywood. In the 1930s, video entertainment existed only in theaters, and the typical American went to the movies several times a month. Film was a necessarily collective experience, something enjoyed with friends and in the company of strangers. But technology has turned film into a home delivery system. Today, the typical American adult buys about three movie tickets a year—and watches almost 19 hours of television every week, the equivalent of roughly eight movies. In entertainment, as in dining, modernity has transformed a ritual of togetherness into an experience of homebound reclusion and even solitude.

The privatization of American leisure is one part of a much bigger story. Americans are spending less time with other people than in any other period for which we have trustworthy data, going back to 1965. Between that year and the end of the 20th century, in-person socializing slowly declined. From 2003 to 2023, it plunged by more than 20 percent, according to the American Time Use Survey, an annual study conducted by the Bureau of Labor Statistics. Among unmarried men and people younger than 25, the decline was more than 35 percent. Alone time predictably spiked during the pandemic. But the trend had started long before most people had ever heard of a novel coronavirus and continued after the pandemic was declared over. According to Enghin Atalay, an economist at the Federal Reserve Bank of Philadelphia, Americans spent even more time alone in 2023 than they did in 2021. (He categorized a person as “alone,” as I will throughout this article, if they are “the only person in the room, even if they are on the phone” or in front of a computer.)

Eroding companionship can be seen in numerous odd and depressing facts of American life today. Men who watch television now spend seven hours in front of the TV for every hour they spend hanging out with somebody outside their home. The typical female pet owner spends more time actively engaged with her pet than she spends in face-to-face contact with friends of her own species. Since the early 2000s, the amount of time that Americans say they spend helping or caring for people outside their nuclear family has declined by more than a third.

[Derek Thompson: Why Americans suddenly stopped hanging out]

Self-imposed solitude might just be the most important social fact of the 21st century in America. Perhaps unsurprisingly, many observers have reduced this phenomenon to the topic of loneliness. In 2023, Vivek Murthy, Joe Biden’s surgeon general, published an 81-page warning about America’s “epidemic of loneliness,” claiming that its negative health effects were on par with those of tobacco use and obesity. A growing number of public-health officials seem to regard loneliness as the developed world’s next critical public-health issue. The United Kingdom now has a minister for loneliness. So does Japan.

But solitude and loneliness are not one and the same. “It is actually a very healthy emotional response to feel some loneliness,” the NYU sociologist Eric Klinenberg told me. “That cue is the thing that pushes you off the couch and into face-to-face interaction.” The real problem here, the nature of America’s social crisis, is that most Americans don’t seem to be reacting to the biological cue to spend more time with other people. Their solitude levels are surging while many measures of loneliness are actually flat or dropping. A 2021 study of the widely used UCLA Loneliness Scale concluded that “the frequently used term ‘loneliness epidemic’ seems exaggerated.” Although young people are lonelier than they once were, there is little evidence that loneliness is rising more broadly today. A 2023 Gallup survey found that the share of Americans who said they experienced loneliness “a lot of the day yesterday” declined by roughly one-third from 2021 to 2023, even as alone time, by Atalay’s calculation, rose slightly.

Day to day, hour to hour, we are choosing this way of life—its comforts, its ready entertainments. But convenience can be a curse. Our habits are creating what Atalay has called a “century of solitude.” This is the anti-social century.

Over the past few months, I’ve spoken with psychologists, political scientists, sociologists, and technologists about America’s anti-social streak. Although the particulars of these conversations differed, a theme emerged: The individual preference for solitude, scaled up across society and exercised repeatedly over time, is rewiring America’s civic and psychic identity. And the consequences are far-reaching—for our happiness, our communities, our politics, and even our understanding of reality.

The End of the Social Century

The first half of the 20th century was extraordinarily social. From 1900 to 1960, church membership surged, as did labor-union participation. Marriage rates reached a record high after World War II, and the birth rate enjoyed a famous “boom.” Associations of all sorts thrived, including book clubs and volunteer groups. The New Deal made America’s branch-library system the envy of the world; communities and developers across the country built theaters, music venues, playgrounds, and all kinds of gathering places.

But in the 1970s, the U.S. entered an era of withdrawal, as the political scientist Robert D. Putnam famously documented in his 2000 book, Bowling Alone. Some institutions of togetherness, such as marriage, eroded slowly. Others fell away swiftly. From 1985 to 1994, active involvement in community organizations fell by nearly half. The decline was astonishingly broad, affecting just about every social activity and every demographic group that Putnam tracked.

What happened in the 1970s? Klinenberg, the sociologist, notes a shift in political priorities: The government dramatically slowed its construction of public spaces. “Places that used to anchor community life, like libraries and school gyms and union halls, have become less accessible or shuttered altogether,” he told me. Putnam points, among other things, to new moral values, such as the embrace of unbridled individualism. But he found that two of the most important factors were by then ubiquitous technologies: the automobile and the television set.

Starting in the second half of the century, Americans used their cars to move farther and farther away from one another, enabling the growth of the suburbs and, with it, a retreat into private backyard patios, private pools, a more private life. Once Americans got out of the car, they planted themselves in front of the television. From 1965 to 1995, the typical adult gained six hours a week in leisure time. They could have devoted that time—300 hours a year!—to community service, or pickup basketball, or reading, or knitting, or all four. Instead, they funneled almost all of this extra time into watching more TV.

Television transformed Americans’ interior decorating, relationships, and communities. In 1970, just 6 percent of sixth graders had a TV set in their bedroom; by 1999, that proportion had grown to 77 percent. Time diaries in the 1990s showed that husbands and wives spent almost four times as many hours watching TV together as they spent talking to each other in a given week. People who said TV was their “primary form of entertainment” were less likely to engage in practically every social activity that Putnam counted: volunteering, churchgoing, attending dinner parties, picnicking, giving blood, even sending greeting cards. Like a murder in Clue, the death of social connections in America had any number of suspects. But in the end, I believe the likeliest culprit is obvious. It was Mr. Farnsworth, in the living room, with the tube.

Phonebound

If two of the 20th century’s iconic technologies, the automobile and the television, initiated the rise of American aloneness, the 21st century’s most notorious piece of hardware has continued to fuel, and has indeed accelerated, our national anti-social streak. Countless books, articles, and cable-news segments have warned Americans that smartphones can negatively affect mental health and may be especially harmful to adolescents. But the fretful coverage is, if anything, restrained given how greatly these devices have changed our conscious experience. The typical person is awake for about 900 minutes a day. American kids and teenagers spend, on average, about 270 minutes on weekdays and 380 minutes on weekends gazing into their screens, according to the Digital Parenthood Initiative. By this account, screens occupy more than 30 percent of their waking life.

Some of this screen time is social, after a fashion. But sharing videos or texting friends is a pale imitation of face-to-face interaction. More worrisome than what young people do on their phone is what they aren’t doing. Young people are less likely than in previous decades to get their driver’s license, or to go on a date, or to have more than one close friend, or even to hang out with their friends at all. The share of boys and girls who say they meet up with friends almost daily outside school hours has declined by nearly 50 percent since the early 1990s, with the sharpest downturn occurring in the 2010s.

The decline of hanging out can’t be shrugged off as a benign generational change, something akin to a preference for bell-bottoms over skinny jeans. Human childhood—including adolescence—is a uniquely sensitive period in the whole of the animal kingdom, the psychologist Jonathan Haidt writes in The Anxious Generation. Although the human brain grows to 90 percent of its full size by age 5, its neural circuitry takes a long time to mature. Our lengthy childhood might be evolution’s way of scheduling an extended apprenticeship in social learning through play. The best kind of play is physical, outdoors, with other kids, and unsupervised, allowing children to press the limits of their abilities while figuring out how to manage conflict and tolerate pain. But now young people’s attention is funneled into devices that take them out of their body, denying them the physical-world education they need.

[Read: Jonathan Haidt on the terrible costs of a phone-based childhood]

Teen anxiety and depression are at near-record highs: The latest government survey of high schoolers, conducted in 2023, found that more than half of teen girls said they felt “persistently sad or hopeless.” These data are alarming, but shouldn’t be surprising. Young rats and monkeys deprived of play come away socially and emotionally impaired. It would be odd if we, the self-named “social animal,” were different.

Socially underdeveloped childhood leads, almost inexorably, to socially stunted adulthood. A popular trend on TikTok involves 20‑somethings celebrating in creative ways when a friend cancels plans, often because they’re too tired or anxious to leave the house. These clips can be goofy and even quite funny. Surely, sympathy is due; we all know the feeling of relief when we claw back free time in an overscheduled week. But the sheer number of videos is a bit unsettling. If anybody should feel lonely and desperate for physical-world contact, you’d think it would be 20-somethings, who are still recovering from years of pandemic cabin fever. But many nights, it seems, members of America’s most isolated generation aren’t trying to leave the house at all. They’re turning on their cameras to advertise to the world the joy of not hanging out.

If young adults feel overwhelmed by the emotional costs of physical-world togetherness—and prone to keeping even close friends at a physical distance—that suggests that phones aren’t just rewiring adolescence; they’re upending the psychology of friendship as well.

[From the September 2017 issue: Have smartphones destroyed a generation?]

In the 1960s, Irwin Altman, a psychologist at the Naval Medical Research Institute, in Bethesda, Maryland, co-developed what became known as social penetration theory, a model of friendship as a process of increasing intimacy. In the early stages of friendship, people engage in small talk by sharing trivial details. As they develop trust, their conversations deepen to include more private information, until disclosure becomes habitual and easy. Altman later added an important wrinkle: Friends require boundaries as much as they require closeness. Time alone to recharge is essential for maintaining healthy relationships.

Phones mean that solitude is more crowded than it used to be, and crowds are more solitary. “Bright lines once separated being alone and being in a crowd,” Nicholas Carr, the author of the new book Superbloom: How Technologies of Connection Tear Us Apart, told me. “Boundaries helped us. You could be present with your friends and reflective in your downtime.” Now our social time is haunted by the possibility that something more interesting is happening somewhere else, and our downtime is contaminated by the streams and posts and texts of dozens of friends, colleagues, frenemies, strangers.

[From the July/August 2008 issue: Nicholas Carr on whether Google is making us stupid]

If Carr is right, modern technology’s always-open window to the outside world makes recharging much harder, leaving many people chronically depleted, a walking battery that is always stuck in the red zone. In a healthy world, people who spend lots of time alone would feel that ancient biological cue: I’m alone and sad; I should make some plans. But we live in a sideways world, where easy home entertainment, oversharing online, and stunted social skills spark a strangely popular response: I’m alone, anxious, and exhausted; thank God my plans were canceled.

Homebound

Last year, the Princeton University sociologist Patrick Sharkey was working on a book about how places shape American lives and economic fortunes. He had a feeling that the rise of remote work might have accelerated a longer-term trend: a shift in the amount of time that people spend inside their home. He ran the numbers and discovered “an astounding change” in our daily habits, much more extreme than he would have guessed. In 2022—notably, after the pandemic had abated—adults spent an additional 99 minutes at home on any given day compared with 2003.

This finding formed the basis of a 2024 paper, “Homebound,” in which Sharkey calculated that, compared with 2003, Americans are more likely to take meetings from home, to shop from home, to be entertained at home, to eat at home, and even to worship at home. Practically the entire economy has reoriented itself to allow Americans to stay within their four walls. This phenomenon cannot be reduced to remote work. It is something far more totalizing—something more like “remote life.”

One might ask: Why wouldn’t Americans with means want to spend more time at home? In the past few decades, the typical American home has become bigger, more comfortable, and more entertaining. From 1973 to 2023, the size of the average new single-family house increased by 50 percent, and the share of new single-family houses that have air-conditioning doubled, to 98 percent. Streaming services, video-game consoles, and flatscreen TVs make the living room more diverting than any 20th-century theater or arcade. Yet conveniences can indeed be a curse. By Sharkey’s calculations, activities at home were associated with a “strong reduction” in self-reported happiness.

A homebound life doesn’t have to be a solitary life. In the 1970s, the typical household entertained more than once a month. But from the late 1970s to the late 1990s, the frequency of hosting friends for parties, games, dinners, and so on declined by 45 percent, according to data that Robert Putnam gathered. In the 20 years after Bowling Alone was published, the average amount of time that Americans spent hosting or attending social events declined another 32 percent.

As our homes have become less social, residential architecture has become more anti-social. Clifton Harness is a co-founder of TestFit, a firm that makes software to design layouts for new housing developments. He told me that the cardinal rule of contemporary apartment design is that every room is built to accommodate maximal screen time. “In design meetings with developers and architects, you have to assure everybody that there will be space for a wall-mounted flatscreen television in every room,” he said. “It used to be ‘Let’s make sure our rooms have great light.’ But now, when the question is ‘How do we give the most comfort to the most people?,’ the answer is to feed their screen addiction.” Bobby Fijan, a real-estate developer, said last year that “for the most part, apartments are built for Netflix and chill.” From studying floor plans, he noticed that bedrooms, walk-in closets, and other private spaces are growing. “I think we’re building for aloneness,” Fijan told me.

“Secular Monks”

In 2020, the philosopher and writer Andrew Taggart observed in an essay published in the religious journal First Things that a new flavor of masculinity seemed to be emerging: strong, obsessed with personal optimization, and proudly alone. Men and women alike have been delaying family formation; the median age at first marriage for men recently surpassed 30 for the first time in history. Taggart wrote that the men he knew seemed to be forgoing marriage and fatherhood with gusto. Instead of focusing their 30s and 40s on wedding bands and diapers, they were committed to working on their body, their bank account, and their meditation-sharpened minds. Taggart called these men “secular monks” for their combination of old-fashioned austerity and modern solipsism. “Practitioners submit themselves to ever more rigorous, monitored forms of ascetic self-control,” he wrote, “among them, cold showers, intermittent fasting, data-driven health optimization, and meditation boot camps.”

When I read Taggart’s essay last year, I felt a shock of recognition. In the previous months, I’d been captivated by a particular genre of social media: the viral “morning routine” video. If the protagonist is a man, he is typically handsome and rich. We see him wake up. We see him meditate. We see him write in his journal. We see him exercise, take supplements, take a cold plunge. What is most striking about these videos, however, is the element they typically lack: other people. In these little movies of a life well spent, the protagonists generally wake up alone and stay that way. We usually see no friends, no spouse, no children. These videos are advertisements for a luxurious form of modern monasticism that treats the presence of other people as, at best, an unwelcome distraction and, at worst, an unhealthy indulgence that is ideally avoided—like porn, perhaps, or Pop-Tarts.

[Read: The agony of texting with men]

Drawing major conclusions about modern masculinity from a handful of TikToks would be unwise. But the solitary man is not just a social-media phenomenon. Men spend more time alone than women, and young men are increasing their alone time faster than any other group, according to the American Time Use Survey.

Where is this alone time coming from? Liana C. Sayer, a sociologist at the University of Maryland, shared with me her analysis of how leisure time in the 21st century has changed for men and women. Sayer divided leisure into two broad categories: “engaged leisure,” which includes socializing, going to concerts, and playing sports; and “sedentary leisure,” which includes watching TV and playing video games. Compared with engaged leisure, which is more likely to be done with other people, sedentary leisure is more commonly done alone.

The most dramatic tendency that Sayer uncovered is that single men without kids—who have the most leisure time—are overwhelmingly likely to spend these hours by themselves. And the time they spend in solo sedentary leisure has increased, since 2003, more than that of any other group Sayer tracked. This is unfortunate because, as Sayer wrote, “well-being is higher among adults who spend larger shares of leisure with others.” Sedentary leisure, by contrast, was “associated with negative physical and mental health.”

Richard V. Reeves, the president of the American Institute for Boys and Men, told me that for men, as for women, something hard to define is lost when we pursue a life of isolationist comforts. He calls it “neededness”—the way we make ourselves essential to our families and community. “I think at some level, we all need to feel like we’re a jigsaw piece that’s going to fit into a jigsaw somewhere,” he said. This neededness can come in several forms: social, economic, or communitarian. Our children and partners can depend on us for care or income. Our colleagues can rely on us to finish a project, or to commiserate about an annoying boss. Our religious congregations and weekend poker parties can count on us to fill a pew or bring the dip.

But building these bridges to community takes energy, and today’s young men do not seem to be constructing these relationships in the same way that they used to. In place of neededness, despair is creeping in. Men who are un- or underemployed are especially vulnerable. Feeling unneeded “is actually, in some cases, literally fatal,” Reeves said. “If you look at the words that men use to describe themselves before they take their own lives, they are worthless and useless.” Since 2001, hundreds of thousands of men have died of drug overdoses, mostly from opioids and synthetics such as fentanyl. “If the level of drug-poisoning deaths had remained flat since 2001, we’d have had 400,000 fewer men die,” Reeves said. These drugs, he emphasized, are defined by their solitary nature: Opioids are not party drugs, but rather the opposite.

This Is Your Politics on Solitude

All of this time alone, at home, on the phone, is not just affecting us as individuals. It’s making society weaker, meaner, and more delusional. Marc J. Dunkelman, an author and a research fellow at Brown University, says that to see how chosen solitude is warping society at large, we must first acknowledge something a little counterintuitive: Today, many of our bonds are actually getting stronger.

Parents are spending more time with their children than they did several decades ago, and many couples and families maintain an unbroken flow of communication. “My wife and I have texted 10 times since we said goodbye today,” Dunkelman told me when I reached him at noon on a weekday. “When my 10-year-old daughter buys a Butterfinger at CVS, I get a phone notification about it.”

At the same time, messaging apps, TikTok streams, and subreddits keep us plugged into the thoughts and opinions of the global crowd that shares our interests. “When I watch a Cincinnati Bengals football game, I’m on a group text with beat reporters to whom I can ask questions, and they’ll respond,” Dunkelman said. “I can follow the live thoughts of football analysts on X.com, so that I’m practically watching the game over their shoulder. I live in Rhode Island, and those are connections that could have never existed 30 years ago.”

Home-based, phone-based culture has arguably solidified our closest and most distant connections, the inner ring of family and best friends (bound by blood and intimacy) and the outer ring of tribe (linked by shared affinities). But it’s wreaking havoc on the middle ring of “familiar but not intimate” relationships with the people who live around us, which Dunkelman calls the village. “These are your neighbors, the people in your town,” he said. We used to know them well; now we don’t.

The middle ring is key to social cohesion, Dunkelman said. Families teach us love, and tribes teach us loyalty. The village teaches us tolerance. Imagine that a local parent disagrees with you about affirmative action at a PTA meeting. Online, you might write him off as a political opponent who deserves your scorn. But in a school gym full of neighbors, you bite your tongue. As the year rolls on, you discover that your daughters are in the same dance class. At pickup, you swap stories about caring for aging relatives. Although your differences don’t disappear, they’re folded into a peaceful coexistence. And when the two of you sign up for a committee to draft a diversity statement for the school, you find that you can accommodate each other’s opposing views. “It’s politically moderating to meet thoughtful people in the real world who disagree with you,” Dunkelman said. But if PTA meetings are still frequently held in person, many other opportunities to meet and understand one’s neighbors are becoming a thing of the past. “An important implication of the death of the middle ring is that if you have no appreciation for why the other side has their narrative, you’ll want your own side to fight them without compromise.”

The village is our best arena for practicing productive disagreement and compromise—in other words, democracy. So it’s no surprise that the erosion of the village has coincided with the emergence of a grotesque style of politics, in which every election feels like an existential quest to vanquish an intramural enemy. For the past five decades, the American National Election Studies surveys have asked Democrats and Republicans to rate the opposing party on a “Feeling Thermometer” that ranges from zero (very cold/unfavorable) to 100 (very warm/favorable). In 2000, just 8 percent of partisans gave the other party a zero. By 2020, that figure had shot up to 40 percent. In a 2021 poll by Generation Lab/Axios, nearly a third of college students who identify as Republican said they wouldn’t even go on a date with a Democrat, and more than two-thirds of Democratic students said the same of members of the GOP.

Donald Trump’s victory in the 2024 presidential election had many causes, including inflation and frustration with Joe Biden’s leadership. But one source of Trump’s success may be that he is an avatar of the all-tribe, no-village style of performative confrontation. He stokes out-group animosity, and speaks to voters who are furiously intolerant of political difference. To cite just a few examples from the campaign, Trump called Democrats “enemies of the democracy” and the news media “enemies of the people,” and promised to “root out” the “radical-left thugs that live like vermin within the confines of our country, that lie and steal and cheat on elections.”

Social disconnection also helps explain progressives’ stubborn inability to understand Trump’s appeal. In the fall, one popular Democratic lawn sign read Harris Walz: Obviously. That sentiment, rejected by a majority of voters, indicates a failure to engage with the world as it really is. Dunkelman emailed me after the election to lament Democratic cluelessness. “How did those of us who live in elite circles not see how Trump was gaining popularity even among our literal neighbors?” he wrote. Too many progressives were mainlining left-wing media in the privacy of their home, oblivious that families down the street were drifting right. Even in the highly progressive borough of Brooklyn, New York, three in 10 voters chose Trump. If progressives still consider MAGA an alien movement, it is in part because they have made themselves strangers in their own land.

Practicing politics alone, on the internet, rather than in community isn’t only making us more likely to demonize and alienate our opponents, though that would be bad enough. It may also be encouraging deep nihilism. In 2018, a group of researchers led by Michael Bang Petersen, a Danish political scientist, began asking Americans to evaluate false rumors about Democratic and Republican politicians, including Trump and Hillary Clinton. “We were expecting a clear pattern of polarization,” Petersen told me, with people on the left sharing conspiracies about the right and vice versa. But some participants seemed drawn to any conspiracy theory so long as it was intended to destroy the established order. Members of this cohort commonly harbored racial or economic grievances. Perhaps more important, Petersen said, they tended to feel socially isolated. These aggravated loners agreed with many dark pronouncements, such as “I need chaos around me” and “When I think about our political and social institutions, I cannot help thinking ‘just let them all burn.’ ” Petersen and his colleagues coined a term to describe this cohort’s motivation: the need for chaos.

[Read: Derek Thompson on the Americans who need chaos]

Although chaotically inclined individuals score highly in a popular measure for loneliness, they don’t seem to seek the obvious remedy. “What they’re reaching out to get isn’t friendship at all but rather recognition and status,” Petersen said. For many socially isolated men in particular, for whom reality consists primarily of glowing screens in empty rooms, a vote for destruction is a politics of last resort—a way to leave one’s mark on a world where collective progress, or collective support of any kind, feels impossible.

The Introversion Delusion

Let us be fair to solitude, for a moment. As the father of a young child, I know well that a quiet night alone can be a balm. I have spent evenings alone at a bar, watching a baseball game, that felt ecstatically close to heaven. People cope with stress and grief and mundane disappointment in complex ways, and sometimes isolation is the best way to restore inner equilibrium.

But the dosage matters. A night alone away from a crying baby is one thing. A decade or more of chronic social disconnection is something else entirely. And people who spend more time alone, year after year, become meaningfully less happy. In his 2023 paper on the rise of 21st-century solitude, Atalay, at the Philadelphia Fed, calculated that by one measure, sociability means considerably more for happiness than money does: A five-percentage-point increase in alone time was associated with about the same decline in life satisfaction as was a 10 percent lower household income.

Nonetheless, many people keep choosing to spend free time alone, in their home, away from other people. Perhaps, one might think, they are making the right choice; after all, they must know themselves best. But a consistent finding of modern psychology is that people often don’t know what they want, or what will make them happy. The saying that “predictions are hard, especially about the future” applies with special weight to predictions about our own life. Time and again, what we expect to bring us peace—a bigger house, a luxury car, a job with twice the pay but half the leisure—only creates more anxiety. And at the top of this pile of things we mistakenly believe we want, there is aloneness.

[From the May 2012 issue: Is Facebook making us lonely?]

Several years ago, Nick Epley, a psychologist at the University of Chicago’s Booth School of Business, asked commuter-train passengers to make a prediction: How would they feel if asked to spend the ride talking with a stranger? Most participants predicted that quiet solitude would make for a better commute than having a long chat with someone they didn’t know. Then Epley’s team created an experiment in which some people were asked to keep to themselves, while others were instructed to talk with a stranger (“The longer the conversation, the better,” participants were told). Afterward, people filled out a questionnaire. How did they feel? Despite the broad assumption that the best commute is a silent one, the people instructed to talk with strangers actually reported feeling significantly more positive than those who’d kept to themselves. “A fundamental paradox at the core of human life is that we are highly social and made better in every way by being around people,” Epley said. “And yet over and over, we have opportunities to connect that we don’t take, or even actively reject, and it is a terrible mistake.”

Researchers have repeatedly validated Epley’s discovery. In 2020, the psychologists Seth Margolis and Sonja Lyubomirsky, at UC Riverside, asked people to behave like an extrovert for one week and like an introvert for another. Subjects received several reminders to act “assertive” and “spontaneous” or “quiet” and “reserved” depending on the week’s theme. Participants said they felt more positive emotions at the end of the extroversion week and more negative emotions at the end of the introversion week. Our modern economy, with its home-delivery conveniences, manipulates people into behaving like agoraphobes. But it turns out that we can be manipulated in the opposite direction. And we might be happier for it.

Our “mistaken” preference for solitude could emerge from a misplaced anxiety that other people aren’t that interested in talking with us, or that they would find our company bothersome. “But in reality,” Epley told me, “social interaction is not very uncertain, because of the principle of reciprocity. If you say hello to someone, they’ll typically say hello back to you. If you give somebody a compliment, they’ll typically say thank you.” Many people, it seems, are not social enough for their own good. They too often seek comfort in solitude, when they would actually find joy in connection.

Despite a consumer economy that seems optimized for introverted behavior, we would have happier days, years, and lives if we resisted the undertow of the convenience curse—if we talked with more strangers, belonged to more groups, and left the house for more activities.

The AI Century

The anti-social century has been bad enough: more anxiety and depression; more “need for chaos” in our politics. But I’m sorry to say that our collective detachment could still get worse. Or, to be more precise, weirder.

In May of last year, three employees of OpenAI, the artificial-intelligence company, sat onstage to introduce ChatGPT’s new real-time conversational-speech feature. A research scientist named Mark Chen held up a phone and, smiling, started speaking to it.

“Hey, ChatGPT, I’m Mark. How are you?” Mark said.

“Hello, Mark!” a cheery female voice responded.

“Hey, so I’m onstage right now,” Mark said. “I’m doing a live demo, and frankly I’m feeling a little bit nervous. Can you help me calm my nerves a little bit?”

“Oh, you’re doing a live demo right now?” the voice replied, projecting astonishment with eerie verisimilitude. “That’s awesome! Just take a deep breath and remember: You’re the expert here.”

Mark asked for feedback on his breathing, then panted loudly, like someone who’d just finished a marathon.

“Whoa, slow!” the voice responded. “Mark, you’re not a vacuum cleaner!” Out of frame, the audience laughed. Mark tried breathing audibly again, this time more slowly and deliberately.

“That’s it,” the AI responded. “How do you feel?”

“I feel a lot better,” Mark said. “Thank you so much.”

AI’s ability to speak naturally might seem like an incremental update, as subtle as a camera-lens refinement on a new iPhone. But according to Nick Epley, fluent speech represents a radical advancement in the technology’s ability to encroach on human relationships.

“Once an AI can speak to you, it’ll feel extremely real,” he said, because people process spoken word more intimately and emotionally than they process text. For a study published in 2020, Epley and Amit Kumar, a psychologist at the University of Texas at Austin, randomly assigned participants to contact an old friend via phone or email. Most people said they preferred to send a written message. But those instructed to talk on the phone reported feeling “a significantly stronger bond” with their friend, and a stronger sense that they’d “really connected,” than those who used email.

Speech is rich with what are known as “paralinguistic cues,” such as emphasis and intonation, which can build sympathy and trust in the minds of listeners. In another study, Epley and the behavioral scientist Juliana Schroeder found that employers and potential recruiters were more likely to rate candidates as “more competent, thoughtful, and intelligent” when they heard a why-I’m-right-for-this-job pitch rather than read it.

Even now, before AI has mastered fluent speech, millions of people are already forming intimate relationships with machines, according to Jason Fagone, a journalist who is writing a book about the emergence of AI companions. Character.ai, the most popular platform for AI companions, has tens of millions of monthly users, who spend an average of 93 minutes a day chatting with their AI friend. “No one is getting duped into thinking they’re actually talking to humans,” Fagone told me. “People are freely choosing to enter relationships with artificial partners, and they’re getting deeply attached anyway, because of the emotional capabilities of these systems.” One subject in his book is a young man who, after his fiancée’s death, engineers an AI chatbot to resemble his deceased partner. Another is a bisexual mother who supplements her marriage to a man with an AI that identifies as a woman.

If you find the notion of emotional intercourse with an immaterial entity creepy, consider the many friends and family members who exist in your life mainly as words on a screen. Digital communication has already prepared us for AI companionship, Fagone said, by transforming many of our physical-world relationships into a sequence of text chimes and blue bubbles. “I think part of why AI-companion apps have proven so seductive so quickly is that most of our relationships already happen exclusively through the phone,” he said.

Epley sees the exponential growth of AI companions as a real possibility. “You can set them up to never criticize you, never cheat on you, never have a bad day and insult you, and to always be interested in you.” Unlike the most patient spouses, they could tell us that we’re always right. Unlike the world’s best friend, they could instantly respond to our needs without the all-too-human distraction of having to lead their own life.

“The horrifying part, of course, is that learning how to interact with real human beings who can disagree with you and disappoint you” is essential to living in the world, Epley said. I think he’s right. But Epley was born in the 1970s. I was born in the 1980s. People born in the 2010s, or the 2020s, might not agree with us about the irreplaceability of “real human” friends. These generations may discover that what they want most from their relationships is not a set of people, who might challenge them, but rather a set of feelings—sympathy, humor, validation—that can be more reliably drawn out from silicon than from carbon-based life forms. Long before technologists build a superintelligent machine that can do the work of so many Einsteins, they may build an emotionally sophisticated one that can do the work of so many friends.

The Next 15 Minutes

The anti-social century is as much a result of what’s happened to the exterior world of concrete and steel as of the advances inside our phones. The decline of government investments in what Eric Klinenberg calls “social infrastructure”—public spaces that shape our relationship to the world—may have begun in the latter part of the 20th century, but it has continued in the 21st. That has arguably affected nearly everyone, but less advantaged Americans most of all.

“I can’t tell you how many times I’ve gone to poor neighborhoods in big cities, and the community leaders tell me the real crisis for poor teenagers is that there’s just not much for them to do anymore, and nowhere to go,” Klinenberg told me. “I’d like to see the government build social infrastructure for teenagers with the creativity and generosity with which video-game companies build the toys that keep them inside. I’m thinking of athletic fields, and public swimming pools, and libraries with beautiful social areas for young people to hang out together.”

Improved public social infrastructure would not solve all the problems of the anti-social century. But degraded public spaces—and degraded public life—are in some ways the other side of all our investments in video games and phones and bigger, better private space. Just as we needed time to see the invisible emissions of the Industrial Revolution, we are only now coming to grips with the negative externalities of a phonebound and homebound world. The media theorist Marshall McLuhan once said of technology that every augmentation is also an amputation. We chose our digitally enhanced world. We did not realize the significance of what was being amputated.

But we can choose differently. In his 2015 novel, Seveneves, Neal Stephenson coined the term Amistics to describe the practice of carefully selecting which technologies to accept. The word is a reference to the Amish, who generally shun many modern innovations, including cars and television. Although they are sometimes considered strictly anti-modern, many Amish communities have refrigerators and washing machines, and some use solar power. Instead of dismissing all technology, the Amish adopt only those innovations that support their religious and communal values. In his 1998 dissertation on one Amish community, Tay Keong Tan, then a Ph.D. candidate at Harvard, quoted a community member as saying that they didn’t want to adopt TV or radio, because those products “would destroy our visiting practices. We would stay at home with the television or radio rather than meet with other people.”

If the Amish approach to technology is radical in its application, it recognizes something plain and true: Although technology does not have values of its own, its adoption can create values, even in the absence of a coordinated effort. For decades, we’ve adopted whatever technologies removed friction or increased dopamine, embracing what makes life feel easy and good in the moment. But dopamine is a chemical, not a virtue. And what’s easy is not always what’s best for us. We should ask ourselves: What would it mean to select technology based on long-term health rather than instant gratification? And if technology is hurting our community, what can we do to heal it?

A seemingly straightforward prescription is that teenagers should choose to spend less time on their phone, and their parents should choose to invite more friends over for dinner. But in a way, these are collective-action problems. A teenager is more likely to get out of the house if his classmates have already made a habit of hanging out. That teen’s parents are more likely to host if their neighbors have also made a habit of weekly gatherings. There is a word for such deeply etched communal habits: rituals. And one reason, perhaps, that the decline of socializing has synchronized with the decline of religion is that nothing has proved as adept at inscribing ritual into our calendars as faith.

“I have a view that is uncommon among social scientists, which is that moral revolutions are real and they change our culture,” Robert Putnam told me. In the early 20th century, a group of liberal Christians, including the pastor Walter Rauschenbusch, urged other Christians to expand their faith from a narrow concern for personal salvation to a public concern for justice. Their movement, which became known as the Social Gospel, was instrumental in passing major political reforms, such as the abolition of child labor. It also encouraged a more communitarian approach to American life, which manifested in an array of entirely secular congregations that met in union halls and community centers and dining rooms. All of this came out of a particular alchemy of writing and thinking and organizing. No one can say precisely how to change a nation’s moral-emotional atmosphere, but what’s certain is that atmospheres do change. Our smallest actions create norms. Our norms create values. Our values drive behavior. And our behaviors cascade.

The anti-social century is the result of one such cascade, of chosen solitude, accelerated by digital-world progress and physical-world regress. But if one cascade brought us into an anti-social century, another can bring about a social century. New norms are possible; they’re being created all the time. Independent bookstores are booming—the American Booksellers Association has reported more than 50 percent growth since 2009—and in cities such as New York City and Washington, D.C., many of them have become miniature theaters, with regular standing-room-only crowds gathered for author readings. More districts and states are banning smartphones in schools, a national experiment that could, optimistically, improve children’s focus and their physical-world relationships. In the past few years, board-game cafés have flowered across the country, and their business is expected to nearly double by 2030. These cafés buck an 80-year trend. Instead of turning a previously social form of entertainment into a private one, they turn a living-room pastime into a destination activity. As sweeping as the social revolution I’ve described might seem, it’s built from the ground up by institutions and decisions that are profoundly within our control: as humble as a café, as small as a new phone locker at school.

When Epley and his lab asked Chicagoans to overcome their preference for solitude and talk with strangers on a train, the experiment probably didn’t change anyone’s life. All it did was marginally improve the experience of one 15-minute block of time. But life is just a long set of 15-minute blocks, one after another. The way we spend our minutes is the way we spend our decades. “No amount of research that I’ve done has changed my life more than this,” Epley told me. “It’s not that I’m never lonely. It’s that my moment-to-moment experience of life is better, because I’ve learned to take the dead space of life and make friends in it.”

This article appears in the February 2025 print edition with the headline “The Anti-Social Century.”

The Rise of John Ratcliffe

The Atlantic

In September 2016, the CIA sent a classified memo to the FBI, which was investigating Russian interference in the presidential election. According to Russian intelligence sources, Hillary Clinton had approved a plan to publicly tie Donald Trump to the country’s hack of the Democratic National Committee. The Russians reportedly said that Clinton wanted to distract the public from the scandal over her use of a private email server while she was secretary of state.

As secret tips from spies go, this one was not earth-shattering. FBI agents didn’t need the CIA to tell them that Clinton was painting Trump as an ally of the Kremlin—her campaign chair was on CNN saying just that. Trump was also making Clinton’s case for her: In late July, he had publicly encouraged the Russians to hack her email, which they then tried to do.

The CIA memo may have been obvious and not particularly useful. But it did contain “sensitive information that could be source revealing,” its authors cautioned, so the information was limited to those with a “need-to-know” status and “should not be released in any form.” Exposing human sources—spies—compromises intelligence gathering and can sometimes get them killed. For four years, the document’s stewards complied and kept it secret. Then it caught the attention of John Ratcliffe, President Trump’s director of national intelligence.

[Read: Clinton: Just trust me on this one]

Ratcliffe had been a divisive pick for the nation’s top intelligence adviser, made late in Trump’s term. His critics said he lacked sufficient national-security experience and was a partisan warrior. As a freshman Republican congressman from Texas, he had risen to national prominence by suggesting a theory, during committee hearings and television appearances, that Clinton had engineered the FBI’s investigation into the Trump campaign’s possible connections to Russian interference. (Ratcliffe surely knew that she had not, because this had been exhaustively established by multiple investigations, including one led by Senate Republicans.)

In late September 2020, weeks before voters would choose between Trump and Joe Biden, Ratcliffe declassified and released the CIA memo, along with some notes from an intelligence briefing given to President Barack Obama. He claimed that he was responding to requests from Congress to shed light on the FBI’s Russia investigation, but the documents didn’t provide much new information.

Intelligence officials were appalled. History had repeatedly, painfully, shown that politics and intelligence were a dangerous mix, and as the DNI, Ratcliffe was expected to avoid partisan behavior and safeguard sources and methods. Also, officials warned, the Russians might have wanted that memo to be released; even four years on, anything mentioning Clinton, Russia, and Trump was politically combustible and potentially disruptive to the election. Gina Haspel, then the director of the CIA (and a Trump appointee), opposed the document’s release. So did officials at the National Security Agency.

But to Trump and some of his advisers, the memo had a certain expedience. The president seized on it as new evidence of Clinton’s hidden hand in the “Russia hoax,” a subject that reliably caused him to rage against his supposed enemies inside the intelligence agencies.

[Read: Trump vs. the spies]

“It is imperative that the American people now learn what then–Vice President Joe Biden knew about this conspiracy and when he knew it,” the Trump campaign’s communications director said in a statement at the time. “Biden must give a full accounting of his knowledge and his conversations about Clinton’s scheme, which was known to the highest reaches of his administration.”

Trump himself made passing reference to the intelligence in his first debate with Biden, accusing Clinton of “a whole big con job” and the intelligence community of “spying on my campaign.”

Ratcliffe had cherry-picked just the thing to feed Trump’s fixation on “deep state” chicanery and malfeasance. The act was nakedly political. And it surprised no one.

Ratcliffe’s appeal to Trump has always been clear: He’s a political operator willing to push the boundaries of a historically apolitical position in a manner that serves the president’s interests. In November, Trump nominated Ratcliffe for an even more important job than the previous one: CIA director. The question likely to hang over his tenure is how much further he will go to enable Trump’s attacks on the intelligence community.

When Trump nominated Ratcliffe as the DNI in 2019, he gave him marching orders to “rein in” the forces that the president believed were undermining him. “As I think you’ve all learned, the intelligence agencies have run amok,” Trump told reporters. Ratcliffe would get them back in line. But lawmakers were wary of appointing such a staunch partisan, and amid concerns about his experience, Democrats and key Republicans questioned whether he had exaggerated his credentials, something Ratcliffe denied. After only five days, Ratcliffe (who declined to be interviewed for this article) withdrew his candidacy. Trump nominated him again in 2020, and he was narrowly confirmed along party lines, 49–44. He received more votes in opposition than any DNI in the office’s 15-year history.

[Read: Ratcliffe’s withdrawal reveals Trump still doesn’t understand appointments]

When Trump named Ratcliffe as his pick for CIA director, he again made his expectations clear: He praised Ratcliffe for exposing alleged abuses by the FBI and former intelligence officials, and for showing “fake Russian collusion to be a Clinton campaign operation.” But this time, the response in Washington has been muted.

Having served as the DNI for eight months, Ratcliffe is now better qualified to run an intelligence agency. He also benefits from comparison with Trump’s other choices for top national-security positions: at the Pentagon, Pete Hegseth, who has been accused of sexual assault and alcohol abuse (he has denied the allegations); at the FBI, Kash Patel, a fervent Trump supporter who has threatened to investigate the president’s critics, including journalists; and for the DNI, Tulsi Gabbard, a former congresswoman who has expressed sympathy for some of the world’s most notorious anti-American dictators, including Vladimir Putin and Bashar al-Assad.

Compared with these selections, Ratcliffe looks like an elder statesman, and he has essentially been anointed: The Senate will almost certainly confirm him, which will make Ratcliffe the only person ever to have served as both the DNI and the director of the CIA. Several U.S. and allied intelligence officials told me that they would welcome this development, given the alternatives. Patel had been on Trump’s shortlist to run the CIA, some reminded me.

[Read: Trump’s ‘deep state’ revenge]

But the question of where Ratcliffe’s limits lie is even more salient in Trump’s second term. Though the DNI technically ranks higher than the director of the CIA, the latter is the more powerful post. The DNI is largely a managerial job; the CIA director is operational. From Langley, Ratcliffe would control covert intelligence activity. He could learn the locations and identities of spies. The CIA is also the primary interlocutor for foreign intelligence services, which share information that could implicate their sources if exposed. Several foreign intelligence officials have recently told me that they are taking steps to limit how much sensitive intelligence they share with the Trump administration, for fear that it might be leaked or used for political ends.

Some U.S. officials fear that Trump could direct the CIA to undertake illegal activities, such as aiding paramilitary forces inside the United States to secure the border, or clandestinely spying on Americans, knowing that the president would enjoy criminal immunity for official acts thanks to a recent Supreme Court opinion. These are extreme examples, and Trump would surely face internal resistance. But Ratcliffe has demonstrated that he’s willing to break norms and traditions. How would he respond if the president asked—or ordered—him to do something more drastic than declassify documents?

Though Trump has turned to Ratcliffe twice to “rein in” the deep state, Ratcliffe’s own political origin story is actually rooted in the security state’s expansion. After graduating from Notre Dame in 1986, when he was only 20, Ratcliffe went to law school and then into private practice in Texas. “But something was missing,” he told senators at his DNI confirmation hearing. On September 11, 2001, Ratcliffe said, he was at work in a high-rise office building in Dallas that “looked a whole lot like the ones in New York that were under attack”—and he wondered, in the months that followed, how he might devote his time to more meaningful work.

Ratcliffe had gotten to know Matt Orwig, the U.S. attorney for the Eastern District of Texas and a George W. Bush appointee. Orwig needed someone to run a joint terrorism task force, one of the dozens set up after the attacks to coordinate federal and regional security efforts. The goal was not only to prosecute terrorism crimes but to prevent them from happening. Ratcliffe took the job in 2004.

“The whole law-enforcement structure was being remade,” Orwig told me. “There was a lot of information flooding in from different authorities. It was a really big job.” In 2007, Orwig stepped down, and Ratcliffe became U.S. attorney for 11 months. Afterward, he returned to private practice, running the Dallas office of a firm he co-founded with John Ashcroft, Bush’s first attorney general.

Ashcroft became Ratcliffe’s political mentor, an association that seems ironic in retrospect. Ashcroft was in many ways an architect of the powerful national-security bureaucracy that Trump and Ratcliffe now rail against. After 9/11, the attorney general oversaw and approved controversial applications of the PATRIOT Act and other new authorities, including secret wiretapping of phone calls involving Americans. Such counterterrorism measures enhanced the powers of the Justice Department and the intelligence community, and occasionally encroached on civil liberties that Americans had long taken for granted.

Ratcliffe and Ashcroft shared a deeply conservative political outlook, and Ashcroft admired the younger attorney’s commitment to community service. Ratcliffe was also serving as the mayor of Heath, Texas, a bedroom community where he lived with his wife and two children. Ashcroft thought Ratcliffe was suited for national leadership. “We decided he should run for Congress,” Ashcroft told me, and in 2014, Ratcliffe did.

Ratcliffe at his congressional-campaign headquarters in Heath, Texas, March 19, 2014 (Kim Leeson / The Washington Post / Getty)

[Read: The case of John Ashcroft]

Getting to Washington would test Ratcliffe’s budding political skills. Ralph Hall, a conservative Democrat who switched to the GOP in 2004, had reliably represented the fourth congressional district, where Ratcliffe lived, since 1981. At 91, Hall was the oldest-ever member of the House of Representatives, and his voters seemed in no mood to replace him with a young upstart. But the Tea Party was elevating a new generation of conservatives who were suspicious of entrenched power, and in a bid for change that avoided taking aim at Hall’s age, Ratcliffe promised to bring “energetic leadership” to the district. “It’ll be up to the voters to decide whether or not a candidate is too old,” Ratcliffe, who was 42 years younger than Hall, told reporters at the time.

Ratcliffe picked up endorsements from conservative groups, including the Club for Growth, and eventually defeated Hall in a runoff. He was the first primary challenger to beat a Republican incumbent in Texas in 20 years. His political acumen was now beyond dispute, according to Todd Gillman, a reporter for The Dallas Morning News. “Affable. Discreet. Knife fighter,” Gillman wrote in a recent column for The Washington Post. “All of it was there to see when Ratcliffe took down the oldest member of Congress ever without coming off like a jerk.”

In Washington, Ratcliffe discovered the full extent of his talents, which included a lawyerly facility for constructing political narratives that appealed to Republicans. He fell in with fellow conservatives who were also new to Congress. Trey Gowdy, another former federal prosecutor, introduced him to his fellow South Carolinian Tim Scott. The three spent many evenings together, eating dinner and talking about their lives and political ideas.

Gowdy helped Ratcliffe raise his national profile and get Trump’s attention. At a hearing in September 2016, the congressman grilled James Comey, the FBI director, about the investigation of Hillary Clinton’s private email server, questioning whether officials had already decided that there was no prosecutable crime when they sat down to interview the presidential candidate. Ratcliffe was aggressive but not hectoring. His questions were clearly prepared, but his delivery seemed unrehearsed. He corrected Comey’s account of a chain of events in the FBI’s investigation, prompting the director to admit that he might have been misremembering. It wasn’t exactly a gotcha moment, but Ratcliffe showed that he could confuse an adversary with a blizzard of facts.

After Ratcliffe finished with Comey, Gowdy passed him a handwritten note: “100 percent A+.”

“That was really a moment for me where I thought, You know, I’m really where I’m supposed to be,” Ratcliffe recalled in 2021 on a podcast that Gowdy hosts.

Ratcliffe credited Gowdy with steering his career. “You said to me, ‘Johnny, focus on what you do well, get better at it, and shut up about the rest.’ And I literally followed that advice. In other words, only go on TV to talk about things that you know about. Don’t try and be a master of all trades. Do the things that you do really well and people will notice, and it will serve you well. And it did.”

Gowdy helped make Ratcliffe a go-to interrogator when congressional committees wanted to quiz the FBI or poke holes in the Russia investigation. Ratcliffe stuck to a theme of pernicious bias against Trump. He suggested that political animus, not genuine concern about foreign-intelligence threats, was the impetus behind the Russia probe. He also suggested that the CIA—the agency he is about to lead—may have kicked off the investigation. (It did not, and this is among the fringiest views that Ratcliffe has flirted with.)

[Read: Don’t let the Russia probe become the new Benghazi]

Ratcliffe’s performances impressed Trump. But although he, Gowdy, and Scott are deeply conservative, they are not MAGA Republicans. They seem to share Trump’s antipathy toward the federal bureaucracy, but their political ideas were shaped by the forces that gave rise to Trump, not by the man himself. Gowdy, who left Congress in 2019, got on Trump’s bad side for not embracing his conspiracy theories about Democrats spying on his campaign, and Scott competed against Trump in the GOP’s 2024 presidential primary.

As for Ratcliffe, he has more fiercely defended Trump as a victim of an unfair system than championed him as a hero sent to fix it. In one of the most-watched hearings of the Trump era, Ratcliffe lit into Special Counsel Robert Mueller and the language of his final report, which stated that although the investigation “does not conclude that the President committed a crime, it also does not exonerate him.” That was an unfair standard no American should face, Ratcliffe insisted. “Donald Trump is not above the law,” he thundered. “But he damn sure shouldn’t be below the law.”

It was a principled position, and perhaps a reflection of sincere disquiet about the politicization of law enforcement and the intelligence community. Ashcroft told me that he shares such concerns and speaks with Ratcliffe four or five times a year about reforming the system. But when Ratcliffe takes these stances, he also gives credence to Trump’s refrains about “Crooked Hillary” and the deep state. And he makes little effort to distinguish Trump’s critique from his own.

Jim Jordan speaks to Ratcliffe during a House Judiciary Committee hearing, December 9, 2019. (Zach Gibson / Getty)

[Read: Republicans take their shot at Mueller—and narrowly miss]

Ratcliffe probably wouldn’t have become the director of national intelligence if not for another pro-Trump partisan, Richard Grenell. The then-ambassador to Germany was also serving as the acting intelligence director when Trump nominated Ratcliffe for the second time, in 2020. The president essentially forced the Senate to choose between the two. Grenell had long been loathed and even feared in some quarters of Congress for his heated rhetoric and vicious social-media attacks. Suddenly, Ratcliffe seemed like the less political option.

Ratcliffe took office less than six months before the 2020 election. The intelligence agencies he now led were on guard against foreign governments trying to skew political contests with misleading social-media posts and divisive propaganda. Russia, once again, was a top concern.

Nothing angered Trump like talk of Russia trying to help him win an election. His aides had learned to avoid the subject. The president had identified China as the biggest strategic threat to the United States, an assessment that many Democrats and Republicans shared, Ratcliffe among them. But career intelligence analysts doubted that China intended to disrupt the election. What Beijing really wanted was stability in its relationship with Washington, they argued. Trying to help one candidate win, as Russia had in 2016, could backfire.

[Read: Trump’s intelligence war is also an election story]

In August 2020, the intelligence community produced a classified assessment of election threats. Then Ratcliffe intervened, analysts have said, and inserted a warning about China that was an “outrageous misrepresentation of their analysis,” according to a later report by an intelligence ombudsman.

The DNI typically does not help write intelligence assessments, because he is a political appointee, and so his involvement could present a conflict of interest. But Ratcliffe argued that although his intervention was unusual, it was not unprecedented, nor was it inappropriate. He maintained that the analysts were thinking too narrowly: China’s well-documented efforts to lobby state and local officials, and to steal corporate intellectual property and classified government information, were aimed at achieving political outcomes. That made them, in effect, a kind of election interference. The ombudsman also found that the analysts working on China and the ones working on Russia used different definitions of “influence” and “interference.” Ratcliffe argued that such discrepancies could create the false impression that Russia was trying to affect the U.S. election but China was not.

“I know my conclusions are right, based on the intelligence that I see,” he said, according to the ombudsman. “Many analysts think I am going off the script. They don’t realize that I did it based on the intelligence.”

Ratcliffe’s defenders say that his role as the DNI obligated him to speak up, even if that meant straying into red-hot political topics. “What I saw was him reflecting a value of transparency and informing the public,” said one U.S. intelligence official who worked for Ratcliffe when he was the DNI and asked not to be identified by name. “Sometimes he would challenge assessments and assumptions, I think in the interest of seeing if they would hold. He is an attorney by trade. You kind of have to keep that in mind when you brief him.”

Ratcliffe wasn’t the only one to gauge the threat from China more broadly: Two senior intelligence officers also expressed views on China’s interference activities that were in line with Ratcliffe’s assessment. But Ratcliffe didn’t raise the same level of concern about Russia, which many analysts thought posed the more direct threat to the election. He framed the issue, not for the first time, in a way that lent support to Trump’s political argument. And because the DNI was making that case, the ostensibly objective work of intelligence now had a partisan gloss.

Ratcliffe leaving a meeting with Senate Minority Whip John Thune after being nominated to be the CIA director, December 4, 2024 (Andrew Harnik / Getty)

[Read: Trump calls out election meddling—by China]

When announcing Ratcliffe’s nomination for CIA director, Trump indicated what he valued most in his pick: From “exposing” the Russia investigation as the alleged handiwork of the Clinton campaign to catching “the FBI’s abuse of Civil Liberties at the FISA Court,” John Ratcliffe “has always been a warrior for Truth and Honesty with the American public,” Trump wrote in a social-media post. The reference to the Foreign Intelligence Surveillance Court was shorthand for one of Trump’s elastic theories about how Democrats had spied on his 2016 campaign.

He also lauded Ratcliffe for publicly refuting 51 former intelligence officers who had claimed in a letter that the 2020 discovery of emails on a laptop purporting to belong to Joe Biden’s son Hunter had “all the classic earmarks of a Russian information operation.” Ratcliffe was right about that one: No evidence linked Hunter Biden’s laptop to a Russian plot to harm his father. But the letter was an act of free speech and an expression of opinion by former officials and experts—not something that the DNI traditionally makes his business.

In the four years he has been out of government, Ratcliffe has remained an enthusiastic critic of the intelligence community. He co-authored a September 2023 op-ed in The Wall Street Journal with a former aide, warning of “a dangerous trend inside the CIA to politicize intelligence on China, and to suppress dissenting views that stray from the company line.” He was particularly worried about resistance to investigating the origins of the coronavirus pandemic. Ratcliffe believes the once-fringe theory that the virus likely originated in a laboratory in China, a view that has gained more respectability thanks in part to U.S. intelligence.

[Read: The coronavirus conspiracy boom]

Tim Scott told me that Ratcliffe’s controversial positions have aged well. “Some of the time he stood alone or in the minority and took a scathing rebuke from the intellectuals in our country,” the senator said. “I think the truth of the matter is, he was right—about the origins of COVID, the Biden laptop, and Russiagate.”

In other scenarios, however—the memo about the Clinton campaign and Russian hacking comes to mind—Ratcliffe conducted himself less like an intelligence adviser, who is supposed to help the president make a decision, and more like a litigator doing his best to help his client win an argument, or a political pugilist eager to score points.

Still, unlike some others in Trump’s orbit—most notably Kash Patel—Ratcliffe has shown that he does have limits. Shortly after the 2020 election, Trump offered Ratcliffe the job that he had long wanted, and that his friend Trey Gowdy had said he was perfect for: attorney general. The president was prepared to fire Bill Barr, who’d rejected Trump’s baseless notions of widespread voter fraud. According to an account in Michael Bender’s book, Frankly, We Did Win This Election: The Inside Story of How Trump Lost, Ratcliffe had privately told Trump that no intelligence suggested that foreign governments had hacked voting machines or changed the outcome of the election. If he became attorney general, he’d be expected to advocate for an idea he knew wasn’t true. Ratcliffe declined Trump’s offer.

In this respect, Ratcliffe might seem like one of the so-called adults in the room during the first Trump administration—the officials who slow-rolled orders or even tried to block them as a check against what they considered to be the president’s worst impulses. But people who know Ratcliffe told me that this was not his profile. He is on board with Trump’s policies and doesn’t believe that regulating the president is his job. He won’t cross his boss, either. To this day, nearly eight years after the CIA, FBI, and NSA reached a unanimous, unclassified assessment on Russian election interference in 2016, Ratcliffe has never said publicly whether he agrees with one of its key findings: that the Russians were trying to help Trump win.

[Read: The U.S. needs to face up to its long history of election meddling]

If he disagreed with that finding, he surely would have said so, just as he has disputed other intelligence judgments he finds lacking or wrong. But his silence is telling. If he does agree, and says so publicly, he will not be the next director of the CIA.

At his confirmation hearing, senators are likely to ask Ratcliffe whether he plans to further Trump’s interests. Not the president’s policies—all CIA directors do that—but his political preferences, prejudices, and vendettas. Only Ratcliffe knows the answer to this question. But alone among Trump’s picks to head the national-security agencies, he comes with a clear track record in the role.