
The Rise of the Brown v. Board of Education Skeptics

The Atlantic

www.theatlantic.com › magazine › archive › 2025 › 04 › brown-v-board-of-education-integrated-noliwe-rooks-book › 681766

On May 17, 1954, a nervous 45-year-old lawyer named Thurgood Marshall took a seat in the Supreme Court’s gallery. The founder and director of the NAACP Legal Defense and Educational Fund hoped to learn that he had prevailed in his pivotal case. When Chief Justice Earl Warren announced the Court’s opinion in Brown v. Board of Education, Marshall could not have known that he had also won what would come to be regarded as the most significant legal decision in American history. Hearing Warren declare “that in the field of public education the doctrine of ‘separate but equal’ has no place” delivered Marshall into a state of euphoria. “I was so happy, I was numb,” he said. After exiting the courtroom, he joyously swung a small boy atop his shoulders and galloped around the austere marble hall. Later, he told reporters, “It is the greatest victory we ever had.”

For Marshall, the “we” who triumphed in Brown surely referred not only, or even primarily, to himself and his Legal Defense Fund colleagues, but to the entire Black race, on whose behalf they’d toiled. And Black Americans did indeed find Brown exhilarating. Harlem’s Amsterdam News, echoing Marshall, called Brown “the greatest victory for the Negro people since the Emancipation Proclamation.” W. E. B. Du Bois stated, “I have seen the impossible happen. It did happen on May 17, 1954.” When Oliver Brown learned of the outcome in the lawsuit bearing his surname, he gathered his family near, and credited divine providence: “Thanks be to God for this.” Martin Luther King Jr. encouraged Montgomery’s activists in 1955 by invoking Brown: “If we are wrong, then the Supreme Court of this nation is wrong. If we are wrong, the Constitution of the United States is wrong. If we are wrong, God Almighty is wrong.” Many Black people viewed the opinion with such awe and reverence that for years afterward, they threw parties on May 17 to celebrate Brown’s anniversary.

Over time, however, some began questioning what exactly made Brown worthy of celebration. In 1965, Malcolm X in his autobiography voiced an early criticism of Brown: It had yielded precious little school desegregation over the previous decade. Calling the decision “one of the greatest magical feats ever performed in America,” he contended that the Court’s “masters of legal phraseology” had used “trickery and magic that told Negroes they were desegregated—Hooray! Hooray!—and at the same time … told whites ‘Here are your loopholes.’ ”

[Read: The children who desegregated America’s schools]

But that criticism paled in comparison with the anti-Brown denunciation in Stokely Carmichael and Charles Hamilton’s Black Power: The Politics of Liberation two years later. They condemned not Brown’s implementation, but its orientation. The fundamental aim of integration had to be abandoned, they maintained, because it was driven by the “assumption that there is nothing of value in the black community.”

To sprinkle black children among white pupils in outlying schools is at best a stop-gap measure. The goal is not to take black children out of the black community and expose them to white middle-class values; the goal is to build and strengthen the black community.

Although Black skeptics of the integration ideal originated on the far left, Black conservatives—including the economist Thomas Sowell—have more recently ventured related critiques. The most prominent example is Marshall’s successor on the Supreme Court, Justice Clarence Thomas. In 1995, four years after joining the Court, Thomas issued a blistering opinion that opened, “It never ceases to amaze me that the courts are so willing to assume that anything that is predominantly black must be inferior.”

Desperate efforts to promote school integration, Thomas argued, stemmed from the misperception that identifiably Black schools were somehow doomed to fail because of their racial composition. “There is no reason to think that black students cannot learn as well when surrounded by members of their own race as when they are in an integrated environment,” he wrote. Taking a page from Black Power’s communal emphasis, Thomas argued that “black schools can function as the center and symbol of black communities, and provide examples of independent black leadership, success, and achievement.” In a 2007 opinion, he extolled Washington, D.C.’s all-Black Dunbar High School—which sent dozens of graduates to the Ivy League and its ilk during the early 20th century—as a paragon of Black excellence.

In the 2000s, as Brown crept toward its 50th anniversary, Derrick Bell of the NYU School of Law went so far as to allege that the opinion had been wrongly decided. For Bell, who had sharpened his skills as an LDF lawyer, Brown’s “integration ethic centralizes whiteness. White bodies are represented as somehow exuding an intrinsic value that percolates into the ‘hearts and minds’ of black children.” Warren’s opinion in the case should have affirmed Plessy v. Ferguson’s “separate but equal” regime, Bell wrote, but it should have insisted on genuine equality of expenditures, rather than permitting the sham equality of yore that consigned Black students to shoddy classrooms in dilapidated buildings. He acknowledged, though, that his jaundiced account put him at odds with dominant American legal and cultural attitudes: “The Brown decision,” he noted, “has become so sacrosanct in law and in the beliefs of most Americans that any critic is deemed wrongheaded, even a traitor to the cause.”

In her new book, Integrated: How American Schools Failed Black Children, Noliwe Rooks adds to a growing literature that challenges the portrayal of the decision as “a significant civil rights–era win.” Rooks, the chair of the Africana-studies department at Brown University, offers an unusual blend of historical examination and family memoir that generally amplifies the concerns articulated by prior desegregation discontents. The result merits careful attention not for its innovative arguments, but as an impassioned, arresting example of how Brown skepticism, which initially gained traction on the fringes of Black life, has come to hold considerable appeal within the Black intellectual mainstream.

As recently as midway through the first Trump administration, Rooks would have placed herself firmly in the traditional pro-Brown camp, convinced that addressing racial inequality in education could best be pursued through integration. But traveling a few years ago to promote a book that criticized how private schools often thwart meaningful racial integration, she repeatedly encountered audience members who disparaged her core embrace of integration. Again and again, she heard from Black parents that “the trauma their children experienced in predominantly white schools and from white teachers was sometimes more harmful than the undereducation occurring in segregated schools.”

[From the May 2018 issue: The report on race that shook America]

The onslaught dislodged Rooks’s faith in the value of contemporary integration, and even of Brown itself. She now exhibits the convert’s zeal. Brown, she writes, should be viewed as “an attack on Black schools, politics, and communities, which meant it was an attack on the pillars of Black life.” For some Black citizens, the decision acted as “a wrecking ball that crashed through their communities and, like a pendulum, continues to swing.”

Rooks emphasizes the plight of Black educators, who disproportionately lost their positions in Brown’s aftermath because of school consolidations. Before Brown, she argues, “Black teachers did not see themselves as just teaching music, reading, or science, but also as activists, organizers, and freedom fighters who dreamed of and fought for an equitable world for future generations”; they served as models who showed “Black children how to fight for respect and societal change.”

Endorsing one of Black Power’s analogies, she maintains that school integration meant that “as small a number as possible of Black children were, like pepper on popcorn, lightly sprinkled atop wealthy, white school environments, while most others were left behind.” Even for those ostensibly fortunate few flecks of pepper, Rooks insists, serving as the white world’s seasoning turned out to be a highly uncertain, dangerous endeavor. She uses her father’s disastrous experiences with integration to examine what she regards as the perils of the entire enterprise. After excelling in all-Black educational environments, including as an undergraduate at Howard University, Milton Rooks became one of a very small number of Black students to enroll at the Golden Gate University School of Law in the early 1960s.

Sent by his hopeful parents “over that racial wall,” Milton encountered hostility from white professors, who doubted his intellectual capacity, Rooks recounts, and “spit him back up like a piece of meat poorly digested.” She asserts that the ordeal not only prompted him to drop out of law school but also spurred his descent into alcoholism. Rooks extrapolates further, writing:

Milton’s experience reflected the trauma Black students suffered as they desegregated public schools in states above the Mason-Dixon Line, where displays of racism were often mocking, disdainful, pitying, and sword sharp in their ability to cut the unsuspecting into tiny bits. It destroyed confidence, shook will, sowed doubt, murdered souls—quietly, sure, but still as completely as could a mob of white racists setting their cowardice, rage, and anger loose upon the defenseless.

The harms that contemporary integrated educational environments inflict upon Black students can be tantamount, in her view, to the harms imposed upon the many Black students who are forced to attend monoracial, woeful urban high schools. To make this point, Rooks recounts her own struggle to correct the misplacement of her son, Jelani, in a low-level math class in Princeton, New Jersey’s public-school system during the aughts (when she taught at Princeton University). She witnessed other Black parents meet with a similar lack of support in guiding their children to the academically demanding courses that could propel them to elite colleges. In Jelani’s case, she had evidence that teachers’ “feelings were hardening against him.” He led a life of relative safety and economic privilege, and felt at ease among his white classmates and friends, she allows, even as she also stresses that what he “experienced wasn’t the violence of poverty; it was something else equally devastating”:

We knew that poor, working-class, or urban communities were not the only places where Black boys are terrorized and traumatized. We knew that the unfamiliarity of his white friends with any other Black people would one day become an issue in our home. We knew that guns were not the only way to murder a soul.

Frustrated with Princeton’s public schools, Rooks eventually enrolled Jelani in an elite private high school where, she notes, he also endured racial harassment—and from which he graduated before making his way to Amherst College.

Seven decades have now elapsed since the Supreme Court’s decision in Brown. Given the stubbornly persistent phenomenon of underperforming predominantly Black schools throughout the nation, arguing that Brown’s potential has been fully realized would be absurd. Regrettably, the Warren Court declined to advance the most powerful conception of Brown when it had the opportunity to do so: Its infamously vague “all deliberate speed” approach allowed state and local implementation to be delayed and opposed for far too long. In its turn, the Burger Court provided an emaciated conception of Brown’s meaning, one that permitted many non-southern jurisdictions to avoid pursuing desegregation programs. Rooks deftly sketches this lamentable, sobering history.

[From the May 2014 issue: Segregation now ...]

Disenchantment with Brown’s educational efficacy is thus entirely understandable. Yet to suggest that the Supreme Court did not go far enough, fast enough in galvanizing racially constructive change in American schools after Brown is one thing. To suggest that Brown somehow took a wrong turn is quite another.

Rooks does not deny that integration succeeded in narrowing the racial achievement gap. But like other Brown critics, she idealizes the era of racial segregation. Near Integrated’s conclusion, Rooks contends that “too few of us have a memory of segregated Black schools as the beating heart of vibrant Black communities, enabling students to compose lives of harmony, melody, and rhythm and sustained Black life and dignity.” But this claim gets matters exactly backwards. The brave people who bore segregation’s brunt believed that Jim Crow represented an assault on Black life and dignity, and that Brown marked a sea change in Black self-conceptions.

Desegregation’s detractors routinely elevate the glory days of D.C.’s Dunbar High School, but they refuse to heed the lessons of its most distinguished graduates. Charles Hamilton Houston—Dunbar class of 1911, who went on to become valedictorian at Amherst and the Harvard Law Review’s first Black editor—nevertheless dedicated his life to eradicating Jim Crow as an NAACP litigator and Thurgood Marshall’s mentor in his work contesting educational segregation. Sterling A. Brown—Dunbar class of 1918, who graduated from Williams College before becoming a distinguished poet and professor—nevertheless wrote the following in 1944, one decade before Brown:

Negroes recognize that the phrase “equal but separate accommodations” is a myth. They have known Jim Crow a long time, and they know Jim Crow means scorn and not belonging.

Much as they valued having talented, caring teachers, these men understood racial segregation intimately, and they detested it.

In the 1990s, Nelson B. Rivers III, an unheralded NAACP official from South Carolina, memorably heaved buckets of cold water on those who were beginning to wonder, “Was integration the right thing to do? Was it worth it? Was Brown a good decision?” Rivers dismissed such questions as “asinine,” and continued:

To this day, I can remember bus drivers pulling off and blowing smoke in my mother’s face. I can remember the back of the bus, colored water fountains … I can hear a cop telling me, “Take your black butt back to nigger town.” What I tell folk … is that there are a lot of romanticists now who want to take this trip down Memory Lane, and they want to go back, and I tell the young people that anybody who wants to take you back to segregation, make sure you get a round-trip ticket because you won’t stay.

Nostalgia for the pre-Brown era would not exercise nearly so powerful a grip on Black America today if its adherents focused on its detailed, pervasive inhumanities rather than relying on gauzy glimpses.

No one has pressed this point more vividly than Robert L. Carter, who worked alongside Marshall at the LDF before eventually becoming a distinguished federal judge. He understood that to search for Brown’s impact exclusively in the educational domain is mistaken. Instead, he emphasized that Brown fomented a broad-gauge racial revolution throughout American public life. Although Chief Justice Warren wrote the opinion to apply formally only to education, its attack on segregation has, paradoxically, been most efficacious beyond that original context.

[From the October 1967 issue: Jonathan Kozol’s ‘Where Ghetto Schools Fail’]

“The psychological dimensions of America’s race relations problem were completely recast” by Brown, Carter wrote. “Blacks were no longer supplicants seeking, pleading, begging to be treated as full-fledged members of the human race; no longer were they appealing to morality, to conscience, to white America’s better instincts,” he noted. “They were entitled to equal treatment as a right under the law; when such treatment was denied, they were being deprived—in fact robbed—of what was legally theirs. As a result, the Negro was propelled into a stance of insistent militancy.”

Even within the educational sphere, though, it is profoundly misguided to claim that Black students who attend solid, meaningfully integrated schools encounter environments as corrosive as, or worse than, those facing students trapped in ghetto schools. This damned-if-you-do, damned-if-you-don’t analysis suggests an entire cohort stuck in the same boat, when its many members are not even in the same ocean. The Black student marooned in a poor and violent neighborhood, with reason to fear actual murder, envies the Black student attending a rigorous, integrated school who worries about metaphorical “soul murder.” All struggles are not created equal.

This article appears in the April 2025 print edition with the headline “Was Integration the Wrong Goal?”

Grover Cleveland’s Warning for Trump

The Atlantic

www.theatlantic.com › newsletters › archive › 2025 › 01 › grover-clevelands-warning-for-trump › 681425

This is an edition of Time-Travel Thursdays, a journey through The Atlantic’s archives to contextualize the present and surface delightful treasures.

Donald Trump is now the second president to return to the White House after losing a bid for reelection. The first was Grover Cleveland, who ran successful campaigns in 1884 and 1892. I spoke with my colleague Russell Berman about his recent story on Cleveland’s legacy, the ways Trump’s win may reshape it, and how an electoral loss can become a political advantage.

Stephanie Bai: In your recent story, you wrote that some of Grover Cleveland’s fans aren’t too pleased with the comparisons being made between him and Donald Trump. But one similarity that struck me is how both Trump and Cleveland campaigned on the image of being political outsiders to connect with working-class voters—even though Cleveland co-owned a successful law practice and Trump’s return to office has been supported by titans of industry.

Did their initial electoral loss and the subsequent four-year gap between campaigns give any credence to their political-outsider narratives?

Russell Berman: Certainly for Trump, I think that is true. He was able to stand on the sidelines for the past four years and criticize former President Joe Biden for basically everything. Trump blamed him for inflation and made voters think more rosily about his first term than they did while he was in office. And he repeated what he had done successfully in 2015 and 2016, which was to position himself as an outsider—except back then, he really was an outsider to the political system.

Cleveland did that, too, to a lesser extent. By not being in office for four years, he was able to run as an outsider. Similarly to Trump, that’s what he had done earlier in his political career. Even in his runs for office for mayor of Buffalo and then for governor of New York, he was seen as the reluctant candidate. There’s some debate about whether that was true or if he just wanted voters to think that, but he was able to position himself as this anti-corruption populist. And unlike Trump, he actually followed through on his commitment to clean government once in office.

Stephanie: At his inauguration, Trump said he was “saved by God to make America great again” and serve another term. Do you think that his historic political comeback will affect the direction of his presidency?

Russell: Trump has always had this desire to resist any constraints on him and on the presidency. This is also what separates him from Cleveland, and probably what will end up separating their second terms. Cleveland adhered to the constitutional limits on the presidency. He didn’t try to expand the power of the presidency in the way that Trump has already in his second term, with his early executive orders going after birthright citizenship and trying to refuse to spend money appropriated by Congress. Trump is going to see how much he can get away with and what kind of resistance, if any, he’ll face within the Republican Party or in the courts.

But Cleveland’s comeback turned sour soon after he returned to the White House. His second term was marred by a very deep recession. The economy obviously is pretty strong right now, as we speak, but that can change quickly—especially because some economists are concerned about what Trump’s tariffs could do. So there is a warning for Trump in Cleveland’s story because Cleveland’s second term, similar to a lot of presidential second terms, was much rougher than his first.

Read More

The president Trump is pushing aside: Grover Cleveland enthusiasts aren’t thrilled, Russell Berman reports.

The lessons of 1884: When Grover Cleveland clinched the Democratic nomination and faced an allegation of misconduct, he created a new political playbook, Susan Wise Bauer writes.

The independence of the executive: In an address to Princeton University published in 1900, Grover Cleveland spoke about the history and political deliberations surrounding his former office.

Attempts to undo a presidential legacy: Benjamin Harrison, in the twilight of his presidency, sent a treaty to the Senate to advance the annexation of Hawai‘i. Weeks later, Cleveland’s first act as president was to withdraw that treaty and order an investigation of the American-led overthrow of the Hawaiian Kingdom.

The Anti-Social Century

The Atlantic

www.theatlantic.com › magazine › archive › 2025 › 02 › american-loneliness-personality-politics › 681091

Illustrations by Max Guther

The Bar Is Closed

A short drive from my home in North Carolina is a small Mexican restaurant, with several tables and four stools at a bar facing the kitchen. On a sweltering afternoon last summer, I walked in with my wife and daughter. The place was empty. But looking closer, I realized that business was booming. The bar was covered with to-go food: nine large brown bags.

As we ate our meal, I watched half a dozen people enter the restaurant without sitting down to eat. Each one pushed open the door, walked to the counter, picked up a bag from the bar, and left. In the delicate choreography between kitchen and customer, not a word was exchanged. The space once reserved for that most garrulous social encounter, the bar hangout, had been reconfigured into a silent depot for customers to grab food to eat at home.

Until the pandemic, the bar was bustling and popular with regulars. “It’s just a few seats, but it was a pretty happening place,” Rae Mosher, the restaurant’s general manager, told me. “I can’t tell you how sad I’ve been about it,” she went on. “I know it hinders communications between customers and staff to have to-go bags taking up the whole bar. But there’s nowhere else for the food to go.” She put up a sign: BAR SEATING CLOSED.

The sign on the bar is a sign of the times for the restaurant business. In the past few decades, the sector has shifted from tables to takeaway, a process that accelerated through the pandemic and continued even as the health emergency abated. In 2023, 74 percent of all restaurant traffic came from “off premises” customers—that is, from takeout and delivery—up from 61 percent before COVID, according to the National Restaurant Association.

The flip side of less dining out is more eating alone. The share of U.S. adults having dinner or drinks with friends on any given night has declined by more than 30 percent in the past 20 years. “There’s an isolationist dynamic that’s taking place in the restaurant business,” the Washington, D.C., restaurateur Steve Salis told me. “I think people feel uncomfortable in the world today. They’ve decided that their home is their sanctuary. It’s not easy to get them to leave.” Even when Americans eat at restaurants, they are much more likely to do so by themselves. According to data gathered by the online reservations platform OpenTable, solo dining has increased by 29 percent in just the past two years. The No. 1 reason is the need for more “me time.”

The evolution of restaurants is retracing the trajectory of another American industry: Hollywood. In the 1930s, video entertainment existed only in theaters, and the typical American went to the movies several times a month. Film was a necessarily collective experience, something enjoyed with friends and in the company of strangers. But technology has turned film into a home delivery system. Today, the typical American adult buys about three movie tickets a year—and watches almost 19 hours of television, the equivalent of roughly eight movies, on a weekly basis. In entertainment, as in dining, modernity has transformed a ritual of togetherness into an experience of homebound reclusion and even solitude.

The privatization of American leisure is one part of a much bigger story. Americans are spending less time with other people than in any other period for which we have trustworthy data, going back to 1965. Between that year and the end of the 20th century, in-person socializing slowly declined. From 2003 to 2023, it plunged by more than 20 percent, according to the American Time Use Survey, an annual study conducted by the Bureau of Labor Statistics. Among unmarried men and people younger than 25, the decline was more than 35 percent. Alone time predictably spiked during the pandemic. But the trend had started long before most people had ever heard of a novel coronavirus and continued after the pandemic was declared over. According to Enghin Atalay, an economist at the Federal Reserve Bank of Philadelphia, Americans spent even more time alone in 2023 than they did in 2021. (He categorized a person as “alone,” as I will throughout this article, if they are “the only person in the room, even if they are on the phone” or in front of a computer.)

Eroding companionship can be seen in numerous odd and depressing facts of American life today. Men who watch television now spend seven hours in front of the TV for every hour they spend hanging out with somebody outside their home. The typical female pet owner spends more time actively engaged with her pet than she spends in face-to-face contact with friends of her own species. Since the early 2000s, the amount of time that Americans say they spend helping or caring for people outside their nuclear family has declined by more than a third.

[Derek Thompson: Why Americans suddenly stopped hanging out]

Self-imposed solitude might just be the most important social fact of the 21st century in America. Perhaps unsurprisingly, many observers have reduced this phenomenon to the topic of loneliness. In 2023, Vivek Murthy, Joe Biden’s surgeon general, published an 81-page warning about America’s “epidemic of loneliness,” claiming that its negative health effects were on par with those of tobacco use and obesity. A growing number of public-health officials seem to regard loneliness as the developed world’s next critical public-health issue. The United Kingdom now has a minister for loneliness. So does Japan.

But solitude and loneliness are not one and the same. “It is actually a very healthy emotional response to feel some loneliness,” the NYU sociologist Eric Klinenberg told me. “That cue is the thing that pushes you off the couch and into face-to-face interaction.” The real problem here, the nature of America’s social crisis, is that most Americans don’t seem to be reacting to the biological cue to spend more time with other people. Their solitude levels are surging while many measures of loneliness are actually flat or dropping. A 2021 study of the widely used UCLA Loneliness Scale concluded that “the frequently used term ‘loneliness epidemic’ seems exaggerated.” Although young people are lonelier than they once were, there is little evidence that loneliness is rising more broadly today. A 2023 Gallup survey found that the share of Americans who said they experienced loneliness “a lot of the day yesterday” declined by roughly one-third from 2021 to 2023, even as alone time, by Atalay’s calculation, rose slightly.

Day to day, hour to hour, we are choosing this way of life—its comforts, its ready entertainments. But convenience can be a curse. Our habits are creating what Atalay has called a “century of solitude.” This is the anti-social century.

Over the past few months, I’ve spoken with psychologists, political scientists, sociologists, and technologists about America’s anti-social streak. Although the particulars of these conversations differed, a theme emerged: The individual preference for solitude, scaled up across society and exercised repeatedly over time, is rewiring America’s civic and psychic identity. And the consequences are far-reaching—for our happiness, our communities, our politics, and even our understanding of reality.

The End of the Social Century

The first half of the 20th century was extraordinarily social. From 1900 to 1960, church membership surged, as did labor-union participation. Marriage rates reached a record high after World War II, and the birth rate enjoyed a famous “boom.” Associations of all sorts thrived, including book clubs and volunteer groups. The New Deal made America’s branch-library system the envy of the world; communities and developers across the country built theaters, music venues, playgrounds, and all kinds of gathering places.

But in the 1970s, the U.S. entered an era of withdrawal, as the political scientist Robert D. Putnam famously documented in his 2000 book, Bowling Alone. Some institutions of togetherness, such as marriage, eroded slowly. Others fell away swiftly. From 1985 to 1994, active involvement in community organizations fell by nearly half. The decline was astonishingly broad, affecting just about every social activity and every demographic group that Putnam tracked.

What happened in the 1970s? Klinenberg, the sociologist, notes a shift in political priorities: The government dramatically slowed its construction of public spaces. “Places that used to anchor community life, like libraries and school gyms and union halls, have become less accessible or shuttered altogether,” he told me. Putnam points, among other things, to new moral values, such as the embrace of unbridled individualism. But he found that two of the most important factors were by then ubiquitous technologies: the automobile and the television set.

Starting in the second half of the century, Americans used their cars to move farther and farther away from one another, enabling the growth of the suburbs and, with it, a retreat into private backyard patios, private pools, a more private life. Once Americans got out of the car, they planted themselves in front of the television. From 1965 to 1995, the typical adult gained six hours a week in leisure time. They could have devoted that time—300 hours a year!—to community service, or pickup basketball, or reading, or knitting, or all four. Instead, they funneled almost all of this extra time into watching more TV.

Television transformed Americans’ interior decorating, our relationships, and our communities. In 1970, just 6 percent of sixth graders had a TV set in their bedroom; in 1999, that proportion had grown to 77 percent. Time diaries in the 1990s showed that husbands and wives spent almost four times as many hours watching TV together as they spent talking to each other in a given week. People who said TV was their “primary form of entertainment” were less likely to engage in practically every social activity that Putnam counted: volunteering, churchgoing, attending dinner parties, picnicking, giving blood, even sending greeting cards. Like a murder in Clue, the death of social connections in America had any number of suspects. But in the end, I believe the likeliest culprit is obvious. It was Mr. Farnsworth, in the living room, with the tube.

Phonebound

If two of the 20th century’s iconic technologies, the automobile and the television, initiated the rise of American aloneness, the 21st century’s most notorious piece of hardware has continued to fuel, and has indeed accelerated, our national anti-social streak. Countless books, articles, and cable-news segments have warned Americans that smartphones can negatively affect mental health and may be especially harmful to adolescents. But the fretful coverage is, if anything, restrained given how greatly these devices have changed our conscious experience. The typical person is awake for about 900 minutes a day. American kids and teenagers spend, on average, about 270 minutes on weekdays and 380 minutes on weekends gazing into their screens, according to the Digital Parenthood Initiative. By this account, screens occupy more than 30 percent of their waking life.

Some of this screen time is social, after a fashion. But sharing videos or texting friends is a pale imitation of face-to-face interaction. More worrisome than what young people do on their phone is what they aren’t doing. Young people are less likely than in previous decades to get their driver’s license, or to go on a date, or to have more than one close friend, or even to hang out with their friends at all. The share of boys and girls who say they meet up with friends almost daily outside school hours has declined by nearly 50 percent since the early 1990s, with the sharpest downturn occurring in the 2010s.

Max Guther

The decline of hanging out can’t be shrugged off as a benign generational change, something akin to a preference for bell-bottoms over skinny jeans. Human childhood—including adolescence—is a uniquely sensitive period in the whole of the animal kingdom, the psychologist Jonathan Haidt writes in The Anxious Generation. Although the human brain grows to 90 percent of its full size by age 5, its neural circuitry takes a long time to mature. Our lengthy childhood might be evolution’s way of scheduling an extended apprenticeship in social learning through play. The best kind of play is physical, outdoors, with other kids, and unsupervised, allowing children to press the limits of their abilities while figuring out how to manage conflict and tolerate pain. But now young people’s attention is funneled into devices that take them out of their body, denying them the physical-world education they need.

[Read: Jonathan Haidt on the terrible costs of a phone-based childhood]

Teen anxiety and depression are at near-record highs: The latest government survey of high schoolers, conducted in 2023, found that more than half of teen girls said they felt “persistently sad or hopeless.” These data are alarming, but shouldn’t be surprising. Young rats and monkeys deprived of play come away socially and emotionally impaired. It would be odd if we, the self-named “social animal,” were different.

Socially underdeveloped childhood leads, almost inexorably, to socially stunted adulthood. A popular trend on TikTok involves 20‑somethings celebrating in creative ways when a friend cancels plans, often because they’re too tired or anxious to leave the house. These clips can be goofy and even quite funny. Surely, sympathy is due; we all know the feeling of relief when we claw back free time in an overscheduled week. But the sheer number of videos is a bit unsettling. If anybody should feel lonely and desperate for physical-world contact, you’d think it would be 20-somethings, who are still recovering from years of pandemic cabin fever. But many nights, it seems, members of America’s most isolated generation aren’t trying to leave the house at all. They’re turning on their cameras to advertise to the world the joy of not hanging out.

If young adults feel overwhelmed by the emotional costs of physical-world togetherness—and prone to keeping even close friends at a physical distance—that suggests that phones aren’t just rewiring adolescence; they’re upending the psychology of friendship as well.

[From the September 2017 issue: Have smartphones destroyed a generation?]

In the 1960s, Irwin Altman, a psychologist at the Naval Medical Research Institute, in Bethesda, Maryland, co-developed a friendship formula characterized by increasing intimacy. In the early stages of friendship, people engage in small talk by sharing trivial details. As they develop trust, their conversations deepen to include more private information until disclosure becomes habitual and easy. Altman later added an important wrinkle: Friends require boundaries as much as they require closeness. Time alone to recharge is essential for maintaining healthy relationships.

Phones mean that solitude is more crowded than it used to be, and crowds are more solitary. “Bright lines once separated being alone and being in a crowd,” Nicholas Carr, the author of the new book Superbloom: How Technologies of Connection Tear Us Apart, told me. “Boundaries helped us. You could be present with your friends and reflective in your downtime.” Now our social time is haunted by the possibility that something more interesting is happening somewhere else, and our downtime is contaminated by the streams and posts and texts of dozens of friends, colleagues, frenemies, strangers.

[From the July/August 2008 issue: Nicholas Carr on whether Google is making us stupid]

If Carr is right, modern technology’s always-open window to the outside world makes recharging much harder, leaving many people chronically depleted, like walking batteries stuck in the red zone. In a healthy world, people who spend lots of time alone would feel that ancient biological cue: I’m alone and sad; I should make some plans. But we live in a sideways world, where easy home entertainment, oversharing online, and stunted social skills spark a strangely popular response: I’m alone, anxious, and exhausted; thank God my plans were canceled.

Homebound

Last year, the Princeton University sociologist Patrick Sharkey was working on a book about how places shape American lives and economic fortunes. He had a feeling that the rise of remote work might have accelerated a longer-term trend: a shift in the amount of time that people spend inside their home. He ran the numbers and discovered “an astounding change” in our daily habits, much more extreme than he would have guessed. In 2022—notably, after the pandemic had abated—adults spent an additional 99 minutes at home on any given day compared with 2003.

This finding formed the basis of a 2024 paper, “Homebound,” in which Sharkey calculated that, compared with 2003, Americans are more likely to take meetings from home, to shop from home, to be entertained at home, to eat at home, and even to worship at home. Practically the entire economy has reoriented itself to allow Americans to stay within their four walls. This phenomenon cannot be reduced to remote work. It is something far more totalizing—something more like “remote life.”

One might ask: Why wouldn’t Americans with means want to spend more time at home? In the past few decades, the typical American home has become bigger, more comfortable, and more entertaining. From 1973 to 2023, the size of the average new single-family house increased by 50 percent, and the share of new single-family houses that have air-conditioning doubled, to 98 percent. Streaming services, video-game consoles, and flatscreen TVs make the living room more diverting than any 20th-century theater or arcade. Yet conveniences can indeed be a curse. By Sharkey’s calculations, activities at home were associated with a “strong reduction” in self-reported happiness.

A homebound life doesn’t have to be a solitary life. In the 1970s, the typical household entertained more than once a month. But from the late 1970s to the late 1990s, the frequency of hosting friends for parties, games, dinners, and so on declined by 45 percent, according to data that Robert Putnam gathered. In the 20 years after Bowling Alone was published, the average amount of time that Americans spent hosting or attending social events declined another 32 percent.

As our homes have become less social, residential architecture has become more anti-social. Clifton Harness is a co-founder of TestFit, a firm that makes software to design layouts for new housing developments. He told me that the cardinal rule of contemporary apartment design is that every room is built to accommodate maximal screen time. “In design meetings with developers and architects, you have to assure everybody that there will be space for a wall-mounted flatscreen television in every room,” he said. “It used to be ‘Let’s make sure our rooms have great light.’ But now, when the question is ‘How do we give the most comfort to the most people?,’ the answer is to feed their screen addiction.” Bobby Fijan, a real-estate developer, said last year that “for the most part, apartments are built for Netflix and chill.” From studying floor plans, he noticed that bedrooms, walk-in closets, and other private spaces are growing. “I think we’re building for aloneness,” Fijan told me.

“Secular Monks”

In 2020, the philosopher and writer Andrew Taggart observed in an essay published in the religious journal First Things that a new flavor of masculinity seemed to be emerging: strong, obsessed with personal optimization, and proudly alone. Men and women alike have been delaying family formation; the median age at first marriage for men recently surpassed 30 for the first time in history. Taggart wrote that the men he knew seemed to be forgoing marriage and fatherhood with gusto. Instead of focusing their 30s and 40s on wedding bands and diapers, they were committed to working on their body, their bank account, and their meditation-sharpened minds. Taggart called these men “secular monks” for their combination of old-fashioned austerity and modern solipsism. “Practitioners submit themselves to ever more rigorous, monitored forms of ascetic self-control,” he wrote, “among them, cold showers, intermittent fasting, data-driven health optimization, and meditation boot camps.”

When I read Taggart’s essay last year, I felt a shock of recognition. In the previous months, I’d been captivated by a particular genre of social media: the viral “morning routine” video. If the protagonist is a man, he is typically handsome and rich. We see him wake up. We see him meditate. We see him write in his journal. We see him exercise, take supplements, take a cold plunge. What is most striking about these videos, however, is the element they typically lack: other people. In these little movies of a life well spent, the protagonists generally wake up alone and stay that way. We usually see no friends, no spouse, no children. These videos are advertisements for a luxurious form of modern monasticism that treats the presence of other people as, at best, an unwelcome distraction and, at worst, an unhealthy indulgence that is ideally avoided—like porn, perhaps, or Pop-Tarts.

[Read: The agony of texting with men]

Drawing major conclusions about modern masculinity from a handful of TikToks would be unwise. But the solitary man is not just a social-media phenomenon. Men spend more time alone than women, and young men are increasing their alone time faster than any other group, according to the American Time Use Survey.

Where is this alone time coming from? Liana C. Sayer, a sociologist at the University of Maryland, shared with me her analysis of how leisure time in the 21st century has changed for men and women. Sayer divided leisure into two broad categories: “engaged leisure,” which includes socializing, going to concerts, and playing sports; and “sedentary leisure,” which includes watching TV and playing video games. Compared with engaged leisure, which is more likely to be done with other people, sedentary leisure is more commonly done alone.

The most dramatic tendency that Sayer uncovered is that single men without kids—who have the most leisure time—are overwhelmingly likely to spend these hours by themselves. And the time they spend in solo sedentary leisure has increased, since 2003, more than that of any other group Sayer tracked. This is unfortunate because, as Sayer wrote, “well-being is higher among adults who spend larger shares of leisure with others.” Sedentary leisure, by contrast, was “associated with negative physical and mental health.”

Richard V. Reeves, the president of the American Institute for Boys and Men, told me that for men, as for women, something hard to define is lost when we pursue a life of isolationist comforts. He calls it “neededness”—the way we make ourselves essential to our families and community. “I think at some level, we all need to feel like we’re a jigsaw piece that’s going to fit into a jigsaw somewhere,” he said. This neededness can come in several forms: social, economic, or communitarian. Our children and partners can depend on us for care or income. Our colleagues can rely on us to finish a project, or to commiserate about an annoying boss. Our religious congregations and weekend poker parties can count on us to fill a pew or bring the dip.

But building these bridges to community takes energy, and today’s young men do not seem to be constructing these relationships in the same way that they used to. In place of neededness, despair is creeping in. Men who are un- or underemployed are especially vulnerable. Feeling unneeded “is actually, in some cases, literally fatal,” Reeves said. “If you look at the words that men use to describe themselves before they take their own lives, they are ‘worthless’ and ‘useless.’” Since 2001, hundreds of thousands of men have died of drug overdoses, mostly from opioids and synthetics such as fentanyl. “If the level of drug-poisoning deaths had remained flat since 2001, we’d have had 400,000 fewer men die,” Reeves said. These drugs, he emphasized, are defined by their solitary nature: Opioids are not party drugs, but rather the opposite.

This Is Your Politics on Solitude

All of this time alone, at home, on the phone, is not just affecting us as individuals. It’s making society weaker, meaner, and more delusional. Marc J. Dunkelman, an author and a research fellow at Brown University, says that to see how chosen solitude is warping society at large, we must first acknowledge something a little counterintuitive: Today, many of our bonds are actually getting stronger.

Parents are spending more time with their children than they did several decades ago, and many couples and families maintain an unbroken flow of communication. “My wife and I have texted 10 times since we said goodbye today,” Dunkelman told me when I reached him at noon on a weekday. “When my 10-year-old daughter buys a Butterfinger at CVS, I get a phone notification about it.”

At the same time, messaging apps, TikTok streams, and subreddits keep us plugged into the thoughts and opinions of the global crowd that shares our interests. “When I watch a Cincinnati Bengals football game, I’m on a group text with beat reporters to whom I can ask questions, and they’ll respond,” Dunkelman said. “I can follow the live thoughts of football analysts on X.com, so that I’m practically watching the game over their shoulder. I live in Rhode Island, and those are connections that could have never existed 30 years ago.”

Home-based, phone-based culture has arguably solidified our closest and most distant connections, the inner ring of family and best friends (bound by blood and intimacy) and the outer ring of tribe (linked by shared affinities). But it’s wreaking havoc on the middle ring of “familiar but not intimate” relationships with the people who live around us, which Dunkelman calls the village. “These are your neighbors, the people in your town,” he said. We used to know them well; now we don’t.

The middle ring is key to social cohesion, Dunkelman said. Families teach us love, and tribes teach us loyalty. The village teaches us tolerance. Imagine that a local parent disagrees with you about affirmative action at a PTA meeting. Online, you might write him off as a political opponent who deserves your scorn. But in a school gym full of neighbors, you bite your tongue. As the year rolls on, you discover that your daughters are in the same dance class. At pickup, you swap stories about caring for aging relatives. Although your differences don’t disappear, they’re folded into a peaceful coexistence. And when the two of you sign up for a committee to draft a diversity statement for the school, you find that you can accommodate each other’s opposing views. “It’s politically moderating to meet thoughtful people in the real world who disagree with you,” Dunkelman said. But if PTA meetings are still frequently held in person, many other opportunities to meet and understand one’s neighbors are becoming a thing of the past. “An important implication of the death of the middle ring is that if you have no appreciation for why the other side has their narrative, you’ll want your own side to fight them without compromise.”

The village is our best arena for practicing productive disagreement and compromise—in other words, democracy. So it’s no surprise that the erosion of the village has coincided with the emergence of a grotesque style of politics, in which every election feels like an existential quest to vanquish an intramural enemy. For the past five decades, the American National Election Studies surveys have asked Democrats and Republicans to rate the opposing party on a “Feeling Thermometer” that ranges from zero (very cold/unfavorable) to 100 (very warm/favorable). In 2000, just 8 percent of partisans gave the other party a zero. By 2020, that figure had shot up to 40 percent. In a 2021 poll by Generation Lab/Axios, nearly a third of college students who identify as Republican said they wouldn’t even go on a date with a Democrat, and more than two-thirds of Democratic students said the same of members of the GOP.

Donald Trump’s victory in the 2024 presidential election had many causes, including inflation and frustration with Joe Biden’s leadership. But one source of Trump’s success may be that he is an avatar of the all-tribe, no-village style of performative confrontation. He stokes out-group animosity, and speaks to voters who are furiously intolerant of political difference. To cite just a few examples from the campaign, Trump called Democrats “enemies of the democracy” and the news media “enemies of the people,” and promised to “root out” the “radical-left thugs that live like vermin within the confines of our country, that lie and steal and cheat on elections.”

Social disconnection also helps explain progressives’ stubborn inability to understand Trump’s appeal. In the fall, one popular Democratic lawn sign read Harris Walz: Obviously. That sentiment, rejected by a majority of voters, indicates a failure to engage with the world as it really is. Dunkelman emailed me after the election to lament Democratic cluelessness. “How did those of us who live in elite circles not see how Trump was gaining popularity even among our literal neighbors?” he wrote. Too many progressives were mainlining left-wing media in the privacy of their home, oblivious that families down the street were drifting right. Even in the highly progressive borough of Brooklyn, New York, three in 10 voters chose Trump. If progressives still consider MAGA an alien movement, it is in part because they have made themselves strangers in their own land.

Practicing politics alone, on the internet, rather than in community isn’t only making us more likely to demonize and alienate our opponents, though that would be bad enough. It may also be encouraging deep nihilism. In 2018, a group of researchers led by Michael Bang Petersen, a Danish political scientist, began asking Americans to evaluate false rumors about Democratic and Republican politicians, including Trump and Hillary Clinton. “We were expecting a clear pattern of polarization,” Petersen told me, with people on the left sharing conspiracies about the right and vice versa. But some participants seemed drawn to any conspiracy theory so long as it was intended to destroy the established order. Members of this cohort commonly harbored racial or economic grievances. Perhaps more important, Petersen said, they tended to feel socially isolated. These aggravated loners agreed with many dark pronouncements, such as “I need chaos around me” and “When I think about our political and social institutions, I cannot help thinking ‘just let them all burn.’ ” Petersen and his colleagues coined a term to describe this cohort’s motivation: the need for chaos.

[Read: Derek Thompson on the Americans who need chaos]

Although chaotically inclined individuals score highly on a popular measure of loneliness, they don’t seem to seek the obvious remedy. “What they’re reaching out to get isn’t friendship at all but rather recognition and status,” Petersen said. For many socially isolated men in particular, for whom reality consists primarily of glowing screens in empty rooms, a vote for destruction is a politics of last resort—a way to leave one’s mark on a world where collective progress, or collective support of any kind, feels impossible.

The Introversion Delusion

Let us be fair to solitude, for a moment. As the father of a young child, I know well that a quiet night alone can be a balm. I have spent evenings alone at a bar, watching a baseball game, that felt ecstatically close to heaven. People cope with stress and grief and mundane disappointment in complex ways, and sometimes isolation is the best way to restore inner equilibrium.

But the dosage matters. A night alone away from a crying baby is one thing. A decade or more of chronic social disconnection is something else entirely. And people who spend more time alone, year after year, become meaningfully less happy. In his 2023 paper on the rise of 21st-century solitude, Atalay, at the Philadelphia Fed, calculated that by one measure, sociability means considerably more for happiness than money does: A five-percentage-point increase in alone time was associated with about the same decline in life satisfaction as was a 10 percent lower household income.

Nonetheless, many people keep choosing to spend free time alone, in their home, away from other people. Perhaps, one might think, they are making the right choice; after all, they must know themselves best. But a consistent finding of modern psychology is that people often don’t know what they want, or what will make them happy. The saying that “predictions are hard, especially about the future” applies with special weight to predictions about our own life. Time and again, what we expect to bring us peace—a bigger house, a luxury car, a job with twice the pay but half the leisure—only creates more anxiety. And at the top of this pile of things we mistakenly believe we want, there is aloneness.

[From the May 2012 issue: Is Facebook making us lonely?]

Several years ago, Nick Epley, a psychologist at the University of Chicago’s Booth School of Business, asked commuter-train passengers to make a prediction: How would they feel if asked to spend the ride talking with a stranger? Most participants predicted that quiet solitude would make for a better commute than having a long chat with someone they didn’t know. Then Epley’s team created an experiment in which some people were asked to keep to themselves, while others were instructed to talk with a stranger (“The longer the conversation, the better,” participants were told). Afterward, people filled out a questionnaire. How did they feel? Despite the broad assumption that the best commute is a silent one, the people instructed to talk with strangers actually reported feeling significantly more positive than those who’d kept to themselves. “A fundamental paradox at the core of human life is that we are highly social and made better in every way by being around people,” Epley said. “And yet over and over, we have opportunities to connect that we don’t take, or even actively reject, and it is a terrible mistake.”

Researchers have repeatedly validated Epley’s discovery. In 2020, the psychologists Seth Margolis and Sonja Lyubomirsky, at UC Riverside, asked people to behave like an extrovert for one week and like an introvert for another. Subjects received several reminders to act “assertive” and “spontaneous” or “quiet” and “reserved” depending on the week’s theme. Participants said they felt more positive emotions at the end of the extroversion week and more negative emotions at the end of the introversion week. Our modern economy, with its home-delivery conveniences, manipulates people into behaving like agoraphobes. But it turns out that we can be manipulated in the opposite direction. And we might be happier for it.

Our “mistaken” preference for solitude could emerge from a misplaced anxiety that other people aren’t that interested in talking with us, or that they would find our company bothersome. “But in reality,” Epley told me, “social interaction is not very uncertain, because of the principle of reciprocity. If you say hello to someone, they’ll typically say hello back to you. If you give somebody a compliment, they’ll typically say thank you.” Many people, it seems, are not social enough for their own good. They too often seek comfort in solitude, when they would actually find joy in connection.

Despite a consumer economy that seems optimized for introverted behavior, we would have happier days, years, and lives if we resisted the undertow of the convenience curse—if we talked with more strangers, belonged to more groups, and left the house for more activities.

The AI Century

The anti-social century has been bad enough: more anxiety and depression; more “need for chaos” in our politics. But I’m sorry to say that our collective detachment could still get worse. Or, to be more precise, weirder.

In May of last year, three employees of OpenAI, the artificial-intelligence company, sat onstage to introduce ChatGPT’s new real-time conversational-speech feature. A research scientist named Mark Chen held up a phone and, smiling, started speaking to it.

“Hey, ChatGPT, I’m Mark. How are you?” Mark said.

“Hello, Mark!” a cheery female voice responded.

“Hey, so I’m onstage right now,” Mark said. “I’m doing a live demo, and frankly I’m feeling a little bit nervous. Can you help me calm my nerves a little bit?”

“Oh, you’re doing a live demo right now?” the voice replied, projecting astonishment with eerie verisimilitude. “That’s awesome! Just take a deep breath and remember: You’re the expert here.”

Mark asked for feedback on his breathing, before panting loudly, like someone who’d just finished a marathon.

“Whoa, slow!” the voice responded. “Mark, you’re not a vacuum cleaner!” Out of frame, the audience laughed. Mark tried breathing audibly again, this time more slowly and deliberately.

“That’s it,” the AI responded. “How do you feel?”

“I feel a lot better,” Mark said. “Thank you so much.”

AI’s ability to speak naturally might seem like an incremental update, as subtle as a camera-lens refinement on a new iPhone. But according to Nick Epley, fluent speech represents a radical advancement in the technology’s ability to encroach on human relationships.

“Once an AI can speak to you, it’ll feel extremely real,” he said, because people process spoken word more intimately and emotionally than they process text. For a study published in 2020, Epley and Amit Kumar, a psychologist at the University of Texas at Austin, randomly assigned participants to contact an old friend via phone or email. Most people said they preferred to send a written message. But those instructed to talk on the phone reported feeling “a significantly stronger bond” with their friend, and a stronger sense that they’d “really connected,” than those who used email.

Speech is rich with what are known as “paralinguistic cues,” such as emphasis and intonation, which can build sympathy and trust in the minds of listeners. In another study, Epley and the behavioral scientist Juliana Schroeder found that employers and potential recruiters were more likely to rate candidates as “more competent, thoughtful, and intelligent” when they heard a why-I’m-right-for-this-job pitch rather than read it.

Even now, before AI has mastered fluent speech, millions of people are already forming intimate relationships with machines, according to Jason Fagone, a journalist who is writing a book about the emergence of AI companions. Character.ai, the most popular platform for AI companions, has tens of millions of monthly users, who spend an average of 93 minutes a day chatting with their AI friend. “No one is getting duped into thinking they’re actually talking to humans,” Fagone told me. “People are freely choosing to enter relationships with artificial partners, and they’re getting deeply attached anyway, because of the emotional capabilities of these systems.” One subject in his book is a young man who, after his fiancée’s death, engineers an AI chatbot to resemble his deceased partner. Another is a bisexual mother who supplements her marriage to a man with an AI that identifies as a woman.

If you find the notion of emotional intercourse with an immaterial entity creepy, consider the many friends and family members who exist in your life mainly as words on a screen. Digital communication has already prepared us for AI companionship, Fagone said, by transforming many of our physical-world relationships into a sequence of text chimes and blue bubbles. “I think part of why AI-companion apps have proven so seductive so quickly is that most of our relationships already happen exclusively through the phone,” he said.

Epley sees the exponential growth of AI companions as a real possibility. “You can set them up to never criticize you, never cheat on you, never have a bad day and insult you, and to always be interested in you.” Unlike the most patient spouses, they could tell us that we’re always right. Unlike the world’s best friend, they could instantly respond to our needs without the all-too-human distraction of having to lead their own life.

“The horrifying part, of course, is that learning how to interact with real human beings who can disagree with you and disappoint you” is essential to living in the world, Epley said. I think he’s right. But Epley was born in the 1970s. I was born in the 1980s. People born in the 2010s, or the 2020s, might not agree with us about the irreplaceability of “real human” friends. These generations may discover that what they want most from their relationships is not a set of people, who might challenge them, but rather a set of feelings—sympathy, humor, validation—that can be more reliably drawn out from silicon than from carbon-based life forms. Long before technologists build a superintelligent machine that can do the work of so many Einsteins, they may build an emotionally sophisticated one that can do the work of so many friends.

The Next 15 Minutes

The anti-social century is as much a result of what’s happened to the exterior world of concrete and steel as of the advances inside our phones. The decline of government investments in what Eric Klinenberg calls “social infrastructure”—public spaces that shape our relationship to the world—may have begun in the latter part of the 20th century, but it has continued in the 21st. That has arguably affected nearly everyone, but less advantaged Americans most of all.

“I can’t tell you how many times I’ve gone to poor neighborhoods in big cities, and the community leaders tell me the real crisis for poor teenagers is that there’s just not much for them to do anymore, and nowhere to go,” Klinenberg told me. “I’d like to see the government build social infrastructure for teenagers with the creativity and generosity with which video-game companies build the toys that keep them inside. I’m thinking of athletic fields, and public swimming pools, and libraries with beautiful social areas for young people to hang out together.”

Improved public social infrastructure would not solve all the problems of the anti-social century. But degraded public spaces—and degraded public life—are in some ways the other side of all our investments in video games and phones and bigger, better private space. Just as we needed time to see the invisible emissions of the Industrial Revolution, we are only now coming to grips with the negative externalities of a phonebound and homebound world. The media theorist Marshall McLuhan once said of technology that every augmentation is also an amputation. We chose our digitally enhanced world. We did not realize the significance of what was being amputated.
But we can choose differently. In his 2015 novel, Seveneves, Neal Stephenson coined the term Amistics to describe the practice of carefully selecting which technologies to accept. The word is a reference to the Amish, who generally shun many modern innovations, including cars and television. Although they are sometimes considered strictly anti-modern, many Amish communities have refrigerators and washing machines, and some use solar power. Instead of dismissing all technology, the Amish adopt only those innovations that support their religious and communal values. In his 1998 dissertation on one Amish community, Tay Keong Tan, then a Ph.D. candidate at Harvard, quoted a community member as saying that they didn’t want to adopt TV or radio, because those products “would destroy our visiting practices. We would stay at home with the television or radio rather than meet with other people.”

If the Amish approach to technology is radical in its application, it recognizes something plain and true: Although technology does not have values of its own, its adoption can create values, even in the absence of a coordinated effort. For decades, we’ve adopted whatever technologies removed friction or increased dopamine, embracing what makes life feel easy and good in the moment. But dopamine is a chemical, not a virtue. And what’s easy is not always what’s best for us. We should ask ourselves: What would it mean to select technology based on long-term health rather than instant gratification? And if technology is hurting our community, what can we do to heal it?

A seemingly straightforward prescription is that teenagers should choose to spend less time on their phone, and their parents should choose to invite more friends over for dinner. But in a way, these are collective-action problems. A teenager is more likely to get out of the house if his classmates have already made a habit of hanging out. That teen’s parents are more likely to host if their neighbors have also made a habit of weekly gatherings. There is a word for such deeply etched communal habits: rituals. And one reason, perhaps, that the decline of socializing has synchronized with the decline of religion is that nothing has proved as adept at inscribing ritual into our calendars as faith.

“I have a view that is uncommon among social scientists, which is that moral revolutions are real and they change our culture,” Robert Putnam told me. In the early 20th century, a group of liberal Christians, including the pastor Walter Rauschenbusch, urged other Christians to expand their faith from a narrow concern for personal salvation to a public concern for justice. Their movement, which became known as the Social Gospel, was instrumental in passing major political reforms, such as the abolition of child labor. It also encouraged a more communitarian approach to American life, which manifested in an array of entirely secular congregations that met in union halls and community centers and dining rooms. All of this came out of a particular alchemy of writing and thinking and organizing. No one can say precisely how to change a nation’s moral-emotional atmosphere, but what’s certain is that atmospheres do change. Our smallest actions create norms. Our norms create values. Our values drive behavior. And our behaviors cascade.

The anti-social century is the result of one such cascade, of chosen solitude, accelerated by digital-world progress and physical-world regress. But if one cascade brought us into an anti-social century, another can bring about a social century. New norms are possible; they’re being created all the time. Independent bookstores are booming—the American Booksellers Association has reported more than 50 percent growth since 2009—and in cities such as New York City and Washington, D.C., many of them have become miniature theaters, with regular standing-room-only crowds gathered for author readings. More districts and states are banning smartphones in schools, a national experiment that could, optimistically, improve children’s focus and their physical-world relationships. In the past few years, board-game cafés have flowered across the country, and their business is expected to nearly double by 2030. These cafés buck an 80-year trend. Instead of turning a previously social form of entertainment into a private one, they turn a living-room pastime into a destination activity. As sweeping as the social revolution I’ve described might seem, it’s built from the ground up by institutions and decisions that are profoundly within our control: as humble as a café, as small as a new phone locker at school.

When Epley and his lab asked Chicagoans to overcome their preference for solitude and talk with strangers on a train, the experiment probably didn’t change anyone’s life. All it did was marginally improve the experience of one 15-minute block of time. But life is just a long set of 15-minute blocks, one after another. The way we spend our minutes is the way we spend our decades. “No amount of research that I’ve done has changed my life more than this,” Epley told me. “It’s not that I’m never lonely. It’s that my moment-to-moment experience of life is better, because I’ve learned to take the dead space of life and make friends in it.”

This article appears in the February 2025 print edition with the headline “The Anti-Social Century.”