The Anti-Social Century

The Atlantic

https://www.theatlantic.com/magazine/archive/2025/02/american-loneliness-personality-politics/681091/

Illustrations by Max Guther

The Bar Is Closed

A short drive from my home in North Carolina is a small Mexican restaurant, with several tables and four stools at a bar facing the kitchen. On a sweltering afternoon last summer, I walked in with my wife and daughter. The place was empty. But looking closer, I realized that business was booming. The bar was covered with to-go food: nine large brown bags.

As we ate our meal, I watched half a dozen people enter the restaurant without sitting down to eat. Each one pushed open the door, walked to the counter, picked up a bag from the bar, and left. In the delicate choreography between kitchen and customer, not a word was exchanged. The space once reserved for that most garrulous social encounter, the bar hangout, had been reconfigured into a silent depot for customers to grab food to eat at home.

Until the pandemic, the bar was bustling and popular with regulars. “It’s just a few seats, but it was a pretty happening place,” Rae Mosher, the restaurant’s general manager, told me. “I can’t tell you how sad I’ve been about it,” she went on. “I know it hinders communications between customers and staff to have to-go bags taking up the whole bar. But there’s nowhere else for the food to go.” She put up a sign: BAR SEATING CLOSED.

The sign on the bar is a sign of the times for the restaurant business. In the past few decades, the sector has shifted from tables to takeaway, a process that accelerated through the pandemic and continued even as the health emergency abated. In 2023, 74 percent of all restaurant traffic came from “off premises” customers—that is, from takeout and delivery—up from 61 percent before COVID, according to the National Restaurant Association.

The flip side of less dining out is more eating alone. The share of U.S. adults having dinner or drinks with friends on any given night has declined by more than 30 percent in the past 20 years. “There’s an isolationist dynamic that’s taking place in the restaurant business,” the Washington, D.C., restaurateur Steve Salis told me. “I think people feel uncomfortable in the world today. They’ve decided that their home is their sanctuary. It’s not easy to get them to leave.” Even when Americans eat at restaurants, they are much more likely to do so by themselves. According to data gathered by the online reservations platform OpenTable, solo dining has increased by 29 percent in just the past two years. The No. 1 reason is the need for more “me time.”

The evolution of restaurants is retracing the trajectory of another American industry: Hollywood. In the 1930s, video entertainment existed only in theaters, and the typical American went to the movies several times a month. Film was a necessarily collective experience, something enjoyed with friends and in the company of strangers. But technology has turned film into a home delivery system. Today, the typical American adult buys about three movie tickets a year—and watches almost 19 hours of television, the equivalent of roughly eight movies, on a weekly basis. In entertainment, as in dining, modernity has transformed a ritual of togetherness into an experience of homebound reclusion and even solitude.

The privatization of American leisure is one part of a much bigger story. Americans are spending less time with other people than in any other period for which we have trustworthy data, going back to 1965. Between that year and the end of the 20th century, in-person socializing slowly declined. From 2003 to 2023, it plunged by more than 20 percent, according to the American Time Use Survey, an annual study conducted by the Bureau of Labor Statistics. Among unmarried men and people younger than 25, the decline was more than 35 percent. Alone time predictably spiked during the pandemic. But the trend had started long before most people had ever heard of a novel coronavirus and continued after the pandemic was declared over. According to Enghin Atalay, an economist at the Federal Reserve Bank of Philadelphia, Americans spent even more time alone in 2023 than they did in 2021. (He categorized a person as “alone,” as I will throughout this article, if they are “the only person in the room, even if they are on the phone” or in front of a computer.)

Eroding companionship can be seen in numerous odd and depressing facts of American life today. Men who watch television now spend seven hours in front of the TV for every hour they spend hanging out with somebody outside their home. The typical female pet owner spends more time actively engaged with her pet than she spends in face-to-face contact with friends of her own species. Since the early 2000s, the amount of time that Americans say they spend helping or caring for people outside their nuclear family has declined by more than a third.

[Derek Thompson: Why Americans suddenly stopped hanging out]

Self-imposed solitude might just be the most important social fact of the 21st century in America. Perhaps unsurprisingly, many observers have reduced this phenomenon to the topic of loneliness. In 2023, Vivek Murthy, Joe Biden’s surgeon general, published an 81-page warning about America’s “epidemic of loneliness,” claiming that its negative health effects were on par with those of tobacco use and obesity. A growing number of public-health officials seem to regard loneliness as the developed world’s next critical public-health issue. The United Kingdom now has a minister for loneliness. So does Japan.

But solitude and loneliness are not one and the same. “It is actually a very healthy emotional response to feel some loneliness,” the NYU sociologist Eric Klinenberg told me. “That cue is the thing that pushes you off the couch and into face-to-face interaction.” The real problem here, the nature of America’s social crisis, is that most Americans don’t seem to be reacting to the biological cue to spend more time with other people. Their solitude levels are surging while many measures of loneliness are actually flat or dropping. A 2021 study of the widely used UCLA Loneliness Scale concluded that “the frequently used term ‘loneliness epidemic’ seems exaggerated.” Although young people are lonelier than they once were, there is little evidence that loneliness is rising more broadly today. A 2023 Gallup survey found that the share of Americans who said they experienced loneliness “a lot of the day yesterday” declined by roughly one-third from 2021 to 2023, even as alone time, by Atalay’s calculation, rose slightly.

Day to day, hour to hour, we are choosing this way of life—its comforts, its ready entertainments. But convenience can be a curse. Our habits are creating what Atalay has called a “century of solitude.” This is the anti-social century.

Over the past few months, I’ve spoken with psychologists, political scientists, sociologists, and technologists about America’s anti-social streak. Although the particulars of these conversations differed, a theme emerged: The individual preference for solitude, scaled up across society and exercised repeatedly over time, is rewiring America’s civic and psychic identity. And the consequences are far-reaching—for our happiness, our communities, our politics, and even our understanding of reality.

The End of the Social Century

The first half of the 20th century was extraordinarily social. From 1900 to 1960, church membership surged, as did labor-union participation. Marriage rates reached a record high after World War II, and the birth rate enjoyed a famous “boom.” Associations of all sorts thrived, including book clubs and volunteer groups. The New Deal made America’s branch-library system the envy of the world; communities and developers across the country built theaters, music venues, playgrounds, and all kinds of gathering places.

But in the 1970s, the U.S. entered an era of withdrawal, as the political scientist Robert D. Putnam famously documented in his 2000 book, Bowling Alone. Some institutions of togetherness, such as marriage, eroded slowly. Others fell away swiftly. From 1985 to 1994, active involvement in community organizations fell by nearly half. The decline was astonishingly broad, affecting just about every social activity and every demographic group that Putnam tracked.

What happened in the 1970s? Klinenberg, the sociologist, notes a shift in political priorities: The government dramatically slowed its construction of public spaces. “Places that used to anchor community life, like libraries and school gyms and union halls, have become less accessible or shuttered altogether,” he told me. Putnam points, among other things, to new moral values, such as the embrace of unbridled individualism. But he found that two of the most important factors were by then ubiquitous technologies: the automobile and the television set.

Starting in the second half of the century, Americans used their cars to move farther and farther away from one another, enabling the growth of the suburbs and, with it, a retreat into private backyard patios, private pools, a more private life. Once Americans got out of the car, they planted themselves in front of the television. From 1965 to 1995, the typical adult gained six hours a week in leisure time. They could have devoted that time—300 hours a year!—to community service, or pickup basketball, or reading, or knitting, or all four. Instead, they funneled almost all of this extra time into watching more TV.

Television transformed Americans’ interior decorating, our relationships, and our communities. In 1970, just 6 percent of sixth graders had a TV set in their bedroom; in 1999, that proportion had grown to 77 percent. Time diaries in the 1990s showed that husbands and wives spent almost four times as many hours watching TV together as they spent talking to each other in a given week. People who said TV was their “primary form of entertainment” were less likely to engage in practically every social activity that Putnam counted: volunteering, churchgoing, attending dinner parties, picnicking, giving blood, even sending greeting cards. Like a murder in Clue, the death of social connections in America had any number of suspects. But in the end, I believe the likeliest culprit is obvious. It was Mr. Farnsworth, in the living room, with the tube.

Phonebound

If two of the 20th century’s iconic technologies, the automobile and the television, initiated the rise of American aloneness, the 21st century’s most notorious piece of hardware has continued to fuel, and has indeed accelerated, our national anti-social streak. Countless books, articles, and cable-news segments have warned Americans that smartphones can negatively affect mental health and may be especially harmful to adolescents. But the fretful coverage is, if anything, restrained given how greatly these devices have changed our conscious experience. The typical person is awake for about 900 minutes a day. American kids and teenagers spend, on average, about 270 minutes on weekdays and 380 minutes on weekends gazing into their screens, according to the Digital Parenthood Initiative. By this account, screens occupy more than 30 percent of their waking life.
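The article doesn't show the calculation behind that "more than 30 percent" figure, but it follows from the numbers just cited. Assuming a standard week of five weekdays and two weekend days (the weighting is mine, not the survey's), the arithmetic works out like this:

\[
\frac{270 \times 5 + 380 \times 2}{7} \approx 301 \ \text{minutes of screen time per day}, \qquad \frac{301}{900} \approx 33\%.
\]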

Some of this screen time is social, after a fashion. But sharing videos or texting friends is a pale imitation of face-to-face interaction. More worrisome than what young people do on their phone is what they aren’t doing. Young people are less likely than in previous decades to get their driver’s license, or to go on a date, or to have more than one close friend, or even to hang out with their friends at all. The share of boys and girls who say they meet up with friends almost daily outside school hours has declined by nearly 50 percent since the early 1990s, with the sharpest downturn occurring in the 2010s.

The decline of hanging out can’t be shrugged off as a benign generational change, something akin to a preference for bell-bottoms over skinny jeans. Human childhood—including adolescence—is a uniquely sensitive period in the whole of the animal kingdom, the psychologist Jonathan Haidt writes in The Anxious Generation. Although the human brain grows to 90 percent of its full size by age 5, its neural circuitry takes a long time to mature. Our lengthy childhood might be evolution’s way of scheduling an extended apprenticeship in social learning through play. The best kind of play is physical, outdoors, with other kids, and unsupervised, allowing children to press the limits of their abilities while figuring out how to manage conflict and tolerate pain. But now young people’s attention is funneled into devices that take them out of their body, denying them the physical-world education they need.

[Read: Jonathan Haidt on the terrible costs of a phone-based childhood]

Teen anxiety and depression are at near-record highs: The latest government survey of high schoolers, conducted in 2023, found that more than half of teen girls said they felt “persistently sad or hopeless.” These data are alarming, but shouldn’t be surprising. Young rats and monkeys deprived of play come away socially and emotionally impaired. It would be odd if we, the self-named “social animal,” were different.

Socially underdeveloped childhood leads, almost inexorably, to socially stunted adulthood. A popular trend on TikTok involves 20-somethings celebrating in creative ways when a friend cancels plans, often because they’re too tired or anxious to leave the house. These clips can be goofy and even quite funny. Surely, sympathy is due; we all know the feeling of relief when we claw back free time in an overscheduled week. But the sheer number of videos is a bit unsettling. If anybody should feel lonely and desperate for physical-world contact, you’d think it would be 20-somethings, who are still recovering from years of pandemic cabin fever. But many nights, it seems, members of America’s most isolated generation aren’t trying to leave the house at all. They’re turning on their cameras to advertise to the world the joy of not hanging out.

If young adults feel overwhelmed by the emotional costs of physical-world togetherness—and prone to keeping even close friends at a physical distance—that suggests that phones aren’t just rewiring adolescence; they’re upending the psychology of friendship as well.

[From the September 2017 issue: Have smartphones destroyed a generation?]

In the 1960s, Irwin Altman, a psychologist at the Naval Medical Research Institute, in Bethesda, Maryland, co-developed a friendship formula characterized by increasing intimacy. In the early stages of friendship, people engage in small talk by sharing trivial details. As they develop trust, their conversations deepen to include more private information until disclosure becomes habitual and easy. Altman later added an important wrinkle: Friends require boundaries as much as they require closeness. Time alone to recharge is essential for maintaining healthy relationships.

Phones mean that solitude is more crowded than it used to be, and crowds are more solitary. “Bright lines once separated being alone and being in a crowd,” Nicholas Carr, the author of the new book Superbloom: How Technologies of Connection Tear Us Apart, told me. “Boundaries helped us. You could be present with your friends and reflective in your downtime.” Now our social time is haunted by the possibility that something more interesting is happening somewhere else, and our downtime is contaminated by the streams and posts and texts of dozens of friends, colleagues, frenemies, strangers.

[From the July/August 2008 issue: Nicholas Carr on whether Google is making us stupid]

If Carr is right, modern technology’s always-open window to the outside world makes recharging much harder, leaving many people chronically depleted, a walking battery that is always stuck in the red zone. In a healthy world, people who spend lots of time alone would feel that ancient biological cue: I’m alone and sad; I should make some plans. But we live in a sideways world, where easy home entertainment, oversharing online, and stunted social skills spark a strangely popular response: I’m alone, anxious, and exhausted; thank God my plans were canceled.

Homebound

Last year, the Princeton University sociologist Patrick Sharkey was working on a book about how places shape American lives and economic fortunes. He had a feeling that the rise of remote work might have accelerated a longer-term trend: a shift in the amount of time that people spend inside their home. He ran the numbers and discovered “an astounding change” in our daily habits, much more extreme than he would have guessed. In 2022—notably, after the pandemic had abated—adults spent an additional 99 minutes at home on any given day compared with 2003.

This finding formed the basis of a 2024 paper, “Homebound,” in which Sharkey calculated that, compared with 2003, Americans are more likely to take meetings from home, to shop from home, to be entertained at home, to eat at home, and even to worship at home. Practically the entire economy has reoriented itself to allow Americans to stay within their four walls. This phenomenon cannot be reduced to remote work. It is something far more totalizing—something more like “remote life.”

One might ask: Why wouldn’t Americans with means want to spend more time at home? In the past few decades, the typical American home has become bigger, more comfortable, and more entertaining. From 1973 to 2023, the size of the average new single-family house increased by 50 percent, and the share of new single-family houses that have air-conditioning doubled, to 98 percent. Streaming services, video-game consoles, and flatscreen TVs make the living room more diverting than any 20th-century theater or arcade. Yet conveniences can indeed be a curse. By Sharkey’s calculations, activities at home were associated with a “strong reduction” in self-reported happiness.

A homebound life doesn’t have to be a solitary life. In the 1970s, the typical household entertained more than once a month. But from the late 1970s to the late 1990s, the frequency of hosting friends for parties, games, dinners, and so on declined by 45 percent, according to data that Robert Putnam gathered. In the 20 years after Bowling Alone was published, the average amount of time that Americans spent hosting or attending social events declined another 32 percent.

As our homes have become less social, residential architecture has become more anti-social. Clifton Harness is a co-founder of TestFit, a firm that makes software to design layouts for new housing developments. He told me that the cardinal rule of contemporary apartment design is that every room is built to accommodate maximal screen time. “In design meetings with developers and architects, you have to assure everybody that there will be space for a wall-mounted flatscreen television in every room,” he said. “It used to be ‘Let’s make sure our rooms have great light.’ But now, when the question is ‘How do we give the most comfort to the most people?,’ the answer is to feed their screen addiction.” Bobby Fijan, a real-estate developer, said last year that “for the most part, apartments are built for Netflix and chill.” From studying floor plans, he noticed that bedrooms, walk-in closets, and other private spaces are growing. “I think we’re building for aloneness,” Fijan told me.

“Secular Monks”

In 2020, the philosopher and writer Andrew Taggart observed in an essay published in the religious journal First Things that a new flavor of masculinity seemed to be emerging: strong, obsessed with personal optimization, and proudly alone. Men and women alike have been delaying family formation; the median age at first marriage for men recently surpassed 30 for the first time in history. Taggart wrote that the men he knew seemed to be forgoing marriage and fatherhood with gusto. Instead of focusing their 30s and 40s on wedding bands and diapers, they were committed to working on their body, their bank account, and their meditation-sharpened minds. Taggart called these men “secular monks” for their combination of old-fashioned austerity and modern solipsism. “Practitioners submit themselves to ever more rigorous, monitored forms of ascetic self-control,” he wrote, “among them, cold showers, intermittent fasting, data-driven health optimization, and meditation boot camps.”

When I read Taggart’s essay last year, I felt a shock of recognition. In the previous months, I’d been captivated by a particular genre of social media: the viral “morning routine” video. If the protagonist is a man, he is typically handsome and rich. We see him wake up. We see him meditate. We see him write in his journal. We see him exercise, take supplements, take a cold plunge. What is most striking about these videos, however, is the element they typically lack: other people. In these little movies of a life well spent, the protagonists generally wake up alone and stay that way. We usually see no friends, no spouse, no children. These videos are advertisements for a luxurious form of modern monasticism that treats the presence of other people as, at best, an unwelcome distraction and, at worst, an unhealthy indulgence that is ideally avoided—like porn, perhaps, or Pop-Tarts.

[Read: The agony of texting with men]

Drawing major conclusions about modern masculinity from a handful of TikToks would be unwise. But the solitary man is not just a social-media phenomenon. Men spend more time alone than women, and young men are increasing their alone time faster than any other group, according to the American Time Use Survey.

Where is this alone time coming from? Liana C. Sayer, a sociologist at the University of Maryland, shared with me her analysis of how leisure time in the 21st century has changed for men and women. Sayer divided leisure into two broad categories: “engaged leisure,” which includes socializing, going to concerts, and playing sports; and “sedentary leisure,” which includes watching TV and playing video games. Compared with engaged leisure, which is more likely to be done with other people, sedentary leisure is more commonly done alone.

The most dramatic tendency that Sayer uncovered is that single men without kids—who have the most leisure time—are overwhelmingly likely to spend these hours by themselves. And the time they spend in solo sedentary leisure has increased, since 2003, more than that of any other group Sayer tracked. This is unfortunate because, as Sayer wrote, “well-being is higher among adults who spend larger shares of leisure with others.” Sedentary leisure, by contrast, was “associated with negative physical and mental health.”

Richard V. Reeves, the president of the American Institute for Boys and Men, told me that for men, as for women, something hard to define is lost when we pursue a life of isolationist comforts. He calls it “neededness”—the way we make ourselves essential to our families and community. “I think at some level, we all need to feel like we’re a jigsaw piece that’s going to fit into a jigsaw somewhere,” he said. This neededness can come in several forms: social, economic, or communitarian. Our children and partners can depend on us for care or income. Our colleagues can rely on us to finish a project, or to commiserate about an annoying boss. Our religious congregations and weekend poker parties can count on us to fill a pew or bring the dip.

But building these bridges to community takes energy, and today’s young men do not seem to be constructing these relationships in the same way that they used to. In place of neededness, despair is creeping in. Men who are un- or underemployed are especially vulnerable. Feeling unneeded “is actually, in some cases, literally fatal,” Reeves said. “If you look at the words that men use to describe themselves before they take their own lives, they are worthless and useless.” Since 2001, hundreds of thousands of men have died of drug overdoses, mostly from opioids and synthetics such as fentanyl. “If the level of drug-poisoning deaths had remained flat since 2001, we’d have had 400,000 fewer men die,” Reeves said. These drugs, he emphasized, are defined by their solitary nature: Opioids are not party drugs, but rather the opposite.

This Is Your Politics on Solitude

All of this time alone, at home, on the phone, is not just affecting us as individuals. It’s making society weaker, meaner, and more delusional. Marc J. Dunkelman, an author and a research fellow at Brown University, says that to see how chosen solitude is warping society at large, we must first acknowledge something a little counterintuitive: Today, many of our bonds are actually getting stronger.

Parents are spending more time with their children than they did several decades ago, and many couples and families maintain an unbroken flow of communication. “My wife and I have texted 10 times since we said goodbye today,” Dunkelman told me when I reached him at noon on a weekday. “When my 10-year-old daughter buys a Butterfinger at CVS, I get a phone notification about it.”

At the same time, messaging apps, TikTok streams, and subreddits keep us plugged into the thoughts and opinions of the global crowd that shares our interests. “When I watch a Cincinnati Bengals football game, I’m on a group text with beat reporters to whom I can ask questions, and they’ll respond,” Dunkelman said. “I can follow the live thoughts of football analysts on X.com, so that I’m practically watching the game over their shoulder. I live in Rhode Island, and those are connections that could have never existed 30 years ago.”

Home-based, phone-based culture has arguably solidified our closest and most distant connections, the inner ring of family and best friends (bound by blood and intimacy) and the outer ring of tribe (linked by shared affinities). But it’s wreaking havoc on the middle ring of “familiar but not intimate” relationships with the people who live around us, which Dunkelman calls the village. “These are your neighbors, the people in your town,” he said. We used to know them well; now we don’t.

The middle ring is key to social cohesion, Dunkelman said. Families teach us love, and tribes teach us loyalty. The village teaches us tolerance. Imagine that a local parent disagrees with you about affirmative action at a PTA meeting. Online, you might write him off as a political opponent who deserves your scorn. But in a school gym full of neighbors, you bite your tongue. As the year rolls on, you discover that your daughters are in the same dance class. At pickup, you swap stories about caring for aging relatives. Although your differences don’t disappear, they’re folded into a peaceful coexistence. And when the two of you sign up for a committee to draft a diversity statement for the school, you find that you can accommodate each other’s opposing views. “It’s politically moderating to meet thoughtful people in the real world who disagree with you,” Dunkelman said. But if PTA meetings are still frequently held in person, many other opportunities to meet and understand one’s neighbors are becoming a thing of the past. “An important implication of the death of the middle ring is that if you have no appreciation for why the other side has their narrative, you’ll want your own side to fight them without compromise.”

The village is our best arena for practicing productive disagreement and compromise—in other words, democracy. So it’s no surprise that the erosion of the village has coincided with the emergence of a grotesque style of politics, in which every election feels like an existential quest to vanquish an intramural enemy. For the past five decades, the American National Election Studies surveys have asked Democrats and Republicans to rate the opposing party on a “Feeling Thermometer” that ranges from zero (very cold/unfavorable) to 100 (very warm/favorable). In 2000, just 8 percent of partisans gave the other party a zero. By 2020, that figure had shot up to 40 percent. In a 2021 poll by Generation Lab/Axios, nearly a third of college students who identify as Republican said they wouldn’t even go on a date with a Democrat, and more than two-thirds of Democratic students said the same of members of the GOP.

Donald Trump’s victory in the 2024 presidential election had many causes, including inflation and frustration with Joe Biden’s leadership. But one source of Trump’s success may be that he is an avatar of the all-tribe, no-village style of performative confrontation. He stokes out-group animosity, and speaks to voters who are furiously intolerant of political difference. To cite just a few examples from the campaign, Trump called Democrats “enemies of the democracy” and the news media “enemies of the people,” and promised to “root out” the “radical-left thugs that live like vermin within the confines of our country, that lie and steal and cheat on elections.”

Social disconnection also helps explain progressives’ stubborn inability to understand Trump’s appeal. In the fall, one popular Democratic lawn sign read Harris Walz: Obviously. That sentiment, rejected by a majority of voters, indicates a failure to engage with the world as it really is. Dunkelman emailed me after the election to lament Democratic cluelessness. “How did those of us who live in elite circles not see how Trump was gaining popularity even among our literal neighbors?” he wrote. Too many progressives were mainlining left-wing media in the privacy of their home, oblivious that families down the street were drifting right. Even in the highly progressive borough of Brooklyn, New York, three in 10 voters chose Trump. If progressives still consider MAGA an alien movement, it is in part because they have made themselves strangers in their own land.

Practicing politics alone, on the internet, rather than in community isn’t only making us more likely to demonize and alienate our opponents, though that would be bad enough. It may also be encouraging deep nihilism. In 2018, a group of researchers led by Michael Bang Petersen, a Danish political scientist, began asking Americans to evaluate false rumors about Democratic and Republican politicians, including Trump and Hillary Clinton. “We were expecting a clear pattern of polarization,” Petersen told me, with people on the left sharing conspiracies about the right and vice versa. But some participants seemed drawn to any conspiracy theory so long as it was intended to destroy the established order. Members of this cohort commonly harbored racial or economic grievances. Perhaps more important, Petersen said, they tended to feel socially isolated. These aggravated loners agreed with many dark pronouncements, such as “I need chaos around me” and “When I think about our political and social institutions, I cannot help thinking ‘just let them all burn.’ ” Petersen and his colleagues coined a term to describe this cohort’s motivation: the need for chaos.

[Read: Derek Thompson on the Americans who need chaos]

Although chaotically inclined individuals score highly in a popular measure for loneliness, they don’t seem to seek the obvious remedy. “What they’re reaching out to get isn’t friendship at all but rather recognition and status,” Petersen said. For many socially isolated men in particular, for whom reality consists primarily of glowing screens in empty rooms, a vote for destruction is a politics of last resort—a way to leave one’s mark on a world where collective progress, or collective support of any kind, feels impossible.

The Introversion Delusion

Let us be fair to solitude, for a moment. As the father of a young child, I know well that a quiet night alone can be a balm. I have spent evenings alone at a bar, watching a baseball game, that felt ecstatically close to heaven. People cope with stress and grief and mundane disappointment in complex ways, and sometimes isolation is the best way to restore inner equilibrium.

But the dosage matters. A night alone away from a crying baby is one thing. A decade or more of chronic social disconnection is something else entirely. And people who spend more time alone, year after year, become meaningfully less happy. In his 2023 paper on the rise of 21st-century solitude, Atalay, at the Philadelphia Fed, calculated that by one measure, sociability means considerably more for happiness than money does: A five-percentage-point increase in alone time was associated with about the same decline in life satisfaction as was a 10 percent lower household income.

Nonetheless, many people keep choosing to spend free time alone, in their home, away from other people. Perhaps, one might think, they are making the right choice; after all, they must know themselves best. But a consistent finding of modern psychology is that people often don’t know what they want, or what will make them happy. The saying that “predictions are hard, especially about the future” applies with special weight to predictions about our own life. Time and again, what we expect to bring us peace—a bigger house, a luxury car, a job with twice the pay but half the leisure—only creates more anxiety. And at the top of this pile of things we mistakenly believe we want, there is aloneness.

[From the May 2012 issue: Is Facebook making us lonely?]

Several years ago, Nick Epley, a psychologist at the University of Chicago’s Booth School of Business, asked commuter-train passengers to make a prediction: How would they feel if asked to spend the ride talking with a stranger? Most participants predicted that quiet solitude would make for a better commute than having a long chat with someone they didn’t know. Then Epley’s team created an experiment in which some people were asked to keep to themselves, while others were instructed to talk with a stranger (“The longer the conversation, the better,” participants were told). Afterward, people filled out a questionnaire. How did they feel? Despite the broad assumption that the best commute is a silent one, the people instructed to talk with strangers actually reported feeling significantly more positive than those who’d kept to themselves. “A fundamental paradox at the core of human life is that we are highly social and made better in every way by being around people,” Epley said. “And yet over and over, we have opportunities to connect that we don’t take, or even actively reject, and it is a terrible mistake.”

Researchers have repeatedly validated Epley’s discovery. In 2020, the psychologists Seth Margolis and Sonja Lyubomirsky, at UC Riverside, asked people to behave like an extrovert for one week and like an introvert for another. Subjects received several reminders to act “assertive” and “spontaneous” or “quiet” and “reserved” depending on the week’s theme. Participants said they felt more positive emotions at the end of the extroversion week and more negative emotions at the end of the introversion week. Our modern economy, with its home-delivery conveniences, manipulates people into behaving like agoraphobes. But it turns out that we can be manipulated in the opposite direction. And we might be happier for it.

Our “mistaken” preference for solitude could emerge from a misplaced anxiety that other people aren’t that interested in talking with us, or that they would find our company bothersome. “But in reality,” Epley told me, “social interaction is not very uncertain, because of the principle of reciprocity. If you say hello to someone, they’ll typically say hello back to you. If you give somebody a compliment, they’ll typically say thank you.” Many people, it seems, are not social enough for their own good. They too often seek comfort in solitude, when they would actually find joy in connection.

Despite a consumer economy that seems optimized for introverted behavior, we would have happier days, years, and lives if we resisted the undertow of the convenience curse—if we talked with more strangers, belonged to more groups, and left the house for more activities.

The AI Century

The anti-social century has been bad enough: more anxiety and depression; more “need for chaos” in our politics. But I’m sorry to say that our collective detachment could still get worse. Or, to be more precise, weirder.

In May of last year, three employees of OpenAI, the artificial-intelligence company, sat onstage to introduce ChatGPT’s new real-time conversational-speech feature. A research scientist named Mark Chen held up a phone and, smiling, started speaking to it.

“Hey, ChatGPT, I’m Mark. How are you?” Mark said.

“Hello, Mark!” a cheery female voice responded.

“Hey, so I’m onstage right now,” Mark said. “I’m doing a live demo, and frankly I’m feeling a little bit nervous. Can you help me calm my nerves a little bit?”

“Oh, you’re doing a live demo right now?” the voice replied, projecting astonishment with eerie verisimilitude. “That’s awesome! Just take a deep breath and remember: You’re the expert here.”

Mark asked for feedback on his breathing, before panting loudly, like someone who’d just finished a marathon.

“Whoa, slow!” the voice responded. “Mark, you’re not a vacuum cleaner!” Out of frame, the audience laughed. Mark tried breathing audibly again, this time more slowly and deliberately.

“That’s it,” the AI responded. “How do you feel?”

“I feel a lot better,” Mark said. “Thank you so much.”

AI’s ability to speak naturally might seem like an incremental update, as subtle as a camera-lens refinement on a new iPhone. But according to Nick Epley, fluent speech represents a radical advancement in the technology’s ability to encroach on human relationships.

“Once an AI can speak to you, it’ll feel extremely real,” he said, because people process spoken word more intimately and emotionally than they process text. For a study published in 2020, Epley and Amit Kumar, a psychologist at the University of Texas at Austin, randomly assigned participants to contact an old friend via phone or email. Most people said they preferred to send a written message. But those instructed to talk on the phone reported feeling “a significantly stronger bond” with their friend, and a stronger sense that they’d “really connected,” than those who used email.

Speech is rich with what are known as “paralinguistic cues,” such as emphasis and intonation, which can build sympathy and trust in the minds of listeners. In another study, Epley and the behavioral scientist Juliana Schroeder found that employers and potential recruiters were more likely to rate candidates as “more competent, thoughtful, and intelligent” when they heard a why-I’m-right-for-this-job pitch rather than read it.

Even now, before AI has mastered fluent speech, millions of people are already forming intimate relationships with machines, according to Jason Fagone, a journalist who is writing a book about the emergence of AI companions. Character.ai, the most popular platform for AI companions, has tens of millions of monthly users, who spend an average of 93 minutes a day chatting with their AI friend. “No one is getting duped into thinking they’re actually talking to humans,” Fagone told me. “People are freely choosing to enter relationships with artificial partners, and they’re getting deeply attached anyway, because of the emotional capabilities of these systems.” One subject in his book is a young man who, after his fiancée’s death, engineers an AI chatbot to resemble his deceased partner. Another is a bisexual mother who supplements her marriage to a man with an AI that identifies as a woman.

If you find the notion of emotional intercourse with an immaterial entity creepy, consider the many friends and family members who exist in your life mainly as words on a screen. Digital communication has already prepared us for AI companionship, Fagone said, by transforming many of our physical-world relationships into a sequence of text chimes and blue bubbles. “I think part of why AI-companion apps have proven so seductive so quickly is that most of our relationships already happen exclusively through the phone,” he said.

Epley sees the exponential growth of AI companions as a real possibility. “You can set them up to never criticize you, never cheat on you, never have a bad day and insult you, and to always be interested in you.” Unlike the most patient spouses, they could tell us that we’re always right. Unlike the world’s best friend, they could instantly respond to our needs without the all-too-human distraction of having to lead their own life.

“The horrifying part, of course, is that learning how to interact with real human beings who can disagree with you and disappoint you” is essential to living in the world, Epley said. I think he’s right. But Epley was born in the 1970s. I was born in the 1980s. People born in the 2010s, or the 2020s, might not agree with us about the irreplaceability of “real human” friends. These generations may discover that what they want most from their relationships is not a set of people, who might challenge them, but rather a set of feelings—sympathy, humor, validation—that can be more reliably drawn out from silicon than from carbon-based life forms. Long before technologists build a superintelligent machine that can do the work of so many Einsteins, they may build an emotionally sophisticated one that can do the work of so many friends.

The Next 15 Minutes

The anti-social century is as much a result of what’s happened to the exterior world of concrete and steel as it is about advances inside our phones. The decline of government investments in what Eric Klinenberg calls “social infrastructure”—public spaces that shape our relationship to the world—may have begun in the latter part of the 20th century, but it has continued in the 21st. That has arguably affected nearly everyone, but less advantaged Americans most of all.

“I can’t tell you how many times I’ve gone to poor neighborhoods in big cities, and the community leaders tell me the real crisis for poor teenagers is that there’s just not much for them to do anymore, and nowhere to go,” Klinenberg told me. “I’d like to see the government build social infrastructure for teenagers with the creativity and generosity with which video-game companies build the toys that keep them inside. I’m thinking of athletic fields, and public swimming pools, and libraries with beautiful social areas for young people to hang out together.”

Improved public social infrastructure would not solve all the problems of the anti-social century. But degraded public spaces—and degraded public life—are in some ways the other side of all our investments in video games and phones and bigger, better private space. Just as we needed time to see the invisible emissions of the Industrial Revolution, we are only now coming to grips with the negative externalities of a phonebound and homebound world. The media theorist Marshall McLuhan once said of technology that every augmentation is also an amputation. We chose our digitally enhanced world. We did not realize the significance of what was being amputated.

But we can choose differently. In his 2015 novel, Seveneves, Neal Stephenson coined the term Amistics to describe the practice of carefully selecting which technologies to accept. The word is a reference to the Amish, who generally shun many modern innovations, including cars and television. Although they are sometimes considered strictly anti-modern, many Amish communities have refrigerators and washing machines, and some use solar power. Instead of dismissing all technology, the Amish adopt only those innovations that support their religious and communal values. In his 1998 dissertation on one Amish community, Tay Keong Tan, then a Ph.D. candidate at Harvard, quoted a community member as saying that they didn’t want to adopt TV or radio, because those products “would destroy our visiting practices. We would stay at home with the television or radio rather than meet with other people.”

If the Amish approach to technology is radical in its application, it recognizes something plain and true: Although technology does not have values of its own, its adoption can create values, even in the absence of a coordinated effort. For decades, we’ve adopted whatever technologies removed friction or increased dopamine, embracing what makes life feel easy and good in the moment. But dopamine is a chemical, not a virtue. And what’s easy is not always what’s best for us. We should ask ourselves: What would it mean to select technology based on long-term health rather than instant gratification? And if technology is hurting our community, what can we do to heal it?

A seemingly straightforward prescription is that teenagers should choose to spend less time on their phone, and their parents should choose to invite more friends over for dinner. But in a way, these are collective-action problems. A teenager is more likely to get out of the house if his classmates have already made a habit of hanging out. That teen’s parents are more likely to host if their neighbors have also made a habit of weekly gatherings. There is a word for such deeply etched communal habits: rituals. And one reason, perhaps, that the decline of socializing has synchronized with the decline of religion is that nothing has proved as adept at inscribing ritual into our calendars as faith.

“I have a view that is uncommon among social scientists, which is that moral revolutions are real and they change our culture,” Robert Putnam told me. In the early 20th century, a group of liberal Christians, including the pastor Walter Rauschenbusch, urged other Christians to expand their faith from a narrow concern for personal salvation to a public concern for justice. Their movement, which became known as the Social Gospel, was instrumental in passing major political reforms, such as the abolition of child labor. It also encouraged a more communitarian approach to American life, which manifested in an array of entirely secular congregations that met in union halls and community centers and dining rooms. All of this came out of a particular alchemy of writing and thinking and organizing. No one can say precisely how to change a nation’s moral-emotional atmosphere, but what’s certain is that atmospheres do change. Our smallest actions create norms. Our norms create values. Our values drive behavior. And our behaviors cascade.

The anti-social century is the result of one such cascade, of chosen solitude, accelerated by digital-world progress and physical-world regress. But if one cascade brought us into an anti-social century, another can bring about a social century. New norms are possible; they’re being created all the time. Independent bookstores are booming—the American Booksellers Association has reported more than 50 percent growth since 2009—and in cities such as New York City and Washington, D.C., many of them have become miniature theaters, with regular standing-room-only crowds gathered for author readings. More districts and states are banning smartphones in schools, a national experiment that could, optimistically, improve children’s focus and their physical-world relationships. In the past few years, board-game cafés have flowered across the country, and their business is expected to nearly double by 2030. These cafés buck an 80-year trend. Instead of turning a previously social form of entertainment into a private one, they turn a living-room pastime into a destination activity. As sweeping as the social revolution I’ve described might seem, it’s built from the ground up by institutions and decisions that are profoundly within our control: as humble as a café, as small as a new phone locker at school.

When Epley and his lab asked Chicagoans to overcome their preference for solitude and talk with strangers on a train, the experiment probably didn’t change anyone’s life. All it did was marginally improve the experience of one 15-minute block of time. But life is just a long set of 15-minute blocks, one after another. The way we spend our minutes is the way we spend our decades. “No amount of research that I’ve done has changed my life more than this,” Epley told me. “It’s not that I’m never lonely. It’s that my moment-to-moment experience of life is better, because I’ve learned to take the dead space of life and make friends in it.”

This article appears in the February 2025 print edition with the headline “The Anti-Social Century.”

How Hitler Dismantled a Democracy in 53 Days

The Atlantic

https://www.theatlantic.com/ideas/archive/2025/01/hitler-germany-constitution-authoritarianism/681233/

Ninety-two years ago this month, on Monday morning, January 30, 1933, Adolf Hitler was appointed the 15th chancellor of the Weimar Republic. In one of the most astonishing political transformations in the history of democracy, Hitler set about destroying a constitutional republic through constitutional means. What follows is a step-by-step account of how Hitler systematically disabled and then dismantled his country’s democratic structures and processes in less than two months’ time—specifically, one month, three weeks, two days, eight hours, and 40 minutes. The minutes, as we will see, mattered.

Hans Frank served as Hitler’s private attorney and chief legal strategist in the early years of the Nazi movement. While later awaiting execution at Nuremberg for his complicity in Nazi atrocities, Frank commented on his client’s uncanny capacity for sensing “the potential weakness inherent in every formal form of law” and then ruthlessly exploiting that weakness. Following his failed Beer Hall Putsch of November 1923, Hitler had renounced trying to overthrow the Weimar Republic by violent means but not his commitment to destroying the country’s democratic system, a determination he reiterated in a Legalitätseid—“legality oath”—before the Constitutional Court in September 1930. Invoking Article 1 of the Weimar constitution, which stated that the government was an expression of the will of the people, Hitler informed the court that once he had achieved power through legal means, he intended to mold the government as he saw fit. It was an astonishingly brazen statement.

“So, through constitutional means?” the presiding judge asked.

“Jawohl!” Hitler replied.

By January 1933, the fallibilities of the Weimar Republic—whose 181-article constitution framed the structures and processes for its 18 federated states—were as obvious as they were abundant. Having spent a decade in opposition politics, Hitler knew firsthand how easily an ambitious political agenda could be scuttled. He had been co-opting or crushing right-wing competitors and paralyzing legislative processes for years, and for the previous eight months, he had played obstructionist politics, helping to bring down three chancellors and twice forcing the president to dissolve the Reichstag and call for new elections.

When he became chancellor himself, Hitler wanted to prevent others from doing unto him what he had done unto them. Though the vote share of his National Socialist party had been rising—in the election of September 1930, following the 1929 market crash, they had increased their representation in the Reichstag almost ninefold, from 12 delegates to 107, and in the July 1932 elections, they had more than doubled their mandate to 230 seats—they were still far from a majority. Their seats amounted to only 37 percent of the legislative body, and the larger right-wing coalition that the Nazi Party was a part of controlled barely 51 percent of the Reichstag, but Hitler believed that he should exercise absolute power: “37 percent represents 75 percent of 51 percent,” he argued to one American reporter, by which he meant that possessing the relative majority of a simple majority was enough to grant him absolute authority. But he knew that in a multiparty political system, with shifting coalitions, his political calculus was not so simple. He believed that an Ermächtigungsgesetz (“empowering law”) was crucial to his political survival. But passing such a law—which would dismantle the separation of powers, grant Hitler’s executive branch the authority to make laws without parliamentary approval, and allow Hitler to rule by decree, bypassing democratic institutions and the constitution—required the support of a two-thirds majority in the fractious Reichstag.
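The arithmetic behind Hitler's quip, using the shares given above: the Nazis' 37 percent of Reichstag seats, taken as a fraction of the coalition's 51 percent, is

\[
\frac{0.37}{0.51} \approx 0.725,
\]

or roughly three-quarters; the rounding up to "75 percent" is Hitler's, not the arithmetic's.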

The process proved to be even more challenging than anticipated. Hitler found his dictatorial intentions getting thwarted within his first six hours as chancellor. At 11:30 that Monday morning, he swore an oath to uphold the constitution, then went across the street to the Hotel Kaiserhof for lunch, then returned to the Reich Chancellery for a group photo of the “Hitler Cabinet,” which was followed by his first formal meeting with his nine ministers at precisely 5 o’clock.

Hitler opened the meeting by boasting that millions of Germans had welcomed his chancellorship with “jubilation,” then outlined his plans for expunging key government officials and filling their positions with loyalists. At this point he turned to his main agenda item: the empowering law that, he argued, would give him the time (four years, according to the stipulations laid out in the draft of the law) and the authority necessary to make good on his campaign promises to revive the economy, reduce unemployment, increase military spending, withdraw from international treaty obligations, purge the country of foreigners he claimed were “poisoning” the blood of the nation, and exact revenge on political opponents. “Heads will roll in the sand,” Hitler had vowed at one rally.

[From the March 1932 issue: Hitler and Hitlerism: a man of destiny]

But given that Social Democrats and Communists collectively commanded 221 seats, or roughly 38 percent, of the 584-seat Reichstag, the two-thirds vote Hitler needed was a mathematical impossibility. “Now if one were to ban the Communist Party and annul their votes,” Hitler proposed, “it would be possible to reach a Reichstag majority.”

The problem, Hitler continued, was that this would almost certainly precipitate a national strike by the 6 million German Communists, which could, in turn, lead to a collapse of the country’s economy. Alternatively, Reichstag percentages could be rebalanced by holding new elections. “What represents a greater danger to the economy?” Hitler asked. “The uncertainties and concerns associated with new elections or a general strike?” Calling for new elections, he concluded, was the safer path.

Economic Minister Alfred Hugenberg disagreed. Ultimately, Hugenberg argued, if one wanted to achieve a two-thirds Reichstag majority, there was no way of getting around banning the Communist Party. Of course, Hugenberg had his own self-interested reasons for opposing new Reichstag elections: In the previous election, Hugenberg had siphoned 14 seats from Hitler’s National Socialists to his own party, the German Nationalists, making Hugenberg an indispensable partner in Hitler’s current coalition government. New elections threatened to lose his party seats and diminish his power.

When Hitler wondered whether the army could be used to crush any public unrest, Defense Minister Werner von Blomberg dismissed the idea out of hand, observing “that a soldier was trained to see an external enemy as his only potential opponent.” As a career officer, Blomberg could not imagine German soldiers being ordered to shoot German citizens on German streets in defense of Hitler’s (or any other German) government.

Hitler had campaigned on the promise of draining the “parliamentarian swamp”—den parlamentarischen Sumpf—only to find himself now foundering in a quagmire of partisan politics and banging up against constitutional guardrails. He responded as he invariably did when confronted with dissenting opinions or inconvenient truths: He ignored them and doubled down.

The next day, Hitler announced new Reichstag elections, to be held in early March, and issued a memorandum to his party leaders. “After a thirteen-year struggle the National Socialist movement has succeeded in breaking through into the government, but the struggle to win the German nation is only beginning,” Hitler proclaimed, and then added venomously: “The National Socialist party knows that the new government is not a National Socialist government, even though it is conscious that it bears the name of its leader, Adolf Hitler.” He was declaring war on his own government.

We have come to perceive Hitler’s appointment as chancellor as part of an inexorable rise to power, an impression resting on generations of postwar scholarship, much of which has necessarily marginalized or disregarded alternatives to the standard narrative of the Nazi seizure of power (Machtergreifung), with its political and social persecutions, its assertion of totalitarian rule (Gleichschaltung), and the subsequent aggressions that led to the Second World War and the nightmare of the Holocaust. In researching and writing this piece, I intentionally ignored these ultimate outcomes and instead traced events as they unfolded in real time, with their attendant uncertainties and misguided assessments. A case in point: The January 31, 1933, New York Times story on Hitler’s appointment as chancellor was headlined “Hitler Puts Aside Aim to Be Dictator.”

In the late 1980s, as a graduate student at Harvard, where I served as a teaching fellow in a course on Weimar and Nazi Germany, I used to cite a postwar observation, made by Hans Frank at Nuremberg, that underscored the tenuous nature of Hitler’s political career. “The Führer was a man who was possible in Germany only at that very moment,” the Nazi legal strategist recalled. “He came at exactly this terrible transitory period when the monarchy had gone and the republic was not yet secure.” Had Hitler’s predecessor in the chancellery, Kurt von Schleicher, remained in office another six months, or had German President Paul von Hindenburg exercised his constitutional powers more judiciously, or had a faction of moderate conservative Reichstag delegates cast their votes differently, then history might well have taken a very different turn. My most recent book, Takeover: Hitler’s Final Rise to Power, ends at the moment the story this essay tells begins. Both Hitler’s ascent to the chancellorship and his smashing of the constitutional guardrails once he got there, I have come to realize, are stories of political contingency rather than historical inevitability.

Hitler’s appointment as chancellor of the country’s first democratic republic came as almost as much of a surprise to him as it did to the rest of the country. After a vertiginous three-year political ascent, Hitler had taken a shellacking in the November 1932 elections, shedding 2 million votes and 34 Reichstag seats, almost half of them to Hugenberg’s German Nationalists. By December 1932, Hitler’s movement was bankrupt financially, politically, ideologically. Hitler told several close associates that he was contemplating suicide.

But a series of backroom deals that included the shock weekend dismissal of Chancellor Schleicher in late January 1933 hurtled Hitler into the chancellery. Schleicher would later remember Hitler telling him that “it was astonishing in his life that he was always rescued just when he himself had given up all hope.”


The eleventh-hour appointment came at a steep political price. Hitler had left several of his most loyal lieutenants as political roadkill on this unexpected fast lane to power. Worse, he found himself with a cabinet handpicked by a political enemy, former Chancellor Franz von Papen, whose government Hitler had helped topple and who now served as Hitler’s vice chancellor. Worst of all, Hitler was hostage to Hugenberg, who commanded 51 Reichstag votes along with the power to make or break Hitler’s chancellorship. He nearly broke it.

As President Hindenburg waited to receive Hitler on that Monday morning in January 1933, Hugenberg clashed with Hitler over the issue of new Reichstag elections. Hugenberg’s position: “Nein! Nein! Nein!” While Hitler and Hugenberg argued in the foyer outside the president’s office, Hindenburg, a military hero of World War I who had served as the German president since 1925, grew impatient. According to Otto Meissner, the president’s chief of staff, had the Hitler-Hugenberg squabble lasted another few minutes, Hindenburg would have left. Had this occurred, the awkward coalition cobbled together by Papen in the previous 48 hours would have collapsed. There would have been no Hitler chancellorship, no Third Reich.

In the event, Hitler was given a paltry two cabinet posts to fill—and none of the most important ones pertaining to the economy, foreign policy, or the military. Hitler chose Wilhelm Frick as minister of the interior and Hermann Göring as minister without portfolio. But with his unerring instinct for detecting the weaknesses in structures and processes, Hitler put his two ministers to work targeting the Weimar Republic’s key democratic pillars: free speech, due process, public referendum, and states’ rights.

Frick had responsibility for the republic’s federated system, as well as for the country’s electoral system and the press. He was the first minister to reveal the plans of Hitler’s government: “We will present an enabling law to the Reichstag that in accordance with the constitution will dissolve the Reich government,” Frick told the press, explaining that Hitler’s ambitious plans for the country required extreme measures, a position Hitler underscored in his first national radio address, on February 1. “The national government will therefore regard it as its first and supreme task to restore to the German people unity of mind and will,” Hitler said. “It will preserve and defend the foundations on which the strength of our nation rests.”

Frick was also charged with suppressing the opposition press and centralizing power in Berlin. While Frick was undermining states’ rights and imposing bans on left-wing newspapers—including the Communist daily The Red Banner and the Social Democratic Forward—Hitler also appointed Göring as acting state interior minister of Prussia, the federated state that represented two-thirds of German territory. Göring was tasked with purging the Prussian state police, the largest security force in the country after the army, and a bastion of Social Democratic sentiment.

Rudolf Diels was the head of Prussia’s political police. One day in early February, Diels was sitting in his office, at 76 Unter den Linden, when Göring knocked at his door and told him in no uncertain terms that it was time to clean house. “I want nothing to do with these scoundrels who are sitting around here in this place,” Göring said.

A Schiesserlass, or “shooting decree,” followed. This permitted the state police to shoot on sight without fearing consequences. “I cannot rely on police to go after the red mob if they have to worry about facing disciplinary action when they are simply doing their job,” Göring explained. He accorded them his personal backing to shoot with impunity. “When they shoot, it is me shooting,” Göring said. “When someone is lying there dead, it is I who shot them.”

Göring also designated the Nazi storm troopers as Hilfspolizei, or “deputy police,” compelling the state to provide the brownshirt thugs with sidearms and empowering them with police authority in their street battles. Diels later noted that this—manipulating the law to serve his ends and legitimizing the violence and excesses of tens of thousands of brownshirts—was a “well-tested Hitler tactic.”

As Hitler scrambled to secure power and crush the opposition, rumors circulated of his government’s imminent demise. One rumor held that Schleicher, the most recently deposed chancellor, was planning a military coup. Another said that Hitler was a puppet of Papen and a backwoods Austrian boy in the unwitting service of German aristocrats. Still others alleged that Hitler was merely a brownshirt strawman for Hugenberg and a conspiracy of industrialists who intended to dismantle worker protections for the sake of higher profits. (The industrialist Otto Wolff was said to have “cashed in” on his financing of Hitler’s movement.) Yet another rumor had it that Hitler was merely managing a placeholder government while President Hindenburg, a monarchist at heart, prepared for the return of the Kaiser.

There was little truth to any of this, but Hitler did have to confront the political reality of making good on his campaign promises to frustrated German voters in advance of the March Reichstag elections. The Red Banner published a list of Hitler’s campaign promises to workers, and the Center Party publicly demanded assurances that Hitler would support the agricultural sector, fight inflation, avoid “financial-political experiments,” and adhere to the Weimar constitution. At the same time, the dismay among right-wing supporters who had applauded Hitler’s earlier demand for dictatorial power and refusal to enter into a coalition was distilled in the pithy observation “No Third Reich, not even 2½.”

On February 18, the center-left newspaper Vossische Zeitung wrote that despite Hitler’s campaign promises and political posturing, nothing had changed for the average German. If anything, things had gotten worse. Hitler’s promise of doubling tariffs on grain imports had gotten tangled in complexities and contractual obligations. Hugenberg informed Hitler during a cabinet meeting that the “catastrophic economic conditions” were threatening the very “existence of the country.” “In the end,” Vossische Zeitung predicted, “the survival of the new government will rely not on words but on the economic conditions.” For all Hitler’s talk of a thousand-year Reich, there was no certainty his government would last the month.

In the eight months before appointing Hitler as chancellor, Hindenburg had dismissed three chancellors—Heinrich Brüning, Papen, and Schleicher—exercising the constitutional authority embedded in Article 53. And his disdain for Hitler was common knowledge. The previous August, he had declared publicly that, “for the sake of God, my conscience, and the country,” he would never appoint Hitler as chancellor. Privately, Hindenburg had quipped that if he were to appoint Hitler to any position, it would be as postmaster general, “so he can lick me from behind on my stamps.” In January, Hindenburg finally agreed to appoint Hitler, but with great reluctance—and on the condition that he never be left alone in a room with his new chancellor. By late February, the question on everyone’s mind was the one Forward posed: How much longer would the aging field marshal put up with his Bohemian corporal?

That Forward article appeared on Saturday morning, February 25, under the headline “How Long?” Two days later, on Monday evening, shortly before 9 p.m., the Reichstag erupted in flames, sheets of fire collapsing the glass dome of the plenary hall and illuminating the night sky over Berlin. Witnesses recalled seeing the fire from villages 40 miles away. The image of the seat of German parliamentary democracy going up in flames sent a collective shock across the country. The Communists blamed the National Socialists. The National Socialists blamed the Communists. A 23-year-old Dutch Communist, Marinus van der Lubbe, was caught in flagrante, but the Berlin fire chief, Walter Gempp, who supervised the firefighting operation, saw evidence of potential Nazi involvement.


When Hitler convened his cabinet to discuss the crisis the next morning, he declared that the fire was clearly part of a Communist coup attempt. Göring detailed Communist plans for further arson attacks on public buildings, as well as for the poisoning of public kitchens and the kidnapping of the children and wives of prominent officials. Interior Minister Frick presented a draft decree suspending civil liberties, permitting searches and seizures, and curbing states’ rights during a national emergency.

Papen expressed concern that the proposed draft “could meet with resistance,” especially from “southern states,” by which he meant Bavaria, which was second only to Prussia in size and power. Perhaps, Papen suggested, the proposed measures should be discussed with state governments to assure “an amicable agreement,” otherwise the measures could be seen as the usurpation of states’ rights. Ultimately, only one word was added to suggest contingencies for suspending a state’s rights. Hindenburg signed the decree into law that afternoon.

Put into effect just a week before the March elections, the emergency decree gave Hitler tremendous power to intimidate—and imprison—the political opposition. The Communist Party was banned (as Hitler had wanted since his first cabinet meeting), and members of the opposition press were arrested, their newspapers shut down. Göring had already been doing this for the past month, but the courts had invariably ordered the release of detained people. With the decree in effect, the courts could not intervene. Thousands of Communists and Social Democrats were rounded up.

On Sunday morning, March 5, one week after the Reichstag fire, German voters went to the polls. “No stranger election has perhaps ever been held in a civilized country,” Frederick Birchall wrote that day in The New York Times. Birchall expressed his dismay at the apparent willingness of Germans to submit to authoritarian rule when they had the opportunity for a democratic alternative. “In any American or Anglo-Saxon community the response would be immediate and overwhelming,” he wrote.

More than 40 million Germans went to the polls, over 2 million more than in any previous election and nearly 89 percent of registered voters—a stunning demonstration of democratic engagement. “Not since the German Reichstag was founded in 1871 has there been such a high voter turnout,” Vossische Zeitung reported. Most of those 2 million new votes went to the Nazis. “The enormous voting reserves almost entirely benefited the National Socialists,” the paper observed.

Although the National Socialists fell short of Hitler’s promised 51 percent, managing only 44 percent of the vote—despite massive suppression, the Social Democrats lost just a single Reichstag seat—the banning of the Communist Party positioned Hitler to form a coalition with the two-thirds Reichstag majority necessary to pass the enabling law.

The next day, the National Socialists stormed state-government offices across the country. Swastika banners were hung from public buildings. Opposition politicians fled for their lives. Otto Wels, the Social Democratic leader, departed for Switzerland. So did Heinrich Held, the minister-president of Bavaria. Tens of thousands of political opponents were taken into Schutzhaft (“protective custody”), a form of detention in which an individual could be held without cause indefinitely.

Hindenburg remained silent. He did not call his new chancellor to account for the violent public excesses against Communists, Social Democrats, and Jews. He did not exercise his Article 53 powers. Instead, he signed a decree permitting the National Socialists’ swastika banner to be flown beside the national colors. He acceded to Hitler’s request to create a new cabinet position, minister of public enlightenment and propaganda, a role promptly filled by Joseph Goebbels. “What good fortune for all of us to know that this towering old man is with us,” Goebbels wrote of Hindenburg in his diary, “and what a change of fate that we are now moving on the same path together.”

A week later, Hindenburg’s embrace of Hitler was on full public display. He appeared in military regalia in the company of his chancellor, who was wearing a dark suit and long overcoat, at a ceremony in Potsdam. The former field marshal and the Bohemian corporal shook hands. Hitler bowed in putative deference. The “Day of Potsdam” signaled the end of any hope for an Article 53 solution to the Hitler chancellorship.

That same Tuesday, March 21, an Article 48 decree was issued amnestying National Socialists convicted of crimes, including murder, perpetrated “in the battle for national renewal.” Men convicted of treason were now national heroes. The first concentration camp was opened that afternoon, in an old brewery near the town center of Oranienburg, just north of Berlin. The following day, the first group of detainees arrived at another concentration camp, in an abandoned munitions plant outside the Bavarian town of Dachau.

Plans for legislation excluding Jews from the legal and medical professions, as well as from government offices, were under way, though Hitler’s promise for the mass deportation of the country’s 100,000 Ostjuden, Jewish immigrants from Eastern Europe, was proving to be more complicated. Many had acquired German citizenship and were gainfully employed. As fear of deportation rose, a run on local banks caused other banks and businesses to panic. Accounts of Jewish depositors were frozen until, as one official explained, “they had settled their obligations with German business men.” Hermann Göring, now president of the newly elected Reichstag, sought to calm matters, assuring Germany’s Jewish citizens that they retained the same “protection of law for person and property” as every other German citizen. He then berated the international community: Foreigners were not to interfere with the domestic affairs of the country. Germany would do with its citizens whatever it deemed appropriate.

Adolf Hitler’s address to the Reichstag on March 23, 1933, at the Kroll Opera House. On this day, a majority of the delegates voted to eliminate almost all constitutional restraints on Hitler’s government. (Ullstein Bild / Getty)

On Thursday, March 23, the Reichstag delegates assembled in the Kroll Opera House, just opposite the charred ruins of the Reichstag. Inside, the traditional Reich eagle had been removed and replaced with an enormous Nazi eagle, dramatically backlit with wings spread wide and a swastika in its talons. Hitler, dressed now in a brown storm-trooper uniform with a swastika armband, arrived to pitch his proposed enabling law, now formally titled the “Law to Remedy the Distress of the People and the Reich.” At 4:20 p.m., he stepped up to the podium. Appearing uncharacteristically ill at ease, he shuffled a sheaf of pages before beginning to read haltingly from a prepared text. Only gradually did he assume his usual animated rhetorical style. He enumerated the failings of the Weimar Republic, then outlined his plans for the four-year tenure of his proposed enabling law, which included restoring German dignity and military parity abroad as well as economic and social stability at home. “Treason toward our nation and our people shall in the future be stamped out with ruthless barbarity,” Hitler vowed.


The Reichstag recessed to deliberate on the act. When the delegates reconvened at 6:15 that evening, the floor was given to Otto Wels, the Social Democratic leader, who had returned from his Swiss exile, despite fears for his personal safety, to challenge Hitler in person. As Wels began to speak, Hitler made a move to rise. Papen touched Hitler’s wrist to keep him in check.

“In this historic hour, we German Social Democrats solemnly pledge ourselves to the principles of humanity and justice, of freedom and socialism,” Wels said. He chided Hitler for seeking to undermine the Weimar Republic, and for the hatred and divisiveness he had sowed. Regardless of the evils Hitler intended to visit on the country, Wels declared, the republic’s founding democratic values would endure. “No enabling act gives you the power to destroy ideas that are eternal and indestructible,” he said.

Hitler rose. “The nice theories that you, Herr Delegate, just proclaimed are words that have come a bit too late for world history,” he began. He dismissed allegations that he posed any kind of threat to the German people. He reminded Wels that the Social Democrats had had 13 years to address the issues that really mattered to the German people—employment, stability, dignity. “Where was this battle during the time you had the power in your hand?” Hitler asked. The National Socialist delegates, along with observers in the galleries, cheered. The rest of the delegates remained still. Then, one after another, delegates rose to state their positions on, and reservations about, the proposed enabling law.

The Centrists, as well as the representatives of the Bavarian People’s Party, said they were willing to vote yes despite reservations “that in normal times could scarcely have been overcome.” Similarly, Reinhold Maier, the leader of the German State Party, expressed concern about what would happen to judicial independence, due process, freedom of the press, and equal rights for all citizens under the law, and stated that he had “serious reservations” about according Hitler dictatorial powers. But then he announced that his party, too, was voting in favor of the law, eliciting laughter from the floor.

Shortly before 8 o’clock that evening, the voting was completed. The 94 Social Democratic delegates who were in attendance cast their votes against the law. (Among the Social Democrats was the former interior minister of Prussia, Carl Severing, who had been arrested earlier in the day as he was about to enter the Reichstag but was released temporarily in order to cast his vote.) The remaining Reichstag delegates, 441 in all, voted in favor of the new law, delivering Hitler a four-fifths majority, more than enough to put the enabling law into effect without amendment or restriction. The next morning, U.S. Ambassador Frederic Sackett sent a telegram to the State Department: “On the basis of this law the Hitler Cabinet can reconstruct the entire system of government as it eliminates practically all constitutional restraints.”

Joseph Goebbels, who was present that day as a National Socialist Reichstag delegate, would later marvel that the National Socialists had succeeded in dismantling a federated constitutional republic entirely through constitutional means. Five years earlier, in 1928, after being elected to the Reichstag as one of the first 12 National Socialist delegates, Goebbels had been similarly struck: He was surprised to discover that he and these 11 other men (including Hermann Göring and Hans Frank), seated in a single row on the periphery of a plenary hall in their brown uniforms with swastika armbands, had—even as self-declared enemies of the Weimar Republic—been accorded free first-class train travel and subsidized meals, along with the capacity to disrupt, obstruct, and paralyze democratic structures and processes at will. “The big joke on democracy,” he observed, “is that it gives its mortal enemies the means to its own destruction.”