American Democracy Requires a Conservative Party

The Atlantic

www.theatlantic.com/newsletters/archive/2023/09/america-us-democracy-conservative-party/675463

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Every nation needs parties of the left and the right, but America’s conservative party has collapsed—and its absence will undermine the recovery of American democracy even when Donald Trump is gone.

First, here are four new stories from The Atlantic:

So much for “learn to code”

Where the new identity politics went wrong

The origins of the socialist slur

The coming attack on an essential element of women’s freedom

The Danger That Will Outlast Trump

The American right has been busy the past few days. The Republicans in Congress are at war with one another over a possible government shutdown that most of them don’t really want. Representative Paul Gosar of Arizona (channeling the warden from The Shawshank Redemption, apparently) railed about “quislings” such as the “sodomy-promoting” Mark Milley, the chairman of the Joint Chiefs of Staff, and said he should be hanged. Gosar, of course, was merely backing up a similar attack from the likely GOP presidential nominee Donald Trump, who over the weekend floated the idea of executing Milley and swore to use government power to investigate a major television network for “treason.”

Normally, this is the kind of carnival of abominable behavior that would lead me to ask—again—how millions of Americans not only tolerate but support such madness.

But today I’m going to ask a different question: Is this the future of “conservatism”? I admit that I am thinking about this because it’s also one of the questions I’m going to tackle with my colleagues David Frum, Helen Lewis, and Rebecca Rosen on Thursday in Washington, D.C., at The Atlantic Festival, our annual two-day gathering where we explore ideas and cultural trends with a roster of stellar guests.

Slightly more than a year ago, I tried to think through what being a conservative means in the current era of American politics. I have not been a Republican for several years, but I still describe myself as a conservative: I believe in public order as a prerequisite for politics; I respect tradition, and I am reluctant to acquiesce to change too precipitously; I think human nature is fixed rather than malleable; I am suspicious of centralized government power; I distrust mass movements. By contrast, I think most folks on the left would weigh social justice over abstract commitments to order, be more inclined to see traditions as obstacles to progress, and regard mass protests as generally positive forces.

This is hardly an exhaustive list of conservative views, and some on the right have taken issue with my approach. A young writer at National Review named Nate Hochman took me to task last year for fundamentally misunderstanding modern conservatism. Mr. Hochman, however, was apparently fired this summer from the Ron DeSantis campaign after he produced a campaign video that used Nazi symbolism, which suggests to me that I do, in fact, understand the modern conservative movement better than at least some of my critics might admit.

In any case, the immediate problem America faces is that it no longer has a center-right party that represents traditional conservatism, or even respects basic constitutional principles such as the rule of law. The pressing question for American democracy, then, is not so much the future of conservatism but the future of the Republican Party, another question our panel will discuss—and one that continually depresses me.

The United States, like any other nation, needs political parties that can represent views on the left and the right. The role of the state, the reach of the law, the allocation of social and economic resources—these are all inevitable areas of disagreement, and every functioning democracy needs parties that can contest these issues within the circumscribed limits of a democratic and rights-respecting constitution. Today’s Republican Party rarely exhibits such commitments to the rule of law, constitutionalism, or democracy itself.

The current GOP is not so much conservative as it is reactionary: Today’s right-wing voters are a loose movement of various groups, but especially of white men, obsessed with a supposedly better past in which they were not the aggrieved minority they see themselves as today. These reactionary voters, as I have written recently, are reflexively countercultural: They reject almost everything in the current social and political order because everything around them is the product of the hated now that has displaced the sacred then.

(Although many of my colleagues in academia and in the media see Trumpism as fascism, I remain reluctant to use that word … for now. I think it’s inaccurate at the present time, but I also believe the word has been overused for years and people tend to tune it out. I grant, however, that much of the current GOP has become an anti-constitutional leader cult built around Trump—perhaps one of the weakest and unlikeliest men in history to have such a following—and could become a genuinely fascist threat soon.)

America needs an actual conservative party, but it is unlikely to produce one in the near future. The movement around Trump will come to an end one way or another; as the writer Peter Sagal noted in The Atlantic after interviewing former members of various cults, “the icy hand of death” will end the Trump cult because it is primarily a movement of older people, and when they die out, “there will be no one, eventually, to replace them.” Although the cult around Trump will someday dissolve, the authoritarians his movement spawned will still be with us, and they will prevent the formation of a sensible center-right party in the United States.

Too many Americans remain complacent, believing that defeating Trump means defeating the entire threat to American democracy. As the Atlantic contributor Brian Klaas wrote yesterday, Trump’s threats on social media against Milley should have been the biggest story in the nation: “Instead, the post barely made the news.” Nor did Gosar’s obscene pile-on get more than a shrug.

Meanwhile, the New York Times opinion writer Michelle Cottle today profiled Ohio Senator J. D. Vance, a man who has called his opponents “degenerate liberals” and who is so empty of character that even Mitt Romney can’t stand him. Cottle, however, noted Vance’s cute socks, and ended with this flourish: “Mr. Trump’s Republican Party is something of a chaotic mess. Until it figures out where it is headed, a shape-shifting MAGA brawler who quietly works across the aisle on particular issues may be the best this party has to offer.”

Something of a mess? That’s one way to put it.

And what about Fox News, the source of continual toxic dumping into the American political ecosystem? “Fox News,” the Washington Post columnist Megan McArdle said yesterday, “does not have nearly as much power over viewers’ minds as progressives think. I am not cutting Fox any slack for amplifying Trump’s election lie nonsense. But I also doubt that it made that much of a difference.” Having traveled the country giving talks about misinformation and democracy for years, and hearing the same stories so many times of people who now find it impossible to talk to their own parents, I have no such doubts.

If Trump wins in 2024, worries about Fox’s influence or reflections on Vance’s adorable socks will seem trivial when Trump unleashes his narcissistic and lawless revenge on the American people. But even if he does not win, America cannot sustain itself without a functional and sane center-right party. So far, the apathy of the public, the fecklessness of the media, and the cynicism of Republican leaders mean that no such party is on the horizon.

Related:

The end will come for the cult of MAGA.

Trump floats the idea of executing Joint Chiefs Chairman Milley.

Today’s News

The Supreme Court ruled against an attempt by Alabama Republicans to retain a congressional map with only one majority-Black district.

The Federal Trade Commission and 17 states are suing Amazon in a broad antitrust lawsuit that accuses it of monopolistic practices.

An increasing number of Senate Democrats are calling for Senator Bob Menendez to resign from Congress following his federal indictment.

Evening Read

How We Got ‘Democracy Dies in Darkness’

By Martin Baron

I should not have been surprised, but I still marveled at just how little it took to get under the skin of President Donald Trump and his allies. By February 2019, I had been the executive editor of The Washington Post for six years. That month, the newspaper aired a one-minute Super Bowl ad, with a voice-over by Tom Hanks, championing the role of a free press, commemorating journalists killed and captured, and concluding with the Post’s logo and the message “Democracy dies in darkness.” The ad highlighted the strong and often courageous work done by journalists at the Post and elsewhere—including by Fox News’s Bret Baier—because we were striving to signal that this wasn’t just about us and wasn’t a political statement …

Even that simple, foundational idea of democracy was a step too far for the Trump clan. The president’s son Donald Trump Jr. couldn’t contain himself. “You know how MSM journalists could avoid having to spend millions on a #superbowl commercial to gain some undeserved credibility?” he tweeted with typical two-bit belligerence. “How about report the news and not their leftist BS for a change.”

Read the full article.

More From The Atlantic

A new Coca-Cola flavor at the end of the world

The Supreme Court needs to make a call on Trump’s eligibility.

The next supercontinent could be a terrible, terrible place.

Culture Break

Read. In Orphan Bachelors, Fae Myenne Ng explores the true cost of the Chinese Exclusion era through an aching account of her own family.

Watch. The Hulu series The Other Black Girl dramatizes the pains of managing Afro-textured hair—and other people’s perceptions of it.

Play our daily crossword.

P.S.

I’m off to The Atlantic Festival, so I’ll be brief today. But I’ll be back on Friday to talk about Barry Manilow, whom I saw this past week in Las Vegas as he broke Elvis Presley’s record for performances at the venerable Westgate Las Vegas Resort & Casino. If you’re, ah, ready to take a chance again, you might enjoy it, even now, especially as we’ll be talking about the old songs. All the time, until daybreak.

I’m sorry. I promise: no more Manilow puns. See you in a few days.

— Tom

Katherine Hu contributed to this newsletter.

When you buy a book using a link in this newsletter, we receive a commission. Thank you for supporting The Atlantic.

The Origins of the Socialist Slur

The Atlantic

www.theatlantic.com/ideas/archive/2023/09/american-socialism-racist-origins/675453

For years after World War II, the “liberal consensus”—the New Deal idea that the federal government had a role to play in regulating business, providing a basic social safety net, and promoting infrastructure—was a true consensus. It was so widely popular that in 1950, the critic Lionel Trilling wrote of the United States that “liberalism is not only the dominant but even the sole intellectual tradition.”

But the Supreme Court’s 1954 Brown v. Board of Education decision declaring segregation in public schools unconstitutional tied the federal government to ensuring not just economic equality, but also civil rights. Opponents of the liberal consensus argued that the newly active federal government was misusing tax dollars taken from hardworking white men to promote civil rights for “undeserving” Black people. The troops President Dwight Eisenhower sent to Little Rock Central High School in 1957, for example, didn’t come cheap. The government’s defense of civil rights redistributed wealth, they said, and so was virtually socialism.

[Read: An attempt to resegregate Little Rock, of all places]

This intersection of race and economics was not new to the second half of the 20th century. It reached back into the past to resurrect an argument made by former Confederates during the Reconstruction years to overturn federal protection of Black rights after the Civil War.

Some of today’s Republicans are in the process of making that argument a reality. Their insistence that all their opponents are socialists goes hand in hand with their effort to suppress Black and brown voting. When former President Donald Trump insists that the country has fallen to communism and “Marxists,” what he’s really saying is that a government in which racial minorities have a say is illegitimate.

The accusation of “socialism” had sharp teeth in the 1950s, as Americans recoiled from the growing influence of the Soviet Union and the rise of Communist China. But Republicans’ use of the word typically had little to do with actual, Bolshevik-style socialism. The theory that the people would rise up and take control of the means of production has never been popular in the United States. The best a Socialist Party candidate has ever done in an American presidential election was when Eugene V. Debs won about 6 percent of the popular vote in 1912.

Rather, in the United States, the political charge of socialism tended to carry a peculiar meaning, one forged in the white-supremacist backlash to Black civil rights in the 1870s.

During the Civil War, the Republicans in charge of the government both created national taxation and abolished legal slavery (except as punishment for crime). For the first time in U.S. history, voting in federal elections had a direct impact on people’s pocketbooks. Then, in 1867, Congress passed the Military Reconstruction Act, extending the vote to Black men in the South. White southerners who hated the idea of Black people using the vote to protect themselves started to terrorize their Black neighbors. Pretending to be the ghosts of dead Confederate soldiers, they dressed in white robes with hoods to cover their faces and warned formerly enslaved people not to show up at the polls.

But in 1870, Congress created the Department of Justice to enable the federal government to protect the right of Black men to vote. Attorney General Amos Akerman oversaw the prosecution of more than 3,000 members of the Ku Klux Klan, winning more than 1,000 convictions. Meanwhile, Congress passed laws to protect Black voting.

Suddenly, it was harder for white southerners to object to Black rights on racial grounds. So they turned to a new argument, one based in economics.

They did not want Black men voting, they said, because formerly enslaved people were poor, and they would vote for leaders who promised to build things such as roads and hospitals. Those public investments could be paid for only with tax levies, and most of the people in the South with property after the war were white. Thus, although the infrastructure in which the southern legislatures were investing would help everyone, reactionaries claimed that Black voting amounted to a redistribution of wealth from white men to Black people, who wanted something for nothing.

Black voting was, one magazine insisted, “socialism in South Carolina.”

This argument that poor Black workers were dangerous socialists offered justification for former Confederates to block their Black neighbors from the polls, to read them out of American society, and ultimately to lynch them. It’s a peculiarly American version of “socialism,” and it might have been a historical anomaly had a small group of business leaders and southern racists not resurrected it in the 20th century as part of a deliberate effort to destroy the liberal consensus.

After World War II, most Republicans joined Democrats in believing that the federal government had to oversee business regulation, welfare programs, and infrastructure. They knew what businessmen would do to the economy unless they were checked; they had seen people homeless and hungry during the Depression.

And they scoffed at the notion that the New Deal system was a bad idea. They looked around at their homes, at their candy-colored cars that they drove on the new interstate highways built under what was then the biggest public-works project in U.S. history, and at their union-boosted paychecks in a nation with its highest gross domestic product ever, and they dismissed as a radical fringe the people trying to undermine this wildly successful system.

But the federal protection of civil rights added a new element to the liberal consensus that would threaten to tear it apart. Between 1967 and 1977, a North Carolina billboard urged people in “Klan Country” to “help fight Communism & Integration.”

The stagflation of the ’70s pushed middle-class Americans into higher tax brackets just when they needed their income most, and helped spread the sense that white tax dollars were being siphoned off to help racial minorities. As towns and governments tried to make up their declining funds with higher property taxes, angry property owners turned against the government. Republicans courted white workers by painting the Democrats as a party of grievance and special interests who simply wanted to pay off lazy Black supporters, rather than being interested in the good of America as a whole.

In 1976, former California Governor Ronald Reagan ran for president with the story of a “welfare queen” from the South Side of Chicago—code words for “Black”—who lived large on government benefits she stole. “She has 80 names, 30 addresses, 12 Social Security cards and is collecting veteran’s benefits on four non-existing deceased husbands,” Reagan claimed. “And she is collecting Social Security on her cards. She’s got Medicaid, getting food stamps, and she is collecting welfare under each of her names.” There was such a woman, but she was a dangerous criminal rather than a representative welfare recipient. Nonetheless, the story illustrated perfectly the idea that government involvement in the economy handed tax dollars to allegedly undeserving Black Americans.

Reagan suggested a solution to such corruption. In August 1980, he spoke to voters in Philadelphia, Mississippi, just a few miles from where, 16 years earlier, the civil-rights workers James Chaney, Andrew Goodman, and Michael Schwerner had been found murdered by members of the Ku Klux Klan as they registered Black voters during 1964’s Freedom Summer. There, Reagan echoed the former Confederates during Reconstruction: “I believe in states’ rights,” he said.

Reagan’s campaign invited voters to remember a time before Black and brown voices and women began to claim equal rights. His campaign passed out buttons and posters urging voters to “make America great again.”

Voters put Reagan in the White House, where his administration cut taxes and slashed spending on public welfare programs (while pouring money into defense spending, and tripling the national debt). In the name of preventing socialism, those cuts began the process of hollowing out the middle class.

In the years since 1981, wealth has moved dramatically upward. And yet the language linking socialism and minority voting has never stopped escalating.

Talk-radio hosts such as Rush Limbaugh insisted that socialism was creeping through America at the hands of Black Americans, “feminazis,” and liberals. After its founding in 1996, the Fox News Channel joined the chorus of those claiming that their political opponents were socialists trying to wreck the country. Republicans insisted that Barack Obama was a full-fledged socialist, and in 2018, Trump’s White House Council of Economic Advisers used the word socialism 144 times in a 72-page report attacking Democratic politicians. Trump’s press release for the report read: “Congressional Democrats Want to Take Money From Hardworking Americans to Fund Failed Socialist Policies.”

There is a long-standing fight over whether support for the modern-day right is about taxes or race. The key is that it is about taxes and race at the same time: Since Reconstruction, white supremacists have argued that minority voting means socialism, and that true Americans stand against both. In recent history, that argument has led Republican-dominated state legislatures to make voting harder for people of color, and to rig the system through gerrymandering. Three years ago it led Trump and his supporters to try to overturn the results of a presidential election to keep their opponents out of power. They believed, and insist they still believe, that they had to destroy the government in order to save it.

This article is adapted from Democracy Awakening: Notes on the State of America.

When Black Hair Becomes a Horror Story

The Atlantic

www.theatlantic.com/culture/archive/2023/09/the-other-black-girl-black-hair/675456

In the 1989 surrealist satire Chameleon Street, two Black men bicker after one says that he prefers women with light skin and “good hair.” After being criticized for the comment, the man makes a self-deprecating joke: “I’m a victim, brotha. I’m a victim of 400 years of conditioning. The Man has programmed my conditioning. Even my conditioning has been conditioned.” Nearly a decade later, the rap duo Black Star would sample the dialogue at the beginning of their song “Brown Skin Lady,” which is framed as a rebuke of this pervasive bias against dark skin and kinkier hair, and an ode to an idealized vision of a head-wrap-donning natural woman whose “skin’s the inspiration for cocoa butter.”

Cocoa butter, a popular component of hair and beauty products targeted at Black women, is an essential ingredient in The Other Black Girl, a new Hulu series based on the 2021 office-novel-slash-surreal-thriller by Zakiya Dalila Harris. The story follows Nella Rogers (played by Sinclair Daniel), a 26-year-old assistant at a New York publishing house where almost all of her co-workers are white. One day, the sweet, muted chocolate scent of cocoa butter wafts toward Nella’s cubicle; she’s soon introduced to her cool new Black colleague, Hazel (Ashleigh Murray), who’s just been hired. But Nella’s initial excitement soon transitions into fear as she realizes that something sinister is hiding beneath Hazel’s head wraps. It turns out that Hazel is a member of a group of young, professional Black women who all use a magical hair grease—one that helps deaden the stresses of corporate racism. Hazel, whom the group calls its “Lead Conditioner,” likens it to “CBD for the soul”; her arrival at Wagner Books is a recruiting mission to force the personality-changing pomade onto Nella, so they can add a future book editor to their ranks.

For more than a century, Black writers (and, later, filmmakers) have been sublimating the worst chapters of American history into horror, science fiction, and other speculative works. These genres afford creators the freedom to embellish, reimagine, and comment on social ills by manipulating fear of real phenomena. In the context of horror, disembodied hair—or the wild hair of an unruly character—can elicit particularly visceral reactions. (There’s a reason that one specific image comes to mind when you think of The Ring.) The fraught history of Black hair in the United States provides no shortage of inspiration—not just the way it’s been legally policed, but also the mind-numbing pain of a scalp burn caused by chemical relaxer left in too long, or the headaches that come with tight braids. Taming Black hair can be a haunting endeavor, and works such as The Other Black Girl have used these real-world anxieties as a launchpad for more fantastical stories.

The Hulu adaptation is one of several recent productions that use elements of horror and speculative fiction to dramatize the liabilities of managing Black hair, especially in the workplace. In They Cloned Tyrone, a sci-fi mystery film released earlier this year, the protagonists discover an underground lab where an Afro-sporting white scientist has been conducting behavioral experiments on Black people. To inure Black women to the injustice around them, the nefarious entity has been adding a mind-controlling substance to the chemical relaxers they use to straighten their hair.

A similar plot device appears in the 2020 film Bad Hair, a horror satire set in 1989 Los Angeles, where a production assistant gives in to corporate pressure to ditch her natural Afro-textured hairstyles and get a long, silky weave. With her palatable new tresses, she finally gets considered for the TV hosting gig she’s been working toward for years, but her luck changes when her weave overpowers her—literally—and sets off a bloodthirsty rampage. Or take the 2018 horror-comedy short Hair Wolf, a modern vampire story set in a Black hair salon. Directed by Mariama Diallo (who also directed two episodes of The Other Black Girl), the film follows a white influencer obsessed with Black cultural signifiers who insists on getting “boxer braids”—and whose leeching presence starts changing the appearance of the salon’s stylists.

Though these genre works vary in tone and skillfulness, they’re all rooted in the same historical reality: For centuries, Black hair has been surveilled, stigmatized, and even banned from public view by laws such as Louisiana’s 18th-century tignon law, which mandated that Creole women of color cover their hair with a scarf “as a visible sign of belonging to the slave class, whether they were enslaved or not.” After the Civil Rights Act of 1964 banned employment discrimination based on race, Black workers began fighting for their right to wear their natural hair without employer retaliation.

Some of these struggles continue today: Because of their hairstyles, Black students have been dismissed from school activities or barred from walking in graduation ceremonies with their classmates; Black job candidates have had employment offers rescinded. At the same time, some social progress has been achieved at the statehouse: Beginning with California in 2019, the CROWN Act (which stands for “Creating a Respectful and Open World for Natural Hair”) and similar bills have been passed in 23 states, making this form of discrimination illegal. Section I of the California law begins with an acknowledgment that the “history of our nation is riddled with laws and societal norms that equated ‘blackness,’ and the associated physical traits, for example, dark skin, kinky and curly hair to a badge of inferiority, sometimes subject to separate and unequal treatment.”

[Read: When ‘good hair’ hurts]

Hulu’s The Other Black Girl immediately introduces hair as a locus of its characters’ private unease (whereas in the novel, the anesthetizing hair serum isn’t introduced until nearly two-thirds of the way through). In its opening scene, a meek-looking Black woman tries to escape an unseen threat at the Wagner Books office in 1988. As she awaits the elevator in a panic, she reaches through her full, mostly straight hair to scratch her scalp. By the time she makes it onto the subway, she’s rubbed her skin raw, and her fingers emerge from her hair covered in blood. This is the work environment that Nella Rogers, with her Afro and her anxiety, will enter 35 years later—the hunting ground where Hazel will attempt to draw Nella into her cocoa-butter coup.

Hazel, whom the white higher-ups at Wagner seem to love as soon as they meet her, doesn’t look quite like the stereotypical “office pet” Black woman of TV shows past. Hazel sports faux dreadlocks, not straight hair of any kind. They’re often piled high atop her head, a wrap holding them in place. Her styling is decidedly modern, vaguely Afrocentric; she projects the sort of effortlessly chic authenticity that Nella, who keeps her hair in a simple Afro, longs for.

The Other Black Girl is at its best when it treats these differences between Nella and Hazel with humor. Nella’s friend, Malaika (Brittany Adebumola), for instance, is a Rihanna-loving style chameleon who judges Nella’s hair and attire with as much vigor as she questions the eerie plot unfolding at Wagner. While Malaika chaperones Nella at a “hair party” in Hazel’s Harlem brownstone, she tries to figure out what’s in the product that Hazel wants to use to braid Nella’s hair. After Hazel declines to answer, Malaika chastises her gullible friend for going along with the plan. “Girl, I taught you better than that,” Malaika says to Nella. “You are on a hair-care journey, and you’re gonna throw it out the window for some unknown ingredients?”

These comic moments recall the witty asides that peppered the show’s influences, most notably Get Out and Scandal. They’re also particularly engaging because the series is pretty light on thriller elements—and because they don’t feel bogged down by explanation. These scenes suggest that the show trusts its viewers to already know that natural hair care usually really is a journey. They reminded me of a bit in the shape-shifting sketch-comedy series Random Acts of Flyness, whose first season featured an episode in which a white judge sentences an anthropomorphic textured wig for offenses including “general badness,” “a tendency to split ends,” and “criminal damage to a perfectly functional plastic comb.” Spoofs like that sketch are especially refreshing because they know how exhausting such conversations about “good hair” can be. The sketch addresses a painful, sometimes dangerous form of discrimination, but the absurdity of its visuals and the confidence of its writing keep it feeling inventive.

The Other Black Girl doesn’t quite succeed at threading its disparate styles into one cohesive series. But the end of the season suggests that a second chapter could land with a little more finesse. In the show’s final scenes, when Nella seems to have acquiesced to the cocoa-butter conspiracy, we see her at Wagner rocking a long, silky black wig. Her co-workers are in awe of the newly minted editor’s empowered disposition, but behind the closed door of her fancy solo office, Nella smirks slyly. She’s in on the secret now, and she’s going to have some fun. What she’ll do as an undercover Conditioner is anybody’s guess.

How We Got ‘Democracy Dies in Darkness’

The Atlantic

www.theatlantic.com/magazine/archive/2023/11/washington-post-editor-journalism-covering-trump/675438

I should not have been surprised, but I still marveled at just how little it took to get under the skin of President Donald Trump and his allies. By February 2019, I had been the executive editor of The Washington Post for six years. That month, the newspaper aired a one-minute Super Bowl ad, with a voice-over by Tom Hanks, championing the role of a free press, commemorating journalists killed and captured, and concluding with the Post’s logo and the message “Democracy dies in darkness.” The ad highlighted the strong and often courageous work done by journalists at the Post and elsewhere—including by Fox News’s Bret Baier—because we were striving to signal that this wasn’t just about us and wasn’t a political statement.

“There’s someone to gather the facts,” Hanks said in the ad. “To bring you the story. No matter the cost. Because knowing empowers us. Knowing helps us decide. Knowing keeps us free.”

Even that simple, foundational idea of democracy was a step too far for the Trump clan. The president’s son Donald Trump Jr. couldn’t contain himself. “You know how MSM journalists could avoid having to spend millions on a #superbowl commercial to gain some undeserved credibility?” he tweeted with typical two-bit belligerence. “How about report the news and not their leftist BS for a change.”

Two years earlier—a month into Trump’s presidency—the Post had affixed “Democracy dies in darkness” under its nameplate on the printed newspaper, as well as at the top of its website and on everything it produced. As the newspaper’s owner, Jeff Bezos, envisioned it, this was not a slogan but a “mission statement.” And it was not about Trump, although his allies took it to be. Producing a mission statement had been in the works for two years before Trump took office. That it emerged when it did is testimony to the tortuous, and torturous, process of coming up with something sufficiently memorable and meaningful that Bezos would bless.

Bezos, the founder and now executive chair of Amazon, had bought The Washington Post in 2013. In early 2015, he had expressed his wish for a phrase that might encapsulate the newspaper’s purpose: a phrase that would convey an idea, not a product; fit nicely on a T-shirt; make a claim uniquely ours, given our heritage and our base in the nation’s capital; and be both aspirational and disruptive. “Not a paper I want to subscribe to,” as Bezos put it, but rather “an idea I want to belong to.” The idea: We love this country, so we hold it accountable.

No small order, coming up with the right phrase. And Bezos was no distant observer. “On this topic,” he told us, “I’d like to see all the sausage-making. Don’t worry about whether it’s a good use of my time.” Bezos, so fixated on metrics in other contexts, now advised ditching them. “I just think we’re going to have to use gut and intuition.” And he insisted that the chosen words recognize our “historic mission,” not a new one. “We don’t have to be afraid of the democracy word,” he said; it’s “the thing that makes the Post unique.”

Staff teams were assembled. Months of meetings were held. Frustrations deepened. Outside branding consultants were retained, to no avail. (“Typical,” Bezos said.) Desperation led to a long list of options, venturing into the inane. The ideas totaled at least 1,000: “A bias for truth,” “Know,” “A right to know,” “You have a right to know,” “Unstoppable journalism,” “The power is yours,” “Power read,” “Relentless pursuit of the truth,” “The facts matter,” “It’s about America,” “Spotlight on democracy,” “Democracy matters,” “A light on the nation,” “Democracy lives in light,” “Democracy takes work. We’ll do our part,” “The news democracy needs,” “Toward a more perfect union” (rejected lest it summon thoughts of our own workforce union).

By September 2016, an impatient Bezos was forcing the issue. We had to settle on something. Nine Post executives and Bezos met in a private room at the Four Seasons in Georgetown to finally get over the finish line. Because of Bezos’s tight schedule, we had only half an hour, starting at 7:45 a.m. A handful of options remained on the table: “A bright light for a free people” or, simply, “A bright light for free people”; “The story must be told” (recalling the inspiring words of the late photographer Michel du Cille); “To challenge and inform”; “For a world that demands to know”; “For people who demand to know.” None of those passed muster.

In the end, we settled on “A free people demand to know” (subject to a grammar check by our copy desk, which gave its assent). Success was short-lived—mercifully, no doubt. Late that evening, Bezos dispatched an email in the “not what you’re hoping for category,” as he put it. He had run our consensus pick by his then-wife, MacKenzie Scott, a novelist and “my in-house wordsmith,” who had pronounced the phrase clunky. “Frankenslogan” was the word she used.

By then, we needed Bezos to take unilateral action. Finally, he did. “Let’s go with ‘Democracy dies in darkness,’ ” he decreed. It had been on our list from the start, and was a phrase Bezos had used previously in speaking of the Post’s mission; he himself had heard it from the Washington Post legend Bob Woodward. It was a twist on a phrase in a 2002 ruling by the federal-appellate-court judge Damon J. Keith, who wrote that “democracies die behind closed doors.”

“Democracy dies in darkness” made its debut, without announcement, in mid-February 2017. And I’ve never seen a slogan—I mean, mission statement—get such a reaction. It even drew attention from People’s Daily in China, which tweeted, “ ‘Democracy dies in darkness’ @washingtonpost puts on new slogan, on the same day @realDonaldTrump calls media as the enemy of Americans.” Merriam-Webster reported a sudden surge in searches for the word democracy. The Late Show host Stephen Colbert joked that some of the rejected phrases had included “No, you shut up” and “We took down Nixon—who wants next?” Twitter commentators remarked on the Post’s “new goth vibe.” The media critic Jack Shafer tweeted a handful of his own “rejected Washington Post mottos,” among them “We’re really full of ourselves” and “Democracy Gets Sunburned If It Doesn’t Use Sunscreen.”

Bezos couldn’t have been more thrilled. The mission statement was getting noticed. “It’s a good sign when you’re the subject of satire,” he said a couple of weeks later. The four words atop our journalism had certainly drawn attention to our mission. Much worse would have been a collective shrug. Like others at the Post, I had questioned the wisdom of branding all our work with death and darkness. All I could think of at that point, though, was the Serenity Prayer: “God grant me the serenity to accept the things I cannot change.”

But the phrase stuck with readers, who saw it as perfect for the Trump era, even if that was not its intent.

The Post’s publisher, Fred Ryan, speaks to the newsroom as the staff celebrates winning a Pulitzer Prize in 2016. (Chip Somodevilla / Getty)

We must have been an odd-looking group, sitting around the dining-room table in the egg-shaped Blue Room of the White House: Bezos, recognizable anywhere by his bald head, short stature, booming laugh, and radiant intensity; Fred Ryan, the Post’s publisher, an alumnus of the Reagan administration who was a head taller than my own 5 feet 11 inches, with graying blond hair and a giant, glistening smile; the editorial-page editor, Fred Hiatt, a 36-year Post veteran and former foreign correspondent with an earnest, bookish look; and me, with a trimmed gray beard, woolly head of hair, and what was invariably described as a dour and taciturn demeanor.

Five months after his inauguration, President Trump had responded to a request from the publisher for a meeting, and had invited us to dinner. We were joined by the first lady, Melania Trump, and Trump’s son-in-law and senior adviser, Jared Kushner. By coincidence, just as we were sitting down, at 7 p.m., the Post published a report that Special Counsel Robert Mueller was inquiring into Kushner’s business dealings in Russia, part of Mueller’s investigation into that country’s interference in the 2016 election. The story followed another by the Post revealing that Kushner had met secretly with the Russian ambassador, Sergey Kislyak, and had proposed that a Russian diplomatic post be used to provide a secure communications line between Trump officials and the Kremlin. The Post had reported as well that Kushner met later with Sergey Gorkov, the head of a Russian-owned development bank.

Hope Hicks, a young Trump aide, handed Kushner her phone. Our news alert had just gone out, reaching millions of mobile devices, including hers. “Very Shakespearean,” she whispered to Kushner. “Dining with your enemies.” Hiatt, who had overheard, whispered back, “We’re not your enemies.”

[Read: Trump’s war against the media isn’t a war]

As we dined on cheese soufflé, pan-roasted Dover sole, and chocolate-cream tart, Trump crowed about his election victory, mocked his rivals and even people in his own orbit, boasted of imagined accomplishments, calculated how he could win yet again in four years, and described The Washington Post as the worst of all media outlets, with The New York Times just behind us in his ranking in that moment.

Trump, his family, and his team had put the Post on their enemies list, and nothing was going to change anyone’s mind. We had been neither servile nor sycophantic toward Trump, and we weren’t going to be. Our job was to report aggressively on the president and to hold his administration, like all others, to account. In the mind of the president and those around him, that made us the opposition.

There was political benefit to Trump in going further: We were not just his enemy—we were the country’s enemy. In his telling, we were traitors. Less than a month into his presidency, Trump had denounced the press as “the enemy of the American People” on Twitter. It was an ominous echo of the phrase “enemy of the people,” invoked by Joseph Stalin, Mao Zedong, and Hitler’s propagandist, Joseph Goebbels, and deployed for the purpose of repression and murder. Trump could not have cared less about the history of such incendiary language or how it might incite physical attacks on journalists.

Whenever I was asked about Trump’s rhetoric, my own response was straightforward: “We are not at war with the administration. We are at work.” But it was clear that Trump saw all of us at that table as his foes, most especially Bezos, because he owned the Post and, in Trump’s mind, was pulling the strings—or could pull them if he wished.

At our dinner, Trump sought at times to be charming. It was a superficial charm, without warmth or authenticity. He did almost all the talking. We scarcely said a word, and I said the least, out of discomfort at being there and seeking to avoid any confrontation with him over our coverage. Anything I said could set him off.

He let loose on a long list of perceived enemies and slights: The chief executive of Macy’s was a “coward” for pulling Trump products from store shelves in reaction to Trump’s remarks portraying Mexican immigrants as rapists; he would have been picketed by only “20 Mexicans. Who cares?” Trump had better relations with foreign leaders than former President Barack Obama, who was lazy and never called them. Obama had left disasters around the world for him to solve. Obama had been hesitant to allow the military to kill people in Afghanistan. He, Trump, told the military to just do it; don’t ask for permission. Mueller, Attorney General Jeff Sessions, the fired FBI director James Comey, and FBI Deputy Director Andrew McCabe were slammed for reasons that are now familiar.

Two themes stayed with me from that dinner. First, Trump would govern primarily to retain the support of his base. At the table, he pulled a sheet of paper from his jacket pocket. The figure “47%” appeared above his photo. “This is the latest Rasmussen poll. I can win with that.” The message was clear: That level of support, if he held key states, was all he needed to secure a second term. What other voters thought of him, he seemed to say, would not matter.

Second, his list of grievances appeared limitless. Atop them all was the press, and atop the press was the Post. During dinner, he derided what he had been hearing about our story on the special counsel and his son-in-law, suggesting incorrectly that it alleged money laundering. “He’s a good kid,” he said of Kushner, who at the time was 36 and a father of three, and sitting right there at the table. The Post was awful, Trump said repeatedly. We treated him unfairly. With every such utterance, he poked me in the shoulder with his left elbow.

Baron’s office at the Post. (The Washington Post / Getty)

A few times during that dinner, Trump—for all the shots he had taken during the campaign at Bezos’s company—mentioned that Melania was a big Amazon shopper, prompting Bezos to joke at one point, “Consider me your personal customer-service rep.” Trump’s concern, of course, wasn’t Amazon’s delivery. He wanted Bezos to deliver him from the Post’s coverage.

The effort quickened the next day. Kushner called Fred Ryan in the morning to get his read on how the dinner had gone. After Ryan offered thanks for their generosity and graciousness with their time, Kushner inquired whether the Post’s coverage would now improve as a result. Ryan diplomatically rebuffed him with a reminder that there were to be no expectations about coverage. “It’s not a dial we have to turn one way to make it better and another way to make it worse,” he said.

Trump would be the one to call Bezos’s cellphone that same morning at eight, urging him to get the Post to be “more fair to me.” He said, “I don’t know if you get involved in the newsroom, but I’m sure you do to some degree.” Bezos replied that he didn’t and then delivered a line he’d been prepared to say at the dinner itself if Trump had leaned on him then: “It’s really not appropriate to … I’d feel really bad about it my whole life if I did.” The call ended without bullying about Amazon but with an invitation for Bezos to seek a favor. “If there’s anything I can do for you,” Trump said.

Three days later, the bullying began. Leaders of the technology sector gathered at the White House for a meeting of the American Technology Council, which had been created by executive order a month earlier. Trump briefly pulled Bezos aside to complain bitterly about the Post’s coverage. The dinner, he said, was apparently a wasted two and a half hours.

Then, later in the year, four days after Christmas, Trump in a tweet called for the Postal Service to charge Amazon “MUCH MORE” for package deliveries, claiming that Amazon’s rates were a rip-off of American taxpayers. The following year, he attempted to intervene to obstruct Amazon in its pursuit of a $10 billion cloud-computing contract from the Defense Department. Bezos was to be punished for not reining in the Post.

Meanwhile, Trump was salivating to have an antitrust case filed against Amazon. The hedge-fund titan Leon Cooperman revealed in a CNBC interview that Trump had asked him twice at a White House dinner that summer whether Amazon was a monopoly. On July 24, 2017, Trump tweeted, “Is Fake News Washington Post being used as a lobbyist weapon against Congress to keep Politicians from looking into Amazon no-tax monopoly?”

As Trump sought to tighten the screws, Bezos made plain that the paper had no need to fear that he might capitulate. In March 2018, as we concluded one of our business meetings, Bezos offered some parting words: “You may have noticed that Trump keeps tweeting about us.” The remark was met with silence. “Or maybe you haven’t noticed!” Bezos joked. He wanted to reinforce a statement I had publicly made before. “We are not at war with them,” Bezos said. “They may be at war with us. We just need to do the work.” In July of that year, he once again spoke up unprompted at a business meeting. “Do not worry about me,” he said. “Just do the work. And I’ve got your back.”

A huge advantage of Bezos’s ownership was that he had his eye on a long time horizon. In Texas, he was building a “10,000-year clock” in a hollowed-out mountain—intended as a symbol, he explained, of long-term thinking. He often spoke of what the business or the landscape might look like in “20 years.” When I first heard that timeline, I was startled. News executives I’d dealt with routinely spoke, at best, of next year—and, at worst, next quarter. Even so, Bezos also made decisions at a speed that was unprecedented in my experience. He personally owned 100 percent of the company. He didn’t need to consult anyone. Whatever he spent came directly out of his bank account.

[From the November 2019 issue: Franklin Foer on Jeff Bezos’s master plan]

In my interactions with him, Bezos showed integrity and spine. Early in his ownership, he displayed an intuitive appreciation that an ethical compass for the Post was inseparable from its business success. There was much about Bezos and Amazon that the Post needed to vigorously cover and investigate—such as his company’s escalating market power, its heavy-handed labor practices, and the ramifications for individual privacy of its voracious data collection. There was also the announcement that Bezos and MacKenzie Scott were seeking a divorce—followed immediately by an explosive report in the National Enquirer disclosing that Bezos had been involved in a long-running extramarital relationship with Lauren Sánchez, a former TV reporter and news anchor. We were determined to fulfill our journalistic obligations with complete independence, and did so without restriction.

I came to like the Post’s owner as a human being and found him to be a far more complex, thoughtful, and agreeable character than routinely portrayed. He can be startlingly easy to talk to: Just block out any thought of his net worth. Our meetings took place typically every two weeks by teleconference, and only rarely in person. During the pandemic, we were subjected to Amazon’s exasperatingly inferior videoconferencing system, called Chime. The one-hour meetings were a lesson in his unconventional thinking, wry humor (“This is me enthusiastic. Sometimes it’s hard to tell”), and fantastic aphorisms: “Most people start building before they know what they’re building”; “The things that everybody knows are going to work, everybody is already doing.” At one session, we were discussing group subscriptions for college students. Bezos wanted to know the size of the market. As we all started to Google, Bezos interjected, “Hey, why don’t we try this? Alexa, how many college students are there in the United States?” (Alexa pulled up the data from the National Center for Education Statistics.)

In conversation, Bezos could be witty and self-deprecating (“Nothing makes me feel dumber than a New Yorker cartoon”), laughed easily, and posed penetrating questions. When a Post staffer asked him whether he’d join the crew of his space company, Blue Origin, on one of its early launches, he said he wasn’t sure. “Why don’t you wait a while and see how things go?” I advised. “That,” he said, “is the nicest thing you’ve ever said about me.”

Science fiction—particularly Isaac Asimov, Robert Heinlein, Larry Niven—had a huge influence on Bezos in his teenage years. He has spoken of how his interest in space goes back to his childhood love of the Star Trek TV series. Star Trek inspired both the voice-activated Alexa and the name of his holding company, Zefram, drawn from the fictional character Zefram Cochrane, who developed “warp drive,” a technology that allowed space travel at faster-than-light speeds. “The reason he’s earning so much money,” his high-school girlfriend, Ursula Werner, said early in Amazon’s history, “is to get to outer space.”

Baron and the Post’s owner, Jeff Bezos, in 2016 (The Washington Post / Getty)

From the moment Bezos acquired the Post, he made clear that its historic journalistic mission was at the core of its business. I had been in journalism long enough to witness some executives—unmoored by crushing pressures on circulation, advertising, and profits—abandon the foundational journalistic culture, even shunning the vocabulary we use to describe our work. Many publishers took to calling journalism “content,” a term so hollow that I sarcastically advised substituting “stuff.” Journalists were recategorized as “content producers,” top editors retitled “chief content officers.” Bezos was a different breed.

He seemed to value and enjoy encounters with the news staff in small groups, even if they were infrequent. Once, at a dinner with some of the Post’s Pulitzer Prize winners, Bezos asked Carol Leonnig, who had won for exposing security lapses by the Secret Service, how she was able to get people to talk to her when the risks for them were so high. It had to be a subject of understandable curiosity for the head of Amazon, a company that routinely rebuffed reporters’ inquiries with “No comment.” Carol told him she was straightforward about what she sought and directly addressed individuals’ fears and motivations. The Post’s reputation for serious, careful investigative reporting, she told Bezos, carried a lot of weight with potential sources. They wanted injustice or malfeasance revealed, and we needed their help. The Post would protect their identity.

Anonymous leaking out of the government didn’t begin with the Trump administration. It has a long tradition in Washington. Leaks are often the only way for journalists to learn and report what is happening behind the scenes. If sources come forward publicly, they risk being fired, demoted, sidelined, or even prosecuted. The risks were heightened with a vengeful Trump targeting the so-called deep state, what he imagined to be influential government officials conspiring against him. The Department of Justice had announced early in his term that it would become even more aggressive in its search for leakers of classified national-security information. And Trump’s allies and supporters could be counted on to make life a nightmare for anyone who crossed him.

Journalists would much prefer to have government sources on the record, but anonymity has become an inextricable feature of Washington reporting. Though Trump-administration officials claimed to be unjust victims of anonymous sourcing, they were skillful practitioners and beneficiaries as well. The Trump administration was the leakiest in memory. Senior officials leaked regularly, typically as a result of internal rivalries. Trump himself leaked to get news out in a way that he viewed as helpful, just as he had done as a private citizen in New York.

Trump had assembled his government haphazardly, enlisting many individuals who had no relevant experience and no history of collaborating with one another—“kind of a crowd of misfit toys,” as Josh Dawsey, a White House reporter for the Post, put it to me. Some were mere opportunists. Many officials, as the Post’s Ashley Parker has observed, came to believe that working in the administration was like being a character in Game of Thrones: Better to knife others before you got knifed yourself. Odds were high that Trump would do the stabbing someday on his own. But many in government leaked out of principle. They were astonished to see the norms of governance and democracy being violated—and by the pervasive lying.

Trump’s gripes about anonymity weren’t based on the rigor of the reporting—or even, for that matter, its veracity. Leaks that reflected poorly on him were condemned as false, and the sources therefore nonexistent, even as he pressed for investigations to identify the supposedly nonexistent sources. With his followers’ distrust of the media, he had little trouble convincing them that the stories were fabrications by media out to get him—and them. Conflating his political self-interest with the public interest, he was prone to labeling the leaks as treasonous.

At the Post, the aim was to get at the facts, no matter the obstacles Trump and his allies put in our way. In January 2018, Dawsey reported that Trump, during a discussion with lawmakers about protecting immigrants from Haiti, El Salvador, and African countries as part of an immigration deal, asked: “Why are we having all these people from shithole countries come here?” In March, Dawsey, Leonnig, and David Nakamura reported that Trump had defied cautions from his national security advisers not to offer well-wishes to Russian President Vladimir Putin on winning reelection to another six-year term. “DO NOT CONGRATULATE,” warned briefing material that Trump may or may not have read. Such advice should have been unnecessary in the first place. After all, it had been anything but a fair election. Prominent opponents were excluded from the ballot, and much of the Russian news media are controlled by the state. “If this story is accurate, that means someone leaked the president’s briefing papers,” said a senior White House official who, as was common in an administration that condemned anonymous sources, insisted on anonymity.

To be sure, sources sometimes want anonymity for ignoble reasons. But providing anonymity is essential to legitimate news-gathering in the public interest. If any doubt remains as to why so many government officials require anonymity to come forward—and why responsible news outlets give them anonymity when necessary—the story of Trump’s famous phone call with Ukrainian President Volodymyr Zelensky offers an instructive case study.

In September 2019, congressional committees received a letter from Michael Atkinson, the inspector general for the intelligence community. A whistleblower had filed a complaint with him, he wrote, and in Atkinson’s assessment, it qualified as credible and a matter of “urgent concern”—defined as a “serious or flagrant problem, abuse or violation of the law or Executive Order” that involves classified information but “does not include differences of opinion concerning public policy matters.”

Soon, a trio of Post national-security reporters published a story that began to flesh out the contents of the whistleblower complaint. The article, written by Ellen Nakashima, Greg Miller, and Shane Harris, cited anonymous sources in reporting that the complaint involved “President Trump’s communications with a foreign leader.” The incident was said to revolve around a phone call.

Step by careful step, news organizations excavated the basic facts: In a phone call with Zelensky, Trump had effectively agreed to provide $250 million in military aid to Ukraine—approved by Congress, but inexplicably put on hold by the administration—only if Zelensky launched an investigation into his likely Democratic foe in the 2020 election, Joe Biden, and his alleged activities in Ukraine. This attempted extortion would lead directly to Trump’s impeachment, making him only the third president in American history to be formally accused by the House of Representatives of high crimes and misdemeanors.

The entire universe of Trump allies endeavored to have the whistleblower’s identity revealed—widely circulating a name—with the spiteful aim of subjecting that individual to fierce harassment and intimidation, or worse. Others who ultimately went public with their concerns, as they responded to congressional subpoenas and provided sworn testimony, became targets of relentless attacks and mockery.

Lieutenant Colonel Alexander Vindman of the National Security Council, who had listened in on the phone call as part of his job, became a central witness, implicating Trump during the impeachment hearings. He was fired after having endured condemnation from the White House and deceitful insinuations by Trump allies that he might be a double agent. Vindman’s twin brother, Yevgeny, an NSC staffer who had raised protests internally about Trump’s phone call with Zelensky, was fired too. Gordon Sondland—the hotelier and Trump donor who was the ambassador to the European Union and an emissary of sorts to Ukraine as well—was also fired. He had admitted in congressional testimony that there had been an explicit quid pro quo conditioning a Zelensky visit to the White House on a Ukrainian investigation of Biden. The Vindmans and Sondland were all dismissed within two days of Trump’s acquittal in his first impeachment trial. Just before their ousters, White House Press Secretary Stephanie Grisham had suggested on Fox News that “people should pay” for what Trump went through.

The acting Pentagon comptroller, Elaine McCusker, had her promotion rescinded, evidently for having merely questioned whether Ukraine aid could be legally withheld. She later resigned. Atkinson, the intelligence community’s inspector general, was fired as well, leaving with a plea for whistleblowers to “use authorized channels to bravely speak up—there is no disgrace for doing so.”

“The Washington Post is constantly quoting ‘anonymous sources’ that do not exist,” Trump had tweeted in 2018 in one of his familiar lines of attack. “Rarely do they use the name of anyone because there is no one to give them the kind of negative quote that they are looking for.” The Ukraine episode made it clear that real people with incriminating information existed in substantial numbers. If they went public, they risked unemployment. If they chose anonymity, as the whistleblower did, Trump and his allies would aim to expose them and have them publicly and savagely denounced.

“We are not at war with the administration. We are at work.” When I made that comment, many fellow journalists enthusiastically embraced the idea that we should not think of ourselves as warriors but instead as professionals merely doing our job to keep the public informed. Others came to view that posture as naive: When truth and democracy are under attack, the only proper response is to be more fiercely and unashamedly bellicose ourselves. One outside critic went so far as to label my statement an “atrocity” when, after my retirement, Fred Ryan, the Post’s publisher, had my quote mounted on the wall overlooking the paper’s national desk.

I believe that responsible journalists should be guided by fundamental principles. Among them: We must support and defend democracy. Citizens have a right to self-governance. Without democracy, there can be no independent press, and without an independent press, there can be no democracy. We must work hard and honestly to discover the truth, and we should tell the public unflinchingly what we learn. We should support the right of all citizens to participate in the electoral process without impediment. We should endorse free speech and understand that vigorous debate over policy is essential to democracy. We should favor equitable treatment for everyone, under the law and out of moral obligation, and abundant opportunity for all to attain what they hope for themselves and their families. We owe special attention to the least fortunate in our society, and have a duty to give voice to those who otherwise would not be heard. We must oppose intolerance and hate, and stand against violence, repression, and abuse of power.

I also believe journalists can best honor those ideals by adhering to traditional professional principles. The press will do itself and our democracy no favors if it abandons what have long been bedrock standards. Too many norms of civic discourse have been trampled. For the press to hold power to account today, we will have to maintain standards that demonstrate that we are practicing our craft honorably, thoroughly, and fairly, with an open mind and with a reverence for evidence over our own opinions. In short, we should practice objective journalism.

The idea of objective journalism has uncertain origins. But it can be traced to the early 20th century, in the aftermath of World War I, when democracy seemed imperiled and propaganda had been developed into a polished instrument for manipulating public opinion and the press during warfare—and, in the United States, for deepening suspicions about marginalized people who were then widely regarded as not fully American.

Baron and his Boston Globe colleagues react to winning the 2003 Pulitzer Prize for Public Service for the paper’s coverage of sexual abuse by priests in the Roman Catholic Church. (The Boston Globe / Getty)

The renowned journalist and thinker Walter Lippmann helped give currency to the term when he wrote Liberty and the News, published in 1920. In that slim volume, he described a time that sounds remarkably similar to today. “There is everywhere an increasingly angry disillusionment about the press, a growing sense of being baffled and misled,” he wrote. The onslaught of news was “helter-skelter, in inconceivable confusion.” The public suffered from “no rules of evidence.” He worried over democratic institutions being pushed off their foundations by the media environment.

[From the December 1919 issue: Walter Lippmann’s “Liberty and the News”]

Lippmann made no assumption that journalists could be freed of their own opinions. He assumed, in fact, just the opposite: They were as subject to biases as anyone else. He proposed an “objective” method for moving beyond them: Journalists should pursue “as impartial an investigation of the facts as is humanly possible.” That idea of objectivity doesn’t preclude the lie-detector role for the press; it argues for it. It is not an idea that fosters prejudice; it labors against it. “I am convinced,” he wrote, in a line that mirrors my own thinking, “that we shall accomplish more by fighting for truth than by fighting for our theories.”

In championing “objectivity” in our work, I am swimming against what has become, lamentably, a mighty tide in my profession of nearly half a century. No word seems more unpopular today among many mainstream journalists. A report in January 2023 by a former executive editor of The Washington Post, Leonard Downie Jr., and a former CBS News president, Andrew Heyward, argued that objectivity in journalism is outmoded. They quoted a former close colleague of mine: “Objectivity has got to go.”

Objectivity, in my view, has got to stay. Maintaining that standard does not guarantee the public’s confidence. But it increases the odds that journalists will earn it. The principle of objectivity has been under siege for years, but perhaps never more ferociously than during Trump’s presidency and its aftermath. Several arguments are leveled against it by my fellow journalists: None of us can honestly claim to be objective, and we shouldn’t profess to be. We all have our opinions. Objectivity also is seen as just another word for neutrality, balance, and so-called both-sidesism. It pretends, according to this view, that all assertions deserve equal weight, even when the evidence shows they don’t, and so it fails to deliver the plain truth to the public. Finally, critics argue that objectivity historically excluded the perspectives of those who have long been among the most marginalized in society (and media): women, Black Americans, Latinos, Asian Americans, Indigenous Americans, the LGBTQ community, and others.

Genuine objectivity, however, does not mean any of that. This is what it really means: As journalists, we can never stop obsessing over how to get at the truth—or, to use a less lofty term, “objective reality.” Doing that requires an open mind and a rigorous method. We must be more impressed by what we don’t know than by what we know, or think we know.

[Darrell Hartman: The invention of objectivity]

Journalists routinely expect objectivity from others. Like everyone else, we want objective judges. We want objective juries. We want police officers to be objective when they make arrests and detectives to be objective in assessing evidence. We want prosecutors to evaluate cases objectively, with no prejudice or preexisting agendas. Without objectivity, there can be no equity in law enforcement, as abhorrent abuses have demonstrated all too often. We want doctors to be objective in diagnosing the medical conditions of their patients, uncontaminated by bigotry or baseless hunches. We want medical researchers and regulators to be objective in determining whether new drugs might work and can be safely consumed. We want scientists to be objective in evaluating the impact of chemicals in the soil, air, and water.

Objectivity in all these fields, and others, gets no argument from journalists. We accept it, even insist on it by seeking to expose transgressions. Journalists should insist on it for ourselves as well.

This article was adapted from Martin Baron’s book, Collision of Power: Trump, Bezos, and The Washington Post, which will be published in October 2023. It appears in the November 2023 print edition with the headline “We Are Not at War. We Are at Work.”

The Coming Attack on an Essential Element of Women’s Freedom

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 09 › no-fault-divorce-laws-republicans-repeal › 675371

For the past half century, many women in America have enjoyed an unprecedented degree of freedom and legal protection, not because of Roe v. Wade or antidiscrimination laws but because of something much less celebrated: “no fault” divorce. Beginning in the early 1970s, no-fault divorce enabled millions of people, most of them women, to file for divorce over “irreconcilable differences” or the equivalent without having to prove misconduct by a spouse—such as adultery, domestic violence, bigamy, cruelty, abandonment, or impotence.

But now conservative politicians in states such as Texas and Louisiana, as well as a devoutly Catholic husband who tried to halt his wife’s divorce efforts in Nebraska, are attacking no-fault divorce. One of the more alarming steps taken in that direction came from the Texas Republican Party, whose 2022 platform called on the legislature to “rescind unilateral no-fault divorce laws and support covenant marriage.” Given the Republican Party’s control of the offices of governor, secretary of state, and attorney general, and both chambers of the state legislature, Texas has a chance of actually doing it.

Until 1857, divorce in England—whose ecclesiastical laws formed the basis of divorce laws in most American colonies outside New England—was available only through an act of Parliament. A total of 324 couples managed to secure one; only four of those were initiated by women. Husbands could divorce their wives based solely on adultery, but women had to prove additional aggravating circumstances. Proof of brutality, rape, or desertion was considered insufficient to support a divorce. Not until 1801 did a woman, Jane Addison, finally win a divorce based on adultery alone.

[Helen Lewis: The conservative case for liberalizing divorce]

Divorce in the American colonies was often decided by governors, while colonial courts required the innocent spouse to prove marital fault by the other, making divorce virtually nonexistent. Married women were mostly bound by laws of “coverture,” which, in the words of the English jurist William Blackstone, meant that “by marriage, the husband and wife are one person in the law: that is, the very being or legal existence of the woman is suspended during the marriage, or at least is incorporated and consolidated into that of the husband: under whose wing, protection, and cover, she performs every thing.” As recounted by the historian Catherine Allgor, American women had no right to enter into contracts or independently own property, including their own wages and “the clothes on their backs.” Mothers lacked basic parental rights, too, “so that if a wife divorced or left a husband, she would not see her children again.”

State standards for divorce varied, including the number of times a man could assault his wife before divorce was allowed. (Marital rape was not illegal in all 50 states until 1993.) In 1861, a judge in New York City ruled that “one or two acts of cruel treatment” were not sufficient grounds to grant a woman a divorce, even after her husband beat her unconscious with a piece of wood during a fight over the family dog sleeping in their bed. The judge wrote that “the wife should not seek on slight provocation to dissolve that sacred tie which binds her to her husband for life, for better or worse.” As if the privacy intrusions of a trial were not enough, newspapers routinely publicized divorce cases, often blaming the woman without mentioning her abuse. Norms of “regular marriage” even made their way into national politics when, two months before the Civil War began, President-elect Abraham Lincoln invoked the analogy in a speech accusing the South of wanting a “‘free love’ arrangement” based on “passional attraction” rather than fidelity to the Union.

Against this backdrop, conservative commentators today claim that no-fault-divorce laws destroy the sanctity of marriage and disfavor men. The blogger and Daily Wire host Matt Walsh tweeted this year that no-fault divorce should be abolished. He once tweeted that “no fault divorce grants one person the ability to break the contract without the consent of the other. What kind of contract is that?” The right-wing YouTube personality Steven Crowder has argued that “no-fault divorce … means that in many of these states if a woman cheats on you, she leaves, she takes half. So it’s not no-fault, it’s the fault of the man.” Elsewhere, he claimed, “If you’re a woman that comes from meager means, and you want to get wealthy—you’ve never worked, you didn’t get a degree, you have no skill set, but you’re good-looking—your best path to victory is simply to marry a man, leave him, and take half.”

Republican Senator J. D. Vance of Ohio picked up the argument on the campaign trail last September, stating, “One of the great tricks that I think the sexual revolution pulled on the American populace … is the idea that, like, ‘Well, okay, these marriages were fundamentally, you know, they were maybe even violent, but certainly they were unhappy. And so getting rid of them and making it easier for people to shift spouses like they change their underwear, that’s going to make people happier in the long term.’”

[Olga Khazan: The high cost of divorce]

Except no-fault-divorce laws did make women happier. Prior to California’s Family Law Act of 1969, which was signed into law by then-Governor Ronald Reagan, all states followed a fault-based system in which divorces were granted very sparingly under strict criteria. Women who wanted out of a bad marriage had little choice but to stay, because most were family caregivers who would wind up destitute without a judicial division of assets. The tight legal controls also led to highly adversarial proceedings and regularized lying in order to secure a divorce decree. Estranged couples fled to more liberal states known as “divorce colonies” simply to end a marriage. It was not until 1949 that divorce was legal at all in South Carolina. Many states still retain the option of fault-based grounds for divorce—which can arguably benefit the spouse who files, by avoiding mandatory separation periods and securing a greater share of marital assets—but the last state to abandon mandatory proof of fault was New York, in 2010. Late-stage opponents responsible for New York’s delay in the movement included the Roman Catholic Church and some women’s-rights groups fearful that no-fault divorce would diminish women’s leverage to obtain favorable alimony or child-support awards.

No-fault divorce managed to meaningfully shift the power balance in marriage relationships: Women now had the option of leaving without their husband’s permission. From 1976 to 1985, states that adopted no-fault divorce saw their overall domestic-violence rates plummet by a quarter to one-half, including in relationships that did not end in divorce. The number of women murdered by “intimates” declined by 10 percent. Female suicide rates also fell immediately in states that moved to unilateral divorce, a downward trend that continued for the next decade. Researchers have theorized that many women “derive a life-preserving benefit from divorce,” because under the threat of divorce, “the husband … behaves himself, thereby reducing the incidence of domestic violence and spousal homicide.”

Federal law leaves state legislatures free to roll back women’s ability to initiate divorce without spousal consent or proof of abuse. Although the Supreme Court recognized in 2015’s Obergefell v. Hodges that state laws must yield to federal rights protecting same-sex marriage, nothing in the Constitution or the Court’s precedent clearly prevents states from reversing no-fault divorce.

The writer and attorney Beverly Willett, an opponent of no-fault divorce, has argued that “unilateral no-fault divorce clearly violates the 14th Amendment,” supposedly depriving defendants in divorce cases “of life, liberty, and property without due process of law.” This argument has it exactly backwards. There is no express “right” to marriage in the Constitution. Although troubling vestiges of legal coverture still linger in American law, women these days are not considered legal “property” to which a man’s constitutional due-process rights could conceivably attach.

As for due-process protections for liberty (which the Supreme Court has described as “not confined to mere freedom from bodily restraint,” but instead inclusive of “the full range of conduct which the individual is free to pursue”), that right more compellingly protects the person seeking to end a marriage—and to do so without having to prove to the government that she deserves it.

The Supreme Court Needs to Make a Call on Trump’s Eligibility

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 09 › supreme-court-needs-make-call-trumps-eligibility › 675416

There’s an old saying that sometimes it is more important for the law to be certain than to be right. Certainty allows people to plan their actions knowing what the rules are going to be.

Nowhere is this principle more urgent than when it comes to the question of whether Donald Trump’s efforts to subvert the 2020 election results have disqualified him from becoming president again. As cases raising the question have begun working their way through the courts in Colorado, Minnesota, and elsewhere, the country needs the Supreme Court to fully resolve the issue as soon as possible.

Eminent constitutional-law scholars and judges, both conservative and liberal, have made strong cases that Trump is disqualified from being president again under Section 3 of the Fourteenth Amendment, which bars from office those who have taken an oath to defend the Constitution and then “engaged in insurrection or rebellion against the same, or given aid or comfort to the enemies thereof.” Some of those scholars are professed originalists—as are many of the Supreme Court’s conservative justices—and to make their cases, they have analyzed what they say is the “original public meaning” of this provision. Other conservative and liberal scholars have concluded otherwise about the clause’s meaning, or at least raised serious doubts about whether and how these provisions apply to Trump.

Among the unresolved issues are whether the disqualification provision applies to those who formerly served as president, rather than in some other office; whether Congress must pass legislation authorizing the Department of Justice to pursue a civil lawsuit in order to bar Trump; whether Trump “engaged” in “insurrection” or “rebellion” or at least gave “aid or comfort” to “the enemies thereof.” Unsurprisingly, given that this provision emerged in response to the Civil War in the 1860s, there is virtually no modern case law fully resolving these issues, and many enormous questions remain on which reasonable minds disagree—for example, who would enforce this provision, and how.

Those are the legal questions. The political questions are, in some ways, even more complicated, and at least as contested. If Trump is disqualified on Fourteenth Amendment grounds, some believe that such disqualification fights would become a regular feature of nasty American politics. Others worry that significant social unrest would result if the leading candidate for one of the country’s major political parties were disqualified from running for office, rather than voters having the final say.

[David Frum: The Fourteenth Amendment fantasy]

All of these questions, however, are somewhat beside the point. This is not merely an academic exercise. Trump, right now, is already being challenged as constitutionally disqualified, and these issues are going to have to be resolved, sooner or later. My point is that sooner is much better than later.

A number of legal doctrines could lead courts to kick this issue down the road for some time. Maybe the provision applies not to primaries, but only to candidates in a general election. Maybe voters don’t have standing to sue, because they can’t show a particularized injury. Maybe this is a political question to be decided by the political branches, such as Congress, rather than by the judiciary.

But courts should not dally, because judicial delay could result in disaster. Imagine this scenario: Election officials and courts take different positions on whether Trump’s name can appear on the ballot in 2024. The Supreme Court refuses to get involved, citing one of these doctrines for avoiding assessing the case’s merits. Trump appears to win in the Electoral College while losing the popular vote. Democrats control Congress, and when January 6, 2025, arrives and it is time to certify the vote, Democrats say that Trump is ineligible to hold office, and he cannot serve.

As my co-authors and I argue in our report on how to have a fair and legitimate election in 2024, such a scenario raises the possibility of major postelection unrest. The country would have one political party disqualifying the candidate of the other party from serving—after that candidate has apparently won a fair election.

The Supreme Court is the only institution that can definitively say what the law is in this case, and it should not wait once a case reaches its doorstep. Think of Republican voters and candidates soon to participate in the primary process. They, and everyone else, deserve to know whether the leading candidate is actually eligible to serve in office.

A Supreme Court decision to disqualify Trump from the ballot would obviate the need for Congress to resolve the question on January 6. Trump would not be allowed to run. In contrast, a judicial decision that Trump is not disqualified would make it very difficult politically for Democrats in Congress to try to reject Trump anyway after a 2024 victory.

[J. Michael Luttig and Laurence H. Tribe: The Constitution prohibits Trump from ever being president again]

How the Supreme Court would—or should—resolve the question of Trump’s disqualification on the merits is far from clear. There is no question that Trump tried to subvert the results of the 2020 election, using pressure, lies, and even the prospect of violence to overturn Joe Biden’s victory. Trump so far has faced no accountability for his actions: The Senate did not muster the two-thirds vote in 2021 to convict him after his second impeachment, a step that could have led to his disqualification under Congress’s impeachment-related powers. The federal and Georgia cases against Trump for his alleged election interference may yet go to trial, but whether verdicts will ever be reached is far from certain. In any event, even a guilty verdict would not disqualify Trump. If there is going to be any accountability for Trump’s actions in 2020, it might have to come from this disqualification provision. Reading the Fourteenth Amendment this way would help protect our democracy.

But serious legal questions continue to dog any use of Section 3 of the Fourteenth Amendment. My general view is that, to avoid the criminalization of politics, prosecution of politicians should be reserved for instances in which both the law and the facts are clear; marginal cases are best left to other remedies. Disqualification, of course, is not a criminal procedure, but borrowing this principle from the criminal context counsels caution here too. In close cases, the voters should get to decide at the ballot box.

The pressure to disqualify Trump is only going to grow until there’s a final resolution of the question. When this issue reaches the Supreme Court, the country will need the Court to decisively resolve it—or risk chaos later on.

Where the New Identity Politics Went Wrong

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 09 › woke-ideology-history-origins-flaws › 675454

In universities and newspapers, nonprofit organizations and even corporations, a new set of ideas about race, gender, and sexual orientation has gained huge influence. Attitudes to these ideas—which are commonly called “woke,” though I prefer a more neutral term, the “identity synthesis”—have split into two camps: those who blame them for all of America’s ills and those who defend them, largely uncritically.

Right-wing polemicists deride these ideas as a form of “cultural Marxism,” which has substituted identity categories such as race for the economic category of class but still aims at the same old goal of communist revolution. They invoke wokeness to oppose anything they dislike, such as sex ed and insufficiently patriotic versions of American history.

On the other side, many people in media and politics claim that wokeness is simply a matter of justice and decency: a willingness to acknowledge the cruelties of America’s past and a recognition of the ways they still shape the country. “Being woke,” Joe Walsh, a former Republican congressman who became a vocal critic of Donald Trump, has said, “just means being empathetic.”

[Adam Serwer: ‘Woke capital’ doesn’t exist]

Each position mischaracterizes these ideas, obscuring their true nature. Over recent decades, writers, activists, and scholars have melded a diverse set of ideas inspired by postmodernism, postcolonialism, and critical race theory into a new worldview that animates today’s progressive movements. It now constitutes a genuinely novel ideology, which has radically transformed what it means to be left-wing.

Amid all the contention, this ideology deserves assessment in a more evenhanded manner, one that weighs what is interesting or potentially useful about its tenets against the ways in which it undercuts the very values it claims to advance. And the key to a more sophisticated understanding and critique of these ideas lies in the story of where they came from.

At the beginning, there was Michel Foucault.

In his early years, the French philosopher was shaped by the fashionable “grand narratives” of his time. When he studied with the Hegelian philosopher Jean Hyppolite, Foucault imbibed the idea that history should be understood as the gradual realization of freedom in the world. When, a few years later, he went on to study with the Marxist thinker Louis Althusser, a passionate defender of the Soviet Union, Foucault embraced the idea that liberation would come in the form of the proletariat staging a worldwide revolution. In 1950, Foucault joined the French Communist Party, which was unquestioningly loyal to Joseph Stalin.

Yet Foucault soon chafed at the Marxist orthodoxy demanded by his comrades, leaving the party by 1953. “Over anyone who pretended to be on the left,” he would later complain, the party “laid down the law. One was either for or against; an ally or an adversary.” He became an adversary.

This combination of a commitment to left-wing ideals and a mistrust of grand narratives that justify coercion, including Marxism, constitutes the core of Foucault’s published work. In book after book, he argued against modern societies’ complacent assumption that they had made progress in the way they punish criminals or treat the mentally ill. Doubting claims to objective truth, Foucault believed that societies had become not more humane but merely more effective at controlling their subjects.

This paved the way for Foucault’s most influential argument, about the true nature of power. Power, he argued, is much more indirect than the top-down model traditionally taught in civics classes. Because real power lies in the normative assumptions embedded in the discourses that structure our society and the identity labels we use to make sense of the world, it is “produced from one moment to the next, at every point.”

This belief made Foucault deeply skeptical about the perfectibility of our social world. People would always chafe against the form that power takes at any given moment in history: “Where there is power, there is resistance,” he wrote. But this resistance, if successful, would itself come to exercise a power of its own. Even the most noble struggle, Foucault warned his readers, would contain within itself the seed of new forms of oppression.

[Thomas Chatterton Williams: You can’t define woke]

Foucault left his devotees with a complicated legacy. On the one hand, they recognized that his philosophy allowed them to question the prevailing assumptions and institutions of their age, including claims to objective truth or universal validity. On the other hand, Foucault’s pessimism about the possibility of creating a less oppressive world disappointed them. As Noam Chomsky told me 50 years after a famous encounter with Foucault for a televised debate at a Dutch university, he had “never seen such an amoral—not immoral, amoral—person in my life.”

In the late 1970s and ’80s, a series of postcolonial thinkers, such as Edward Said and Gayatri Chakravorty Spivak, set out to resolve this tension. At once influenced by Foucault and uncomfortable with his fatalistic conclusions, they sought to infuse the prospect of political agency back into his ideas.

Edward Said, a Palestinian American literary theorist who taught at Columbia University, shot to fame by arguing that the way Western writers had imagined the “Orient” helped them wield power over it, causing real-world harm. Explicitly acknowledging his debt to “Michel Foucault’s notion of a discourse,” he claimed that analyzing the discourse of “Orientalism” was crucial to understanding “the enormously systematic discipline by which European culture was able to manage—and even produce—the Orient politically, sociologically, militarily, ideologically, scientifically, and imaginatively.”

This “discipline” of Oriental Studies cloaked itself as a scholarly tradition that claimed to be politically neutral, even objective. In reality, argued Said, “political imperialism governs an entire field of study, imagination, and scholarly institutions.” Historically, Western representations of the East justified colonial rule. Since then, Said argued, a newer set of ideas about the “Arab mind” had helped motivate U.S. interventions in the Middle East. Said’s goal was to free his readers from the pernicious power Orientalist assumptions still held.

This critique prepared the ground for a more politically engaged adaptation of postmodernism. For many of Orientalism’s readers, it seemed clear that the goal of cultural analysis should be to help those who have the least power. They sought to change the dominant discourse to help the oppressed resist the oppressor.

Postcolonial scholars took Said’s work as a model for how to apply discourse analysis to explicitly political ends. A new wave of researchers concerned with such topics as gender, the media, and the experiences of migrants and ethnic minorities quickly embraced their toolkit. In time, the idea that a lot of political activism might revolve around critiquing dominant discourses or labeling certain cultural artifacts as “problematic” went mainstream, finding currency on social-media platforms and in traditional newspapers.

[Franklin Foer: The new Republican battle cry]

Foucault’s legacy left postcolonial scholars with a second obstacle. In rejecting grand narratives, he had not only turned against the idea of universal values or objective truth; he was also arguing that identity labels such as “women,” “proletarians,” and the “masses of the Third World” were reductive. Such generalizations, he claimed, create the illusion that a hugely varied group of people share some essential set of characteristics; this misperception could even help perpetuate injustices. The oppressed, Foucault observed, do not need intellectuals to speak on their behalf.

Spivak, an Indian literary scholar, strongly disagreed. Parisian philosophes, she argued, could take their social standing for granted. But the people with whom she was most concerned had none of their resources and enjoyed no such recognition. In countries such as India, she concluded in her most celebrated article, the “subaltern” cannot speak.

This presented Spivak, who had made her name as an interpreter of postmodernist philosophers, with a dilemma. How could she stay true to her distrust of dominant discourses, including identity categories, while speaking on behalf of the marginalized groups for which she felt a deep kinship? The key to doing better, she argued, was to embrace identity markers that could prove useful in practice even if they might be suspect in theory. “I think we have to choose again strategically,” she suggested, “not universal discourse but essentialist discourse … I must say I am an essentialist from time to time.”

These cryptic remarks took on a life of their own. Faced with the problem of how to speak for the oppressed, scholars from numerous disciplines followed Spivak’s example. They continued in the spirit of postmodernism to cast doubt on claims of scientific objectivity or universal principles. At the same time, they insisted on using broad identity categories and speaking for the downtrodden by embracing what they came to call “strategic essentialism.”

Over time, Spivak’s paradoxical compromise became a political rallying cry. Today, activists who carefully acknowledge that race or gender or ability status “is a social construct” nevertheless go on to make surprisingly essentializing claims about what, say, brown people or women or the disabled believe and demand.

The embrace of strategic essentialism also helps explain the logic behind the rise of new social customs, such as the establishment of racially separate “affinity groups” in many progressive spaces. Spivak came to believe that a commitment to identity categories such as race was strategically useful. Many progressives took this to mean that activists—and even grade-school students—should be encouraged to conceive of themselves first and foremost in racial terms.

Slowly but surely, these ideas gained traction in different parts of academia, including law schools. A new generation of legal scholars set out to question long-held beliefs about the judiciary, such as the idea that judges made decisions based on fine points of legal doctrine rather than on their own worldview or self-interest. But one member of this emerging tradition who proved especially influential argued that it had a crucial blind spot of its own: race.

Derrick Bell was a Black lawyer who spent the 1960s doing heroic work in the fight against desegregation. As an attorney for the NAACP’s Legal Defense and Educational Fund, his mission was to win compliance with the major judicial victories of the civil-rights era, such as Brown v. Board of Education. In total, he helped oversee some 300 cases involving the desegregation of schools and businesses.

At first, Bell found his work exhilarating. But the longer he stayed in the job, the more dispirited he became. His lawsuits took so long to wind their way through the courts that many of the boys and girls he represented were adults by the time the school they’d hoped to attend was integrated.

Even then, progress could prove illusory. As Black schools were dissolved, many good Black teachers lost their jobs. And as white schools were integrated, many parents chose to send their kids to private schools, or moved out of the neighborhood altogether. In the end, some of the newly “integrated” schools were still predominantly Black and still suffered from a lack of resources.

[Adam Serwer: Trumpism is ‘identity politics’ for white people]

These disappointments transformed Bell’s thinking. By the time his first major scholarly article appeared, in 1976, Bell had come to reject basic assumptions that had underpinned his earlier work as a litigator. Expanding on an argument that—as Bell himself acknowledged—had originally been advanced by segregationists, he warned that civil-rights lawyers, caught between their clients’ wishes and their own ideals, were trying to “serve two masters.”

“Having convinced themselves that Brown stands for desegregation and not education,” Bell complained, “the established civil rights organizations steadfastly refuse to recognize reverses in the school desegregation campaign—reverses which, to some extent, have been precipitated by their rigidity.” Civil-rights lawyers needed instead to listen to their Black clients, Bell said. According to him, that meant becoming more open to creating schools that were (to reappropriate the disingenuous segregationist mantra) more truly “separate but equal.”

Bell’s skepticism about the civil-rights movement also made him distrust the idea that the racial attitudes of most Americans were improving. “Racism,” he contended, is not “a holdover from slavery that the nation both wants to cure and is capable of curing”; rather, it is “an integral, permanent, and indestructible component of this society.” The civil-rights movement might have succeeded in making discrimination “less visible,” but, he wrote in the early 1990s, racism had become “neither less real nor less oppressive.”

According to Bell, the legal remedies implemented during the civil-rights era, such as school desegregation, would never suffice to overcome the legacy of slavery. It was high time, he wrote in a 1992 paper, for a “review and replacement of the now defunct racial equality ideology.” To win lasting progress, Bell proposed, would require more than nominal equality; it would take explicit group rights that compensated the marginalized. He and his followers called for policies that openly distinguished among citizens on the basis of skin color, so that those who had historically been oppressed would henceforth receive preferential treatment.

Bell died in 2011. A decade later, his ideas are enjoying a second life as an avowedly anti-racist left is embracing his call for race-sensitive public policy. The determination to put “racial equity” before old-fashioned forms of “racial equality” is evident today in many public policies, such as when, in the early days of the coronavirus pandemic, the Small Business Administration prioritized nonwhite restaurant owners for emergency relief funds.

Much of today’s progressive politics is a popularized version of what I call the “identity synthesis.” To a remarkable extent, the ideas, norms, and practices that have become so prevalent on social media and in corporate diversity trainings owe a debt to these four thinkers in particular. They are rooted in a deep skepticism about objective truth inspired by Foucault, the use of discourse analysis for explicitly political ends taken from Said, an embrace of essentialist categories of identity derived from Spivak, and a preference for public policies that explicitly tie the treatment a person receives to their group identity, as advocated by Bell. (Kimberlé Crenshaw, the Black feminist legal scholar who coined the idea of “intersectionality,” which has since taken on a life of its own, might be considered another key member of this progressive pantheon.)

The mainstream influence of these ideas makes all the more interesting the fact that several of these thinkers came to have misgivings about the uses to which they were put. Foucault, who died in 1984, would, I suspect, have been quick to remind his devotees that the impulse to reshape discourses for political ends can, despite the liberatory aim, readily morph into new forms of repression.

Said, who died in 2003, addressed the problem explicitly. “Identity,” he wrote shortly before his death, is “as boring a subject as one can imagine.” For that reason, he admonished, “marginality and homelessness are not, in my opinion, to be gloried in; they are to be brought to an end, so that more, and not fewer, people can enjoy the benefits of what has for centuries been denied the victims of race, class, or gender.”

[Graeme Wood: What happens when a carnival barker writes intellectual history]

Spivak, too, was forthright about her dismay at how the idea of strategic essentialism had helped forge a new ideology. Praising the “political use of humor” by African Americans, she lamented its absence among today’s “university identity wallahs.”

The identity-synthesis advocates are driven by a noble ambition: to remedy the historic injustices that scar every country, including America. These injustices are real, and they persist. Although social movements and legislative reforms can help address them, the practice of politics, as the sociologist Max Weber famously wrote, is the “strong and slow boring of hard boards.” It rarely provides remedies as quickly or as comprehensively as hoped—leading some to conclude that a more radical break with the status quo is needed.

The appeal of the synthesis stems from promising just that. It claims to lay the conceptual groundwork necessary to remake the world by overcoming the reverence for long-standing principles that supposedly constrain our ability to achieve true equality. Advocates of the identity synthesis reject universal values like free speech as distractions that conceal and perpetuate the marginalization of minority groups. Trying to make progress toward a more just society by redoubling efforts to realize such ideals, its advocates claim, is a fool’s errand.

But these ideas will fail to deliver on their promises. For all their good intentions, they undermine progress toward genuine equality among members of different groups. Despite its allure, the identity synthesis turns out to be a trap.

As the identity synthesis has gained in influence, its flaws have become harder to ignore. A striking number of progressive advocacy groups, for example, have been consumed by internal meltdowns in recent years. “We used to want to make the world a better place,” a leader of one progressive organization complained recently. “Now we just make our organizations more miserable to work at.” As institutions such as the Sierra Club and the ACLU have implemented the norms inspired by the identity synthesis, they have had more difficulty serving their primary missions.

The identity synthesis is also starting to remake public policy in ways that are more likely to create a society of warring tribes. In the early months of the pandemic, for example, a key advisory committee to the CDC recommended that states prioritize essential workers, rather than the elderly, in the rollout of scarce vaccines, in part because “racial and ethnic minorities are underrepresented” among seniors. Not only did this policy, according to the CDC’s own models, have the probable outcome of increasing the overall number of Americans who would perish in the pandemic; it also placed different ethnic groups in competition with one another for lifesaving vaccines.

[Conor Friedersdorf: Intersectionality is not the problem]

When decision makers appear out of touch with the values and priorities of most citizens, demagogues thrive. The well-founded fears roused by the election of Trump accelerated the ascendancy of the identity synthesis in many elite institutions. Conversely, the newfound hold that these ideas now have over such institutions makes it more likely that he might win back the White House in 2024. The identity synthesis and far-right populism may at first glance appear to be polar opposites; in political practice, one is the yin to the other’s yang.

Many attacks on so-called wokeness are motivated by bad faith. They fundamentally misrepresent its nature. But that is no reason to deny how a new ideology has acquired such power in our society. In fact, it’s imperative to recognize that its founders explicitly saw themselves as rejecting widely held values, such as the core tenets of the civil-rights movement.

What lures so many people to the identity synthesis is the desire to overcome persistent injustices and create a society of genuine equals. But the likely outcome of uncritically accepting this ideology is a society that places an unremitting emphasis on our differences. The effect is to pit rigidly defined identity groups against one another in a zero-sum battle for resources and recognition.

Critics of the identity trap commonly claim that progressive activists are “going too far.” But what is at issue is not having too much of a good thing. The real problem is that, even at its best, this ideology violates the ardent aspirations for a better future to which all of us should remain committed.

The Painful Afterlife of a Cruel Policy

The Atlantic

www.theatlantic.com › books › archive › 2023 › 09 › orphan-bachelors-bone-fae-myenne-ng-chinese-exclusion › 675385

In an age of democratized self-expression, you need not be Serena Williams or Prince Harry to write a memoir—or to expect people to want to read about your life. Not all of these first-person works are good, but more of them means that some will be good, even fascinating. Take an ever-swelling corner of the memoir market: those written about the Asian American experience. Identity, in these books, is a constant theme, but refreshingly, it plays out in all sorts of different registers—say, racial politics (Cathy Park Hong’s Minor Feelings) or grief (Michelle Zauner’s Crying in H Mart) or friendship (Hua Hsu’s Pulitzer Prize–winning Stay True). The most compelling of these create space for bigger questions—about the historical legacy of marginalization, or the nature of belonging—through the details of a particular set of lives.

A recent entrant into this arena reassures me that the proliferation of first-person storytelling is yielding outstanding works. Fae Myenne Ng’s Orphan Bachelors, an aching account of the author’s family in San Francisco’s Chinatown at the tail end of the Chinese Exclusion era, is an exemplar of the historical memoir.

Exclusion, which lasted from the late-19th century to World War II, was the United States’ official policy of forbidding immigration and citizenship to Chinese people. The orphan bachelors were the men who, during that period, came to work in America’s goldfields, on its railroads, or in its restaurants and laundries. Most came as “paper sons” who circumvented the law by falsely claiming to be the sons of Chinese American citizens. Trading their identities for fake ones, they toiled alone in America. Some had wives and children in China who could not legally come over, and those who were single suffered from a double exclusion—the law forbade not only immigration but also interracial marriage. These men are known in Cantonese as the lo wah que, the “old sojourners.”

Ng’s father called Exclusion a brilliant crime because it was bloodless: “four generations of the unborn.” Ng and her siblings were part of the first generation that repopulated their neighborhood after the lifting of Exclusion but before the immigration reforms of the 1960s. Beyond telling her family’s story, Ng memorializes an enclave stuck in time, its demographics twisted by cruel constraints. She shows that Exclusion has a reverberating and painful afterlife that dictates the limits of inclusion: One does not simply lead to the other.

Orphan bachelor is not a translation from Chinese, but a phrase that Ng’s father came up with. To her, it signals the tragedy and romance of the sojourners: their labor and loneliness, and also their hope. By the time Ng is coming up, these men are wizened and gray-haired; the generational shift is clear. Still, though the memoir plays out from Ng’s perspective, it is full of color from the old timers’ lives. As young girls, Ng and her sister respectfully address these men, who while away the time in Portsmouth Square, as “grandfather.” When she introduces them to us, she uses names that bespeak their individuality: Gung-fu Bachelor, Newspaper Bachelor, Hakka Bachelor, Scholar Bachelor. In the park, they argue politics and play chess. Some have jobs; others do not. They shuffle off, Ng writes, “their steps a Chinese American song of everlasting sorrow.”

From an early age, Ng seems to have an inclination toward history, and toward storytelling—tendencies that help her observe the bigger-picture currents at the edges of her family’s tale. She spends time with Scholar Bachelor in particular, who lives in an SRO hotel, works in a restaurant, and teaches in the Chinese school where the immigrants’ kids go in the afternoon after “English school.” A sincere, tyrannical teacher who recites Chinese poetry from the Tang dynasty, he encourages Ng, a budding writer, to look “to the old country for inspiration.”

Another orphan bachelor who influences Ng is her father, a merchant seaman and raconteur who can “take one fact and clothe it in lore.” He lived in San Francisco’s Chinatown for almost a decade before he went back to his ancestral village and found a wife, with whom he returned to California, after Exclusion lifted, to start a family. Like many who’ve faced unjust barriers and ongoing precarity, he tells tall tales filled with warlord violence, famine, and adversity. These stories are the currency traded among the orphan bachelors in the park, necessary in order to believe that their present misfortunes are not the worst. It may be bad in America, but not as bad as it was in China.

The impulse to narrate hardship—and, in so doing, lay claim to it—is evident in the relationship between Ng’s parents, who are full of pity, both for themselves and for each other. They have little in common other than their suffering, but even in that, they are competitive. Ng’s dad rails about the racism he has faced in the United States. Her mom retorts that “nothing compared to the brutality of Japan’s imperial army,” which she experienced growing up in pre-Communist China. Seeking relief from all of the fighting, Ng’s father ships out and leaves his wife and children for a month or more at a time. Her mom works as a seamstress, during the day at the sewing factory and at night at home; Ng and her sister go to sleep and wake up to the sound of the sewing machine.

Theirs is not a story of upward mobility or assimilation. Going to sea and sewing, the arguments and resentments—they all continue, even after the parents buy a small grocery store and a house on the outskirts of the city. In the 1960s, Ng’s father signs up for the U.S. government’s Chinese Confession Program, in which paper sons could “confess” their fake identities in exchange for the possibility of legalized status. The program is controversial: A single confession implicates an entire lineage, and there is no guarantee of being granted legal status (indeed, some are deported). Ng’s mom pressures Ng’s dad to confess; she wants to be able to bring her mother, whom she has not seen for decades, to the States. But confessing invalidates his legal status, and his citizenship isn’t restored until many years later.

[Read: Racism has always been a part of the Asian American experience]

Confession ruins the marriage. Still, there are small acts of devotion. When Ng’s mother is diagnosed with cancer, her father travels to Hong Kong and smuggles back an expensive traditional Chinese treatment: a jar of snake’s gallbladders, which he tenderly spoon-feeds her at her bedside. This ongoing tension is one of the memoir’s remarkable qualities. The story it tells is, in one sense, simply about the aches and dramas of a single family. But in another, its scope is more deeply existential. It considers the unjust constraints that can make unhappiness feel like fate, and the role that stubborn fealty can play in helping a family, somehow, stay together.

One of the things Ng’s dad, ever the weaver of yarns, teaches her is that stories always contain secrets; the important thing is to find the truth in them, however hidden it might be. That makes Orphan Bachelors something of an excavation—one that seems to build on a previous effort. Thirty years ago, Ng’s evocative debut novel, Bone, told a version of this story.

That novel was similarly focused on a family in San Francisco’s Chinatown during the Confession era: The mother is a seamstress and the stepfather is a merchant seaman; the marriage is fraught, buffeted by adversity; the first-person protagonist is, like Ng herself, the eldest daughter. In the novel, the middle daughter has jumped to her death from the rooftop of the Chinatown projects. The sister’s death is the plot device that forces a reckoning with the lies that fester in the family’s troubled relationships—and the bigger lies that have structured the lives of the paper sons.

Bone is full of minimalist but distinctive place-setting details—a chicken being plucked “till it was completely bald,” the culottes the mother must sew to meet popular demand in the flower-power ’60s. In Orphan Bachelors, Ng has enriched the environment further by attending to linguistic subtleties. She understands what language can reveal about identity formation—what it creates and enables, what it denies and obscures. Of the subdialect of Cantonese that she hears crisscrossing the neighborhood while growing up, Ng writes, “Our Toishan was a thug’s dialect, the Tong Man’s hatchetspeak. Every curse was a plunging dagger. Kill. Kill. You.” (It’s written in English, and although I can hear the Chinese, non-Chinese speakers will have no trouble getting it.) The second-generation children live in between languages, “obedient, polite, and respectful” in English school, yet like “firecrackers” in Chinese school. “We talked back. We never shut up,” Ng writes. “Our teachers grimaced at our twisty English-laced Chinese. We were Americans and we made trouble.”

In a way, the secret that Ng reveals about this era—across fiction and memoir—is how the trauma of Exclusion is transferred from one generation to the next: the complications of true and fake family histories, the desire of the younger generation to unburden themselves of that difficult inheritance, the impossibility of actually escaping it. In Bone, we see the dissonance between familial duty and selfhood playing out from a young woman’s point of view. Orphan Bachelors captures the longer arc of Ng’s life as a Chinatown daughter, including her parents’ deaths. The struggle to balance devotion to your elders with living your own life, it suggests, does not necessarily end when those elders have passed away.

As a historian who has written three books on aspects of Chinese Exclusion, I have explained how Exclusion separated families and how Confession divided them still further. I hope I have told the story well enough. I am grateful to Ng for lending her voice to this history and crafting a narrative that reckons with this period’s devastating psychic costs. The storyteller’s delusion, as Ng puts it in Orphan Bachelors, is the belief that if you tell the story right, you will be understood. It may be an impossible task, but with this latest endeavor, she is getting closer.

So Much for ‘Learn to Code’

The Atlantic

www.theatlantic.com › technology › archive › 2023 › 09 › computer-science-degree-value-generative-ai-age › 675452

The quickest way to second-guess a decision to major in English is this: have an extended family full of Salvadoran immigrants and pragmatic Midwesterners. The ability to recite Chaucer in the original Middle English was unlikely to land me a job that would pay off my student loans and help me save for retirement, they suggested when I was a college freshman still figuring out my future. I stuck with English, but when my B.A. eventually spat me out into the thick of the Great Recession, I worried that they’d been right.

After all, it is computer-science degrees, not English degrees, that have long been sold to college students as among the safest paths toward 21st-century job security. Coding jobs are plentiful across industries, and the pay is good—even after the tech layoffs of the past year. The average starting salary for someone with a computer-science degree is significantly higher than what a mid-career English graduate earns, according to the Federal Reserve; at Google, an entry-level software engineer reportedly makes $184,000, and that doesn’t include the free meals, massages, and other perks. Perhaps nothing has defined higher education over the past two decades more than the rise of computer science and STEM. Since 2016, enrollment in undergraduate computer-science programs has increased nearly 49 percent. Meanwhile, humanities enrollments across the United States have withered at a clip—in some cases, shrinking entire departments to nonexistence.

But that was before the age of generative AI. ChatGPT and other chatbots can do more than compose full essays in an instant; they can also write lines of code in any number of programming languages. You can’t just type make me a video game into ChatGPT and get something that’s playable on the other end, but many programmers have now developed rudimentary smartphone apps coded by AI. In the ultimate irony, software engineers helped create AI, and now they are the American workers who think it will have the biggest impact on their livelihoods, according to a new survey from Pew Research Center. So much for learning to code.

ChatGPT cannot yet write a better essay than a human author can, nor can it code better than a garden-variety developer, but something has changed even in the 10 months since its introduction. Coders are now using AI as a sort of souped-up Clippy to accelerate the more routine parts of their job, such as debugging lines of code. In one study, software developers with access to GitHub’s Copilot chatbot were able to finish a coding task 56 percent faster than those who did it solo. In 10 years, or maybe five, coding bots may be able to do so much more.
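
To make that concrete, here is a minimal sketch of the kind of exchange such tools automate, assuming the OpenAI Python client; the model name, the buggy function, and the prompt are all invented for illustration, and the developers in the Copilot study worked inside their editors rather than through a standalone script like this one.

    # A minimal sketch of asking a chatbot to debug a snippet of code.
    # Assumes the OpenAI Python client (pip install openai) and an
    # OPENAI_API_KEY environment variable; every detail is illustrative.
    from openai import OpenAI

    client = OpenAI()

    buggy_snippet = '''
    def average(values):
        return sum(values) / len(values)  # crashes on an empty list
    '''

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You are a careful code reviewer."},
            {"role": "user", "content": "Find and fix the bug in this function:\n" + buggy_snippet},
        ],
    )

    # The reply typically names the failure case and proposes a guarded fix.
    print(response.choices[0].message.content)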

People will still get jobs, though they may not be as lucrative, says Matt Welsh, a former Harvard computer-science professor and entrepreneur. He hypothesizes that automation will lower the barrier to entry into the field: More people might get more jobs in software, guiding the machines toward ever-faster production. This development could make highly skilled developers even more essential in the tech ecosystem. But Welsh also says that an expanded talent pool “may change the economics of the situation,” possibly leading to lower pay and diminished job security.

If mid-career developers have to fret about what automation might soon do to their job, students are in the especially tough spot of anticipating the long-term implications before they even start their career. “The question of what it will look like for a student to go through an undergraduate program in computer science, graduate with that degree, and go on into the industry … That is something I do worry about,” Timothy Richards, a computer-science professor at the University of Massachusetts at Amherst, told me. Not only do teachers like Richards have to wrestle with just how worthwhile learning to code is anymore, but even teaching students to code has become a tougher task. ChatGPT and other chatbots can handle some of the basic tasks in any introductory class, such as finding problems with blocks of code. Some students might habitually use ChatGPT to cheat on their assignments, eventually collecting their diploma without having learned how to do the work themselves.

Richards has already started to tweak his approach. He now tells his introductory-programming students to use AI the way a math student would use a calculator, asking that they disclose the exact prompts they fed into the machine, and explain their reasoning. Instead of taking assignments home, Richards’s students now do the bulk of their work in the classroom, under his supervision. “I don’t think we can really teach students in the way that we’ve been teaching them for a long time, at least not in computer science,” he said.

Fiddling with the computer-science curriculum still might not be enough to maintain coding’s spot at the top of the higher-education hierarchy. “Prompt engineering,” which entails crafting the phrases fed to large language models so that they return more useful responses, has already surfaced as a lucrative job option—and one perhaps better suited to English majors than computer-science grads. “Machines can’t be creative; at best, they’re very elaborate derivatives,” says Ben Royce, an AI lecturer at Columbia University. Chatbots don’t know what to do with a novel coding problem. They sputter and choke. They make stuff up. As AI becomes more sophisticated and better able to code, programmers may be tasked with leaning into the parts of their job that draw on conceptual ingenuity as opposed to sheer technical know-how. Those who are able to think more entrepreneurially—the tinkerers and the question-askers—will likely prove the most resistant to automation.
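
For readers who have never seen the job in action, here is a toy sketch of what prompt engineering amounts to, again assuming the OpenAI Python client; the prompts, the placeholder model name, and the imaginary complaint are invented for illustration.

    # A toy comparison of a bare prompt and an "engineered" prompt.
    # Assumes the OpenAI Python client and an OPENAI_API_KEY variable.
    from openai import OpenAI

    client = OpenAI()

    def ask(prompt: str) -> str:
        """Send one prompt to the model and return its reply."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    complaint = "My order never arrived and support stopped answering my emails."

    # The bare request a first-time user might type.
    plain = ask("Summarize this complaint: " + complaint)

    # The engineered version adds a role, an audience, a format, and a length limit.
    engineered = ask(
        "You are a customer-service lead writing for busy warehouse staff. "
        "Summarize this complaint in two bullet points of under 15 words each, "
        "then suggest one next step: " + complaint
    )

    print(plain)
    print(engineered)

The difference between the two results lives entirely in the wording of the second prompt, which is why the work can reward people who are precise with language.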

The potential decline of “learn to code” doesn’t mean that the technologists are doomed to become the authors of their own obsolescence, nor that the English majors were right all along (I wish). Rather, the turmoil presented by AI could signal that exactly what students decide to major in is less important than an ability to think conceptually about the various problems that technology could help us solve. The next great Silicon Valley juggernaut might be seeded by a humanities grad with no coding expertise or a computer-science grad with lots of it. After all, computer science has always been about more than just learning the ropes of Python and C++. Identifying patterns and piecing them together is its essence.

In that way, the answer to the question of what happens next in higher education may lie in what the machines can’t do. Royce pointed me toward Moravec’s paradox, the observation that AI shines at high-level reasoning and the kinds of skills that are generally considered to reflect cognitive aptitude (think: playing chess), but fumbles with basic ones, such as the perception and physical coordination that people manage without effort. The curiosity-driven instincts that have always been at the root of how humans create things are not just sticking around in an AI world; they are now more important than ever. Thankfully, students have plenty of ways to cultivate them.