
Woke Is Just Another Word for Liberal

The Atlantic

www.theatlantic.com/ideas/archive/2023/03/bethany-mandel-woke-interview-definition/673454

The conservative writer Bethany Mandel, a co-author of a new book attacking “wokeness” as “a new version of leftism that is aimed at your child,” recently froze up on a cable news program when asked by an interviewer how she defines woke, the term her book is about.

On the one hand, any of us with a public-facing job could have a similar moment of disassociation on live television. On the other hand, the moment and the debate it sparked revealed something important. Much of the utility of woke as a political epithet is tied to its ambiguity; it often allows its users to condemn something without making the grounds of their objection uncomfortably explicit.

A few years ago, I wrote, “Woke is a nebulous term stolen from Black American English, repurposed by conservatives as an epithet to express opposition to forms of egalitarianism they find ridiculous or distasteful.” This is what people mean when they refer to “woke banks” or “woke capital,” when they complain that the new Lord of the Rings series or the new Little Mermaid is “woke” because it includes Black actors, or when they argue for a “great unwokening” that would roll back civil-rights laws. Part of the utility of the term is that it can displace the criticism onto white liberals who are insincere about their egalitarianism, rather than appearing to be an attack on egalitarianism itself. In fact, woke has become so popular as a political epithet that providing an exhaustive list of definitions would be difficult. It is a slippery enough term that you can use it to sound like you are criticizing behavior most people think is silly, even if you are really referring to things most people think of as good or necessary.

[Adam Serwer: ‘Woke capital’ doesn’t exist]

This is not the only way that the term is employed—although it is almost always used as a pejorative now, whereas originally it could be sincere or ironic. Some commentators have used it as a shorthand for toxic dynamics in left-wing discourse and advocacy. “Wokeness refers to the invocation of unintuitive and morally burdensome political norms and ideas in a manner which suggests they are self-evident,” Sam Adler-Bell wrote in New York magazine. “At other times, it means we express fealty to a novel or unintuitive norm, while suggesting that anyone who doesn’t already agree with it is a bad person.”

Adler-Bell is describing a real phenomenon in left-of-center communities, but right-wing opposition to woke discourse is less about the mode of expression than its content. Suffice it to say, though, that no ideology is so pure or benign that it renders its adherents incapable of being cruel, selfish, or self-aggrandizing—especially in a social-media panopticon where everyone is seeking to raise or protect their own status, often at the expense of others.

Mandel herself later offered this definition of woke on Twitter: “A radical belief system suggesting that our institutions are built around discrimination, and claiming that all disparity is a result of that discrimination. It seeks a radical redefinition of society in which equality of group result is the endpoint, enforced by an angry mob.” The right-wing pundit Ben Shapiro offered a similar description.

I like Mandel’s definition because it makes the concept seem so reasonable that it requires a few modifiers and a straw man about mob enforcement to evoke the proper amount of dread in the reader. If you describe the ethnic cleansing of Muslims in Eastern Europe in the 1990s, you don’t need to add that it was “radical” to get most people to understand that it was bad. But the claim that “American institutions are built around discrimination” is just a straightforward account of history. And if few of the people who are caricatured as woke would argue that all disparities result from discrimination, most of them would agree that many key disparities along the axes of class, race, and gender do. But either the history, policy, and structure of the American economy matter or they don’t.

To claim the reverse, that people who are rich or white or male are just better than everyone else—to object to “equality of group result” as a goal, as if it’s absurd to believe that people from across the boundaries of the biological fiction of race could be equal—reveals a prejudice so overt that it practically affirms the “woke” side of the argument. The “radical redefinition of society” that many of the so-called woke seek is simply that society live up to its stated commitments. And one really could, I suppose, describe that as radical—the abolition of slavery, the ratification of women’s suffrage, and the end of Jim Crow were all once genuinely radical positions whose adoption redefined American society.

[David A. Graham: Wokeness has replaced socialism as the great conservative bogeyman]

Those transitions were only possible because, as Mandel’s definition inadvertently concedes, the ideology she opposes is grounded in fact. The United States could not have been created without displacing the people who were already living here. Its Constitution preserved slavery, which remained an engine of the national economy well into the 19th century. Among the first pieces of federal legislation was a bill limiting naturalization to free white people. Yet not even all white men could vote at the nation’s founding—property requirements shut out many until around 1840—and universal white male suffrage (sometimes including noncitizens!) was paired with the explicit disenfranchisement of Black men, even in some northern states. The nation was nearly rent in two because the slave economy and the social hierarchy it created were precious enough, even to men who did not own slaves, that they took up arms to defend the institution of human bondage with their lives. After the Civil War, the former Confederates reimposed white supremacy and subjected the emancipated to an apartheid regime in which they had few real rights, a regime my mother was born into and my grandparents fled. For most of the history of the United States, Black people could not vote and women could not vote; American immigration policy in the early 20th century was based on eugenics and an explicit desire to keep out those deemed nonwhite; the mid-century American prosperity unleashed by the New Deal that conservatives recall with such nostalgia was stratified by race.

I could go on, but I think you get the point. These things are real; they happened. To believe that the disadvantages of race, class, and gender imposed lawfully over centuries never occurred or entirely disappeared in just a few decades is genuinely “radical” in a negative way; to believe that creating those disadvantages was wrong and that they should be rectified is not. The idea that no one ever succeeds based on advantages unrelated to their personal abilities is likewise radical, and also ludicrous. But you can, perhaps, understand why one of the richest men in the world would consider the opposing idea—that where many people end up in life is the result of unearned advantages—to be a “woke mind virus” that should be eradicated. That kind of thinking leads to higher marginal tax rates for people with private planes.

Some people so deeply resent the implication that they possess any unearned advantage that, in Republican-run states all over the country, the same folks who were recently shrieking about free speech and oversensitive snowflakes are busy using the power of the state to ban discussions about factual matters that might hurt their feelings, such as descriptions of racial segregation in the story of Rosa Parks. The irony here is that by framing everything they don’t like as a symptom of pervasive oppression against white people or Christians that must be rectified by the state, they have themselves adopted the inverse of the logic they decry as “wokeness.” They believe that America’s demographic majorities are the targets of broad institutional discrimination, which is unjust not because such discrimination is morally abhorrent but because it is targeted at the wrong people.  

Then there is the irony that the most zealous among the so-called woke and anti-woke form different denominations of the same religion, following high priests of racial salvation preaching parallel dogmas, one of which says that you need only read certain books or say certain words to attain salvation, and the other of which grants absolution to parishioners for their reflexive contempt for those they despise. Only one of them, however, has become the established church in certain states, deploying the power of the state to enforce its dogma.

You need not adopt either faith. Accepting the reality of American history and the persistence of discrimination does not mean that every egalitarian proposal is correct, nor that every egalitarian argument should be heeded. It does not necessarily mean that we should ban the SAT in college admissions or never refer to “women” when discussing abortion rights. Calling something racist or sexist doesn’t make it so. Conversely, something that appears to be race-neutral can be implemented in a discriminatory fashion, or even adopted with that intention. But if you do accept the reality of our past, then you probably think we should try to level the playing field in some way. The merits of specific arguments or proposals are separate from that underlying principle. Whatever woke might mean, however, it is clear that the militantly “anti-woke” object to the egalitarian idea itself, which they find worthy of contempt.

To say that traditional hierarchies are just and good, well, that’s simply conservatism. It has been since the 18th century. And to say that those hierarchies do not reflect justice and that people should be equal under the law—all the people, not only propertied white men—well, that’s more or less just liberalism. But if you don’t like it, you’d probably call it woke.

The One Cause of Poverty That’s Never Considered

The Atlantic

www.theatlantic.com/books/archive/2023/03/poverty-by-america-book-matthew-desmond/673453

In the United States, a staggeringly wealthy country, one in nine people—and one in eight children—is officially poor. Those figures have fluctuated only slightly over half a century, during which scholars and journalists have exhaustively debated the reasons for the lack of progress. Training their attention on the lives of the dispossessed, researchers have identified barriers that keep people at the bottom of the social ladder from climbing its rungs, and offered arguments that usually play out along ideological lines. According to conservatives, the most significant obstacles are behavioral: family breakdown and debilitating habits such as dependency and idleness, exacerbated, they believe, by the receipt of government handouts. According to liberals, the real problems are structural: forces such as racism and deindustrialization, which, they contend, have entrenched inequality and prevented disadvantaged groups from sharing in the nation’s prosperity.

But what if the focus on the disadvantaged is misplaced? What if the persistence of poverty has less to do with the misfortunes of the needy than with the advantages the affluent presume they are entitled to? In Poverty, by America, Matthew Desmond, a sociologist at Princeton, argues that we need to examine the behavior and priorities not of the poor but of “those of us living lives of privilege and plenty.”

Desmond’s decision to spotlight the privileged may surprise readers familiar with his previous work, in particular his widely acclaimed book Evicted, which told the stories of eight impoverished families struggling to find and keep affordable apartments in Milwaukee. The prevailing assumption among scholars had been that most poor people in America’s urban areas lived in public housing. In fact, only 15 percent of low-income renters in the United States fell into this category, Desmond found. The rest had to navigate the private rental market, where many ended up spending more than half their income on dilapidated units with busted appliances and roach infestations. The indignity of living in such conditions was compounded by the fear of falling behind on the rent and getting evicted, traumatic expulsions that happened with shocking regularity in places like Milwaukee’s impoverished North Side. A similar convulsive pattern of dislocations played out in low-income neighborhoods all over the country.

Evicted revealed a fundamental fact about poverty in America that had gone largely unrecognized but caused indelible harm: Routine ejections dislodged children from schools, destabilized blocks, and upended the lives of vulnerable people stripped again and again of their possessions and their dignity. Desmond captured the human toll up close, spending extensive time with his subjects—among them, a Navy veteran named Lamar with no legs and two sons to care for, circumstances that did not prevent his landlord from handing him an eviction notice one day. “I love Lamar,” the landlord told Desmond, “but love don’t pay the bills.” As unsparing as his portrait of the callousness that fueled poverty was, Desmond studiously avoided portraying his characters as hapless victims or brushing over the flawed decisions they sometimes made (Lamar lost his legs after getting high on crack and jumping out a window, we learn).

Readers expecting more of the rich narrative texture of that endeavor will be disappointed. Unlike Evicted, which was grounded in years of fieldwork, Desmond’s new book contains little in the way of original ethnographic research and, though it has its share of startling statistics, lacks a vivid cast of characters. Working in a very different register, Desmond instead offers a passionate and provocative argument fueled by his dismay about the extent of poverty in America—and by his dissatisfaction with conventional explanations for how little has changed.

[Read: Poverty is violent]

The safety net in America is more threadbare than in most wealthy countries. But growing government stinginess over the years doesn’t explain the high number of destitute people, he argues, citing a surprising figure: Government spending on the 13 largest means-tested programs—targeted at people under a certain income level—has actually risen 237 percent from 1980 to 2018. Neither does the emphasis on the breakdown of the family and the rise in single-parent households, a theme of the 1965 “Moynihan Report” that has had a long afterlife. It’s true that families in the United States headed by single mothers are far more likely to be poor than married families, Desmond acknowledges. But the disparity vanishes in countries such as Italy and Sweden, thanks to social programs (affordable child care, paid family leave) that aren’t available here. In this respect, the lack of government spending clearly has played a role.

Desmond is equally dismissive of theories that focus on impersonal structural problems as the main impediments. It’s not that he discounts the significance of such factors—in Evicted, he described Milwaukee as “the epicenter of deindustrialization,” and he takes pains to document how racism and poverty intersect. But he is impatient with the implication that poverty arises passively, because of large forces beyond our control. To the contrary, poverty is a social reality that Americans create and sustain—and from which many of the nonpoor benefit materially in more direct ways than they care to acknowledge. Desmond’s mission is to disabuse the better-off among us of the illusion that they are mere bystanders with their hands tied. Many wealthy people assume that the built-in advantages that come with affluence are their due, and take for granted their freedom to choose among many life options. What they fail to recognize is that their choices contribute to foreclosing options for people of lesser means, whose lives are already far more constrained.

To readers of Evicted, which is subtitled “Poverty and Profit in the American City,” the idea that sustained poverty doesn’t just happen but is abetted by the conduct of more powerful social actors will not sound new. Interspersed with scenes of movers dumping children’s toys on sidewalks as their dejected mothers looked on were descriptions of landlords convening at meetings of the Milwaukee Real Estate Investors Networking Group, where the keynote speaker described the “very nice cash flow” generated by renting rooming houses to poor single men. Poverty did not materialize because of blind forces, such gatherings suggested. It resulted from greed and from exploitation, a word that Desmond complained “has been scrubbed out of the poverty debate.”

Desmond now aims to rectify this omission by demonstrating how exploitation in various forms is the root of the problem. Housing, unsurprisingly, figures in the foreground of his analysis. When renting apartments, the poor are routinely forced to pay more, because landlords know that, given the dearth of public housing, they have few good alternatives. But poor renters also suffer at the hands of a less familiar cast of exploiters: educated and affluent people who impinge on their choices and benefit from their powerlessness more subtly but no less consequentially.

Municipal zoning ordinances are a case in point: enacted through referenda pushed by citizens’ groups and homeowners’ associations, they prohibit the construction of multifamily apartment complexes in upscale neighborhoods. These benign-sounding rules foster segregation, effectively preventing the poor (whose presence, it’s assumed, could drive down property values) from moving in—which, in turn, consigns them to neighborhoods of concentrated disadvantage. As Desmond notes, such policies are one of the few issues that Americans in red and blue states seem to agree on. A study he cites found that liberals and conservatives were about equally unlikely to support building a 120-unit apartment complex for housing-voucher recipients in their community.

In Desmond’s taxonomy of how the privileged “make the poor in America poor,” the strategy of living in walled-off communities—and more broadly, the proclivity to invest in private amenities at the expense of public housing, education, and transportation—has more than an economic impact. The physical separation also “poisons our minds and souls,” enabling affluent people to forget about the poor and obscuring two other tiers of economic exploitation that Desmond calls attention to.

Surveying a host of other perks and benefits to which the well-off consider themselves entitled, he emphasizes that such life amenities are available only because poor people suffer: When the wealthy patronize shops and restaurants that offer low prices and fast service, their satisfaction comes at the expense of cashiers and dishwashers paid poverty wages. When we open free checking accounts that require maintaining a minimum balance, we benefit from the fact that banks can collect billions of dollars in overdraft fees from poor customers who struggle to meet these requirements—and who often end up gouged by check-cashing outlets and payday lenders.

“Anyone who has ever struggled with poverty knows how extremely expensive it is to be poor,” James Baldwin once observed, an irony that still prevails in America. Finally, Desmond flags the provision of federal welfare—the tax breaks and subsidies (among them the mortgage-interest deduction, a very large housing perk) reserved for the haves, which far outstrip the programs and benefits that assist the have-nots. The average household in the top 20 percent income bracket receives $35,363 in annual tax breaks and other government benefits—40 percent more than the average household in the bottom 20 percent. Federal spending, Desmond reminds readers, is a zero-sum game—a game the wealthy and powerful invariably dominate.

Desmond’s emphasis on the link between wealth and poverty, and on class exploitation, may lead some readers to brand him a Marxist. His thinking about poverty indeed owes a debt to an influential 19th-century writer, but the writer in question isn’t Marx. It’s Leo Tolstoy, who, after publishing Anna Karenina, moved to Moscow, where the poverty he witnessed shocked him. As Desmond recounts, the experience prompted the writer, who lived in a house full of servants, to contend with his complicity and shame. “I sit on a man’s back, choking him and making him carry me, and yet assure myself and others that I am very sorry for him and wish to ease his lot by all possible means—except by getting off his back,” Tolstoy mused. “If I want to aid the poor, that is, to help the poor not to be poor, I ought not to make them poor.”

Desmond has written his book in the hope that exposing our complicity will catalyze a similar crisis of conscience. Shifting into exhortation mode, he invokes Tolstoy’s realization as a model for the “poverty abolitionists” that America needs and that he hopes to mobilize into a movement. As he presses this case, Desmond sometimes sounds less like a dispassionate social scientist than a missionary, convinced that the privileged can be moved to act if only they open their eyes and acknowledge how implicated they are in the suffering of the poor. Yet Desmond’s own map of the entrenched ways we keep the disadvantaged down can’t help but raise doubts about how many converts he will persuade to join such a crusade.  

[Read: The enemy of poor Americans]

Tolstoy’s epiphany was inspired by proximity: He had to see poverty up close to viscerally experience shame and responsibility. In Evicted, as in his new book, Desmond emphasizes that poverty is a relationship in which we are all enmeshed. In doing so, he’s pushing back against a long-standing emphasis on the isolation and invisibility of the poor, a theme in some of the best-known literature on poverty—from Jacob Riis’s How the Other Half Lives (1890) to Michael Harrington’s The Other America (1962) and beyond. But even as he insists that “our lives are interlaced with the lives of the poor,” Desmond himself describes a “bifurcated” country divided by ever more impenetrable barriers that keep the poor in their place: out of expensive neighborhoods, in low-wage jobs, at schools and day-care centers that children from wealthier families don’t attend. A society in which the poor and nonpoor so rarely brush shoulders is, as he’s shown, a society designed to allow the affluent to not see how they benefit from others’ hardships. (In cities like New York and Los Angeles, where homeless people are more visible, it’s also notable that many of the proposed solutions are to get the poor out of sight rather than to direct resources toward them.)

Desmond is convinced that the time is ripe for a manifesto like his. The growing awareness of inequality may indeed have shifted popular attitudes, making Americans less inclined to view the poor as lazy and undeserving than was the case during the Reagan years or the 1990s, when Bill Clinton announced that the “era of big government is over” and enacted welfare reform. But there are also reasons to wonder if the current moment is an auspicious time for a large-scale push to end poverty. The War on Poverty was launched in an age of prosperity, when many believed the economy was strong enough to lift all boats. In times of insecurity like ours, when workers in the hollowed-out middle fear slipping downward, the zero-sum game gets only more intense.

What is “maddening,” Desmond writes, is “how utterly easy it is to find enough money to defeat poverty by closing nonsensical tax loopholes,” or by doing 20 or 30 smaller things to curtail just some of the subsidies of affluence. Yet his book makes it all too clear why the loopholes don’t get closed. The real reason the well-off sustain the status quo isn’t that they believe the poor are shiftless. It’s because meaningful change would require giving up their own advantages—or, to put it bluntly, because “we like it,” as Desmond writes. This is, he notes, the “rudest explanation” for our current state of affairs. Getting affluent people to engage in rhetorical hand-wringing over inequality is easy enough. Persuading them to yield some of their entitlements is a lot harder.

I Supported the Invasion of Iraq

The Atlantic

www.theatlantic.com/newsletters/archive/2023/03/i-supported-the-invasion-of-iraq/673452

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Twenty years after the United States led a coalition to overthrow Saddam Hussein, the conventional wisdom is now that the postwar fiasco proved that the war was a mistake from its inception. The war, as it was executed, was indeed a disaster, but there was ample cause for launching it.

First, here are four new stories from The Atlantic:

Zelensky has an answer for DeSantis.
This is not great news for Donald Trump.
Cool people accidentally saved America’s feet.
Did Vermeer’s daughter paint some of his best-known works?

Just War

I supported the invasion of Iraq in 2003. I have changed my mind about some things but not everything, and I hope you’ll bear with me in a somewhat longer edition of the Daily today for a personal exploration of the issue.

In retrospect, almost no American war except the great crusade against the Axis seems to have been necessary, especially for the people who have had to go and fight such conflicts. How could we have asked our military men and women to endure death and mutilation and horror in 1991 so that a bunch of rich Kuwaitis could return to their mansions, or in 2003 so that we could finally settle scores with a regional dictator? Yesterday, The Bulwark ran a searing, must-read reminiscence of the Iraq War written by a U.S. veteran that reminds us how high-flown ideas such as “national interest” or “international order” play little role on the actual battlefield.

And yet, there are just wars: conflicts that require the use of armed force on behalf of an ally or for the greater good of the international community. I was an advocate for deposing Saddam by the mid-1990s on such grounds. Here is what I wrote in the journal Ethics & International Affairs on the eve of the invasion in March 2003:

The record provides ample evidence of the justice of a war against Saddam Hussein’s regime. Iraq has shown itself to be a serial aggressor led by a dictator willing to run imprudent risks, including an attack on the civilians of a noncombatant nation during the Persian Gulf War; a supreme enemy of human rights that has already used weapons of mass destruction against civilians; a consistent violator of both UN resolutions and the terms of the 1991 cease-fire treaty, to say nothing of the laws of armed conflict and the Geneva Conventions before and since the Persian Gulf War; a terrorist entity that has attempted to reach beyond its own borders to support and engage in illegal activities that have included the attempted assassination of a former U.S. president; and most important, a state that has relentlessly sought nuclear arms against all international demands that it cease such efforts.

Any one of these would be sufficient cause to remove Saddam and his regime (and wars have started over less), but taken together they are a brief for what can only be considered a just war.

Today, there is not a word of this I would take back as an indictment of Saddam Hussein or as justification for the use of force. But although I believed that the war could be justified on these multiple grounds, the George W. Bush administration chose a morally far weaker argument for a preventive war, ostensibly to counter a gathering threat of weapons of mass destruction. (Preemptive war, by the way, is a war to avert an imminent attack, and is generally permissible in international law and custom. Preventive war is going to war on your own timetable to snuff out a possible future threat, a practice long rejected by the international community as immoral and illegal. The Israeli move at the opening of the Six-Day War, in 1967, was preemptive; the Japanese attack on Pearl Harbor, in 1941, was preventive.)

Of course, the Iraqi dictator was doing his damnedest to convince the world that he had weapons of mass destruction, because he was terrified of admitting to his worst foe, Iran, that he no longer had them. (He sure convinced me.) But this was no evidence of an imminent threat requiring instant action, and the WMD charge was the shakiest of limbs in a tree full of much stronger branches.

Bush used the WMD rationale as just one in a kitchen sink of issues, likely because his advisers thought it was the case that would most resonate with the public after the September 11 terror attacks. For years, most Western governments saw terrorism, rogue states, and WMD as three separate problems, to be handled by different means. After 9/11, these three issues threaded together into one giant problem—a rogue state supporting terrorists who seek to do mass damage—and the tolerance for risk that protected the Iraqi tyrant for so many years evaporated.

In 2003, I was far too confident in the ability of my own government to run a war of regime change, which managed to turn a quick operational victory into one of the greatest geopolitical disasters in American history. Knowing what I now know, I would not have advocated for setting the wheels of war in motion. And although Bush bears the ultimate responsibility for this war, I could not have imagined how much Secretary of Defense Donald Rumsfeld’s obsession with “transformation,” the idea that the U.S. military could do more with fewer troops and lighter forces, would undermine our ability to conduct a war against Iraq. As Eliot Cohen later said, “The thing I know now that I did not know then is just how incredibly incompetent we would be, which is the most sobering part of all this.”

My own unease about the war began when America’s de facto military governor, Paul Bremer, disbanded the Iraqi military and embarked on “de-Baathification,” taking as his historical analogy the “denazification” of Germany after World War II. This was bad history and bad policy, and it created a massive unemployment problem among people skilled in violence while punishing civilians whose only real association with Baathism was the party card required for them to get a good job.

And yet, for a few years more, I stayed the course. I believed that Iraqis, like anyone else, wanted to be free. They might not be Jeffersonian democrats, but they hated Saddam, and now they had a chance at something better. Like many of our leaders, I was still amazed at the collapse of the Soviet Union, appalled at Western inaction in places like Rwanda, and convinced (as I still am) that U.S. foreign policy should be premised on a kind of Spider-Man doctrine: With great power comes great responsibility.

Unfortunately, in my case, this turned into supporting what the late Charles Krauthammer in 1999 called “a blanket anti-son of a bitch policy,” which he described as “soothing, satisfying and empty. It is not a policy at all but righteous self-delusion.” Krauthammer was right, and people like me were too willing to argue for taking out bad guys merely because they were bad guys. But that word blanket was doing a lot of lifting in Krauthammer’s formulation; perhaps we cannot go after all of them, but some sons of bitches should be high on the list. For me, Saddam was one of them.

The question now is whether even Saddam Hussein was worth the cost. Twenty years ago, I would have said yes. Today, I would say no—but I must add the caveat that no one knew then, nor can anyone know now, how much more dangerous a world we might have faced with Saddam and his psychopathic sons still in power. (Is the world better off because we left Bashar al-Assad in power and allowed him to turn Syria into an abattoir?) Yes, some rulers are too dangerous to remove; Vladimir Putin, hiding in the Kremlin behind a wall of nuclear weapons, comes to mind. Some, however, are too dangerous to allow to remain in power, and in 2003, I included Saddam in that group.

In 2007, Vanity Fair interviewed a group of the war’s most well-known supporters. Even the ur-hawk Richard Perle (nicknamed in Washington the “Prince of Darkness” when he worked for Ronald Reagan) admitted that, if he had it to do over again, he might have argued for some path other than war. But the comment that sticks with me to this day, and the one that best represents my thinking, came from Ambassador Kenneth Adelman. In 2002, Adelman famously declared that the war would be “a cakewalk,” but five years later, he said:

The policy can be absolutely right, and noble, beneficial, but if you can’t execute it, it’s useless, just useless. I guess that’s what I would have said: that Bush’s arguments are absolutely right, but you know what? You just have to put them in the drawer marked CAN’T DO. And that’s very different from LET’S GO.

Twenty years later, that’s where I remain. The cause was just, but there are times when doing what’s right and just is not possible. For almost 15 years after the fall of the Soviet Union and the first Allied victory over Iraq, the United States had the chance to deepen the importance of international institutions. We squandered that opportunity because of poor leadership, Pentagon fads (the “Office of Force Transformation” was disbanded in 2006, shortly before Bush finally removed Rumsfeld), and amateurish historical analogies.

Still, there’s too much revisionist history about the Iraq War. You’ll see arguments that experts supported it. (Most academics and many civilians in D.C. did not.) You’ll hear that it was a right-wing crusade backed only by a Republican minority. (Also wrong.) Had the war been executed differently, we might be having a different conversation today.

The fact remains that the United States is a great power protecting an international system it helped to create, and there will be times when military action is necessary. Fortunately, most Americans still seem to grasp this important reality.

Would I argue for another such operation today? If the question means “another massive preventive war far from home,” no. I have consistently opposed war with Iran and any direct U.S. involvement in Ukraine. I wrote a book in 2008 warning that we should strengthen the United Nations and other institutions to stop the growing acceptance around the world of preventive war as a normal tool of statecraft.

I also, however, supported the NATO operation in Libya, and I have called for using American airpower to blunt Assad’s mass murders in Syria. Iraq was a terrible mistake, but it would be another mistake to draw the single-minded conclusion (much as we did after Vietnam) that everything everywhere will forever be another Iraq. The world is too dangerous, and American leadership too necessary, for us to fall into such a facile and paralyzing trap.

Related:

David Frum: the Iraq War reconsidered
The enduring lessons of the “axis of evil” speech

Today’s News

French President Emmanuel Macron’s government survived a no-confidence motion by nine votes, the result of widespread backlash to a bill that would raise the retirement age in France from 62 to 64.
President Joe Biden issued the first veto of his presidency, on a resolution to overturn a retirement-investment rule allowing managers of retirement funds to consider environmental and social factors when choosing investments.
Chinese leader Xi Jinping visited the Kremlin, where he and Russian President Vladimir Putin greeted each other as “dear friend.” Washington denounced the visit.

Dispatches

Up for Debate: Conor Friedersdorf rounds up more reader replies on the freedom and frustration of cars.

Explore all of our newsletters here.

Evening Read

Illustration by Daniel Zender / The Atlantic; Getty

Please Get Me Out of Dead-Dog TikTok

By Caroline Mimbs Nyce

A brown dog, muzzle gone gray—surely from a life well lived—tries to climb three steps but falters. Her legs give out, and she twists and falls. A Rottweiler limps around a kitchen. A golden retriever pants in a vet’s office, then he’s placed on a table, wrapped in medical tubes. “Bye, buddy,” a voice says off camera. Nearby, a hand picks up a syringe.

This is Dead-Dog TikTok. It is an algorithmic loop of pet death: of sick and senior dogs living their last day on Earth, of final hours spent clinging to one another in the veterinarian’s office, of the brutal grief that follows in the aftermath. One related trend invites owners to share the moment they knew it was time—time unspecified, but clear: Share the moment you decided to euthanize your dog.

Read the full article.

More From The Atlantic

Photos: the beauty of Earth from orbit
The obscure maritime law that ruins your commute

Culture Break

Matt Chase / The Atlantic; source: Getty

Read. These eight books will take you somewhere new.

Watch. Abbott Elementary, on ABC (and available to stream on Hulu).

Our writer Jerusalem Demsas endorsed the show this weekend: “I’m someone who can usually only watch TV while doing at least one or two other things at the same time, and this show grabs my full attention.”

Play our daily crossword.

P.S.

No recommendations today, other than to thank our veterans for shouldering the burden of a war that we asked them to fight.

— Tom