At its best, a presidential biopic can delve into the monomaniacal focus—and potential narcissism—that might drive a person to run for the White House in the first place. That’s what Oliver Stone did in 1995’s Nixon, dramatizing the 37th president’s downfall with the exhilarating paranoia of the director’s best work. Though guilty of some fact-fudging, Stone retained empathy for Richard Nixon’s childhood trauma and lifelong inferiority complex, delivering a Shakespearean tragedy filtered through a grim vision of American power. As Nixon (played by a hunched, scotch-guzzling Anthony Hopkins) stalks the halls of a White House engulfed by scandal, and stews with jealousy at the late John F. Kennedy, the presidency never seemed so lonely.
A presidential biopic can also zoom in on a crucial juncture in a leader’s life: Steven Spielberg’s Lincoln explored its protagonist’s fraught final months, during which he pushed, at great political risk, for a constitutional amendment to abolish slavery. Spielberg’s film was captivating because it didn’t just re-create Lincoln’s famous speeches, but also imagined what the man was like behind the scenes—in backroom dealings, or in contentious confrontations with his wife, Mary Todd. Like its 1939 predecessor, Young Mr. Lincoln, the film wisely limits its scope; focusing on a pivotal period proves a defter approach than trying to capture the full sprawl of a president’s life, a task better left to hefty biographies.
And then there’s a movie like this year’s Reagan, the Ronald Reagan biopic starring Dennis Quaid. Reagan is a boyhood-to-grave survey of the 40th president’s life and administration, with a chest-beating emphasis on his handling of the Cold War that blurs the line between biopic and Hollywood boosterism. Filmed with all the visual panache of an arthritis-medication commercial, the movie is suffocating in its unflagging reverence for its titular hero. In its portrayal of Reagan’s formative years, secondary characters seem to exist primarily to give mawkish pep talks or to fill the young Reagan’s brain with somber warnings about the evils of communism. “God has a purpose for your life, something only you can do,” his mother tells him after he reads scripture at church. Later, in college, he is disturbed by a speech from a Soviet defector, who visits a local congregation and lectures wide-eyed students that they will not find a “church like this” in the U.S.S.R.
Unlike Lincoln, the film seems incapable of imagining what its protagonist was like in private moments or ascribing any interior complexity to him. Even his flirty exchanges with his wife, Nancy, feel like they were cribbed from a campaign ad. “I just want to do something good in this world,” he tells his future spouse on a horseback-riding date. “Make a difference.” The portrayal isn’t helped by the fact that the 70-year-old Quaid is digitally de-aged and delivers his lines in a tinny imitation of the politician’s voice. A bizarre narrative device further detaches the audience from Reagan’s perspective: The entire movie is narrated by Jon Voight doing a Russian accent, as a fictionalized KGB agent who surveilled Reagan for decades and is now regaling a young charge with stories of how one American president outsmarted the Soviet Union.
They say history is written by the winners. But sometimes the winners like to put on a bad accent and cosplay as the losers. Yet despite heavily negative reviews, Reagan remained in theaters for nearly two months and earned a solid $30 million at the box office, playing to an underserved audience and tapping into some of the cultural backlash that powered Donald Trump’s reelection. The film’s success portends a strange new era for the presidential biopic, one in which hokey hagiography might supplant any semblance of character depth—reinforcing what audiences already want to hear about politicians they already admire.
In retrospect, Lincoln, with its innate faith in the power of government to do good, was as much a product of the “Obamacore” era—that surge of positivity and optimism that flooded pop culture beginning in the early 2010s—as Lin-Manuel Miranda’s Broadway smash Hamilton. But the arrival of the Trump era threw cold water on those feel-good vibes, and since Lincoln, presidential biopics have largely failed to connect with crowds. Two lightweight depictions of Barack Obama’s young adulthood arrived in 2016, but neither reckoned with his complicated presidency. In 2017, Rob Reiner delivered the ambivalent and uneven LBJ, which sank at the box office and made little impression on audiences. Meanwhile, Martin Scorsese developed and seemingly abandoned a Teddy Roosevelt biopic.
In development for more than a decade, Reagan emerges from a more plainly partisan perspective. Its producer, Mark Joseph, once called The Reagans, the 2003 TV movie starring James Brolin, “insulting” to the former president. Though Reagan director Sean McNamara expressed hope that his film would unite people across political lines, its source material, The Crusader, is a book by Paul Kengor, a conservative who has written eight books about Reagan and who presently works at a right-wing think tank. And its star, Dennis Quaid, is among Hollywood’s most prominent Trump supporters. In July, Quaid appeared on Fox News live from the Republican National Convention, proclaiming that Reagan would help Americans born after 1985 “get a glimpse of what this country was.”
The notable presidential biopics of the past were prestige pictures that at least tried to appeal to a wide swath of the moviegoing public, across the political spectrum. Even 2008’s W., Stone’s spiritual sequel to Nixon—inferior by far, and disappointingly conventional in its biographical beats—is hardly the liberal excoriation many viewers might have expected from the director; it was even criticized for going too easy on George W. Bush. Released during the waning months of his presidency, when Bush-bashing was low-hanging fruit for audiences, the film portrays the 43rd president as a lovable screwup with crippling daddy issues. As Timothy Noah argued in Slate at the time, “W. is the rare Oliver Stone film that had to tone down the historical record because the truth was too lurid.”
Instead, new entries like Reagan and Ali Abbasi’s The Apprentice, the more nuanced film, reflect the market demands of a more fragmented moviegoing public—and reality. Rarely do two movies about the same era of American history have so little audience overlap. Set from 1973 to 1986, The Apprentice portrays Trump (Sebastian Stan) as a young sociopath-in-training, dramatizing his rise to business mogul and his relationship with mentor Roy Cohn (Jeremy Strong), a Svengali of capitalist chicanery molding a monster in his own image. In the most shocking scenes, the film depicts Trump brutally raping his wife, Ivana, and undergoing liposuction surgery. (Ivana accused Trump of rape in a 1990 divorce deposition, then recanted the allegation decades later. Trump’s campaign has called the movie a “malicious defamation.”) The film, in other words, gives confirmation—and a sleazily gripping origin story—to those who already believe Trump is a malevolent con man and irredeemable misogynist. It knows what its viewers want.
[Read: How the GOP went from Reagan to Trump]
So, seemingly, does Reagan, which shows its protagonist primarily as the Great Communicator who tore down that wall. But as the Reagan biographer Max Boot recently wrote, “the end of the Cold War and the fall of the Soviet Union were primarily the work of Soviet leader Mikhail Gorbachev—two consequences of his radically reformist policies … Reagan did not bring about Gorbachev’s reforms, much less force the collapse of the Soviet Union.” Reagan resists such nuance, hewing instead to a predictable hero’s narrative. Soviet leaders are swathed in visual clichés: grotesque men sipping vodka in cigar-filled rooms.
Meanwhile, the film renders Reagan’s domestic critics without sophistication or dignity. As Matthew Dallek chronicles in his book The Right Moment, Reagan spent much of his 1966 campaign to become California’s governor sensationalizing and condemning marches protesting the Vietnam War at UC Berkeley, and later called for a “bloodbath” against the campus left. In the film, we see Reagan, as the state’s governor, calling in the National Guard to crack down on Berkeley protesters, but we never learn what these students are protesting; Vietnam is scarcely referenced. (A nastier incident, in which Reagan-sent cops in riot gear opened fire on student protesters and killed one, goes unmentioned.)
A less slanted film might have interrogated the conflict between Reagan’s anti-totalitarian Cold War rhetoric and his crackdown on demonstrators at home. It might also have reckoned with the president’s devastating failure to confront the AIDS epidemic, a fact the movie only fleetingly references, via a few shots of ACT UP demonstrators slotted into a generic montage of Reagan critics set to Genesis’s “Land of Confusion.” But Reagan remains tethered to the great-man theory of history, in which Reagan single-handedly ended the Cold War, preserved America’s standing in the world, and beat back lefty Communist sympathizers. A match-cut transition, from a shot of newly retired Reagan swinging an axe at his ranch to young “wallpeckers” taking axes to the Berlin Wall in 1989, literalizes the message for grade-school viewers: The Gipper brought down the wall himself. It’s not that the movie is too kind to Reagan—but by flattening him in this way, it robs him of the conflicts and contradictions that made him a figure worth thinking about today.
In this way, too, Reagan forms a curious contrast to Nixon. A central message of Stone’s film is that even if Nixon had wanted to end the Vietnam War, he was powerless to act against the desires of the deep state (or “the beast,” as Hopkins’s Nixon calls it). In a defining scene, a young anti-war demonstrator confronts the president. “You can’t stop it, can you?” she realizes. “Because it’s not you. It’s the system. The system won’t let you stop it.” Nixon is stunned into stammering disbelief.
Indeed, the films in Stone’s presidential trilogy (JFK, Nixon, and W.) all reflect some paranoia about the dark forces of state power. (The unabashedly conspiratorial JFK suggests that Kennedy was eliminated by the CIA and/or the military-industrial complex because he didn’t fall in line with their covert objectives.) They are stories of ambitious leaders whose presidencies were hijacked or truncated by forces beyond their comprehension—movies whose villains are shadowy figures operating within the bowels of the U.S. government. It’s not just Stone’s view of state power that makes his films more interesting; it’s that he takes into account forces larger than one man, regardless of that man’s own accomplishments.
Reagan’s vision of the institution is more facile. Its hero is endowed with near-mythical power to end wars and solve domestic woes; its villains are as clearly labeled as a map of the Kremlin. The film’s simplistic pandering vaporizes complexity and undercuts the cinematic aims of a presidential biopic. It’s a profitable film because it instead adheres to the market incentives of modern cable news: Tell viewers what they want to hear, and give them a clear and present enemy.
In his 2011 book, The Reactionary Mind, the political theorist Corey Robin argues that the end of the Cold War proved unkind to the conservative movement by depriving it of a distinct enemy. For today’s GOP, a good adversary is hard to find—in the past few years, its leaders have cast about haphazardly in search of one: trans people, Haitian immigrants, childless women. (And, as always, Hillary Clinton.) In Reagan, though, the world is much simpler: There’s an evil empire 5,000 miles away, and a California cowboy is the only man who can beat it. It’s a flat narrative fit for one of his old B movies.
https://www.theatlantic.com/ideas/archive/2024/11/musa-al-gharbi-wokeness-elite/680347
In his 2023 Netflix comedy special, Selective Outrage, Chris Rock identified one of the core contradictions of the social-justice era: “Everybody’s full of shit,” Rock said, including in the category of “everybody” people who type “woke” tweets “on a phone made by child slaves.”
I was reminded of that acerbic routine while reading Musa al-Gharbi’s new book, We Have Never Been Woke. Al-Gharbi, a 41-year-old sociologist at Stony Brook University, opens with the political disillusionment he experienced when he moved from Arizona to New York. He was immediately struck by the “racialized caste system” that everyone in the big liberal city seems to take “as natural”: “You have disposable servants who will clean your house, watch your kids, walk your dogs, deliver prepared meals to you.” At the push of a button, people—mostly hugely underpaid immigrants and people of color—will do your shopping and drive you wherever you want to go.
He contrasts that with the “podunk” working-class environment he’d left behind, where “the person buying a pair of shoes and the person selling them are likely to be the same race—white—and the socioeconomic gaps between the buyer and the seller are likely to be much smaller.” He continues: “Even the most sexist or bigoted rich white person in many other contexts wouldn’t be able to exploit women and minorities at the level the typical liberal professional in a city like Seattle, San Francisco, or Chicago does in their day-to-day lives. The infrastructure simply isn’t there.” The Americans who take the most advantage of exploited workers, he argues, are the same Democratic-voting professionals in progressive bastions who most “conspicuously lament inequality.”
[Read: The blindness of elites]
Al-Gharbi sees the reelection of Donald Trump as a reflection of Americans’ resentment toward elites and the “rapid shift in discourse and norms around ‘identity’ issues” that he refers to as the “Great Awokening.” To understand what’s happening to American politics, he told me, we shouldn’t look to the particulars of the election—“say, the attributes of Harris, how she ran her campaign, inflation worries, and so on”—but rather to this broader backlash. All of the signs were there for elites to see if only they’d bothered to look.
One question We Have Never Been Woke sets out to answer is why elites are so very blind, including to their own hypocrisy. The answer al-Gharbi proposes is at once devastatingly simple yet reaffirmed everywhere one turns: Fooled by superficial markers of their own identity differences—racial, sexual, and otherwise—elites fail to see themselves for what they truly are.
“When people say things about elites, they usually focus their attention on cisgender heterosexual white men” who are “able-bodied and neurotypical,” al-Gharbi told me, in one of our conversations this fall. Most elites are white, of course, but far from all. And elites today, he added, also “increasingly identify as something like disabled or neurodivergent, LGBTQ.” If you “exclude all of those people from analysis, then you’re just left with this really tiny and misleading picture of who the elites are, who benefits from the social order, how they benefit.”
Sociologists who have studied nonwhite elites in the past have tended to analyze them mainly in the contexts of the marginalized groups from which they came. E. Franklin Frazier’s 1955 classic, Black Bourgeoisie, for example, spotlighted the hypocrisy and alienation of relatively prosperous Black Americans who found themselves doubly estranged: from the white upper classes they emulated as well as from the Black communities they’d left behind. By analyzing nonwhites and other minorities as elites among their peers, al-Gharbi is doing something different. “Elites from other groups are often passed over in silence or are explicitly exempted from critique (and even celebrated!),” he writes. And yet, “behaviors, lifestyles, and relationships that are exploitative, condescending, or exclusionary do not somehow become morally noble or neutral when performed by members of historically marginalized or disadvantaged groups.”
When al-Gharbi uses the word elite, he is talking about the group to which he belongs: the “symbolic capitalists”—broadly speaking, the various winners of the knowledge economy who do not work with their hands and who produce and manipulate “data, rhetoric, social perceptions and relations, organizational structures and operations, art and entertainment, traditions and innovations.” These are the people who set the country’s norms through their dominance of the “symbolic economy,” which consists of media, academic, cultural, technological, legal, nonprofit, consulting, and financial institutions.
Although symbolic capitalists are not exactly the same as capitalist capitalists, or the rest of the upper class that does not rely on wage income, neither are they—as graduate students at Columbia and Yale can be so eager to suggest—“the genuinely marginalized and disadvantaged.” The theorist Richard Florida has written about a group he calls the “creative class,” which represents 30 percent of the total U.S. workforce, and which overlaps significantly with al-Gharbi’s symbolic capitalists. Using survey data from 2017, Florida calculated that members of that creative class earned twice as much over the course of the year as members of the working class—an average of $82,333 versus $41,776.
Symbolic capitalists aren’t a monolith, but it is no secret that their ruling ideology is the constellation of views and attitudes that have come to be known as “wokeness,” which al-Gharbi defines as beliefs about social justice that “inform how mainstream symbolic capitalists understand and pursue their interests—creating highly novel forms of competition and legitimation.”
Al-Gharbi’s own path is emblematic of the randomness and possibility of membership in this class. The son of military families on both sides, one Black and one white, he attended community college for six years, “taking classes off and on while working,” he told me. There he was lucky to meet a talented professor, who “basically took me under his wing and helped me do something different,” al-Gharbi said. Together, they focused on private lessons in Latin, philosophy, and classics—subjects not always emphasized in community college.
Around that time he was also going on what he calls “this whole religious journey”: “I initially tried to be a Catholic priest, and then I became an atheist for a while, but I had this problem. I rationally convinced myself that religion was bullshit and there is no God, but I couldn’t make myself feel it.” Then he read the Quran and “became convinced that it was a prophetic work. And so I was like, Well, if I believe that Muhammad is a prophet and I believe in God, that’s the two big things. So maybe I am a Muslim.” Soon after, he changed his name. Then, just when he was getting ready to transfer out of community college, his twin brother, Christian, was killed on deployment in Afghanistan. He chose to go somewhere close to his grieving family, the University of Arizona, to finish his degree in Near-Eastern studies and philosophy.
The same dispassionate analysis that he applies to his own life’s progress he brings to bear on America’s trends, especially the Great Awokening. He traces that widespread and sudden movement in attitudes not to the death of Trayvon Martin or Michael Brown, nor to Black Lives Matter or the #MeToo movement, nor to the election of Donald Trump, but to September 2011 and the Occupy Wall Street movement that emerged from the ashes of the financial crisis.
“In reality, Occupy was not class oriented,” he argues. By focusing its critique on the top 1 percent of households, which were overwhelmingly white, and ignoring the immense privilege of the more diverse symbolic capitalists just beneath them, the movement, “if anything, helped obscure important class differences and the actual causes of social stratification.” This paved the way for “elites who hail from historically underrepresented populations … to exempt themselves from responsibility for social problems and try to deflect blame onto others.”
[Read: The 9.9 percent is the new American aristocracy]
Al-Gharbi is neither an adherent of wokeism nor an anti-woke scold. He would like both to stem the progressive excesses of the summer of 2020, a moment when white liberals “tended to perceive much more racism against minorities than most minorities, themselves, reported experiencing,” and to see substantive social justice achieved for everyone, whether or not they hail from a historically disadvantaged identity group. The first step, he argues, is to dispel the notion that the Great Awokening was “some kind of unprecedented new thing.”
Awokenings, in al-Gharbi’s telling, are struggles for power and status in which symbolic capitalists, often instinctively and even subconsciously, leverage social-justice discourse not on behalf of the marginalized but in service of their own labor security, political influence, and social prestige. He does not see this as inherently nefarious—indeed, like Tocqueville and many others before him, he recognizes that motivated self-interest can be the most powerful engine for the common good. Al-Gharbi argues that our current Awokening, which peaked in 2021 and is now winding down, is really the fourth such movement in the history of the United States.
The first coincided with the Great Depression, when suddenly “many who had taken for granted a position among the elite, who had felt more or less entitled to a secure, respected, and well-paying professional job, found themselves facing deeply uncertain futures.”
The next would take place in the 1960s, once the radicals of the ’30s were firmly ensconced within the bourgeoisie. “The driver was not the Vietnam War itself,” al-Gharbi stresses. That had been going on for years without protest. Nor was the impetus the civil-rights movement, gay liberation, women’s liberation, or any such cause. “Instead, middle-class students became radical precisely when their plans to leave the fighting to minorities and the poor by enrolling in college and waiting things out began to fall through,” he argues. “It was at that point that college students suddenly embraced anti-war activism, the Black Power movement, feminism, postcolonial struggles, gay rights, and environmentalism in immense numbers,” appropriating those causes for their own gain.
If this sounds familiar, it should. The third Awokening was smaller and shorter than the others, stretching from the late ’80s to the early ’90s, and repurposing and popularizing the Marxist term political correctness. Its main legacy was to set the stage for the fourth—and present—Awokening, which has been fueled by what the scholar Peter Turchin has termed “elite overproduction”: Quite simply, America creates too many highly educated, highly aspirational young people, and not enough high-status, well-paid jobs for them to do. The result, al-Gharbi writes, is that “frustrated symbolic capitalists and elite aspirants [seek] to indict the system that failed them—and also the elites that did manage to flourish—by attempting to align themselves with the genuinely marginalized and disadvantaged.” It is one of the better and more concise descriptions of the so-called cancel culture that has defined and bedeviled the past decade of American institutional life. (As Hannah Arendt observed in The Origins of Totalitarianism, political purges often serve as jobs programs.)
The book is a necessary corrective to the hackneyed discourse around wealth and privilege that has obtained since 2008. At the same time, al-Gharbi’s focus on symbolic capitalists leaves many levers of power unexamined. Whenever I’m in the company of capitalist capitalists, I’m reminded of the stark limitations of the symbolic variety. Think of how easily Elon Musk purchased and then destroyed that vanity fair of knowledge workers formerly known as Twitter. While some self-important clusters of them decamped to Threads or Bluesky to post their complaints, Musk helped Trump win the election. His PAC donated $200 million to the campaign, while Musk served as Trump’s hype man at rallies and on X. Trump has since announced that Musk will be part of the administration itself, co-leading the ominously named Department of Government Efficiency.
Al-Gharbi’s four Great Awokenings framework can sometimes feel too neat. In a review of We Have Never Been Woke in The Wall Street Journal, Jonathan Marks points out a small error in the book. Al-Gharbi relies on research by Richard Freeman to prove that a bust in the labor market for college graduates ignited the second Awokening. But al-Gharbi gets the date wrong: “Freeman’s comparison isn’t between 1958 and 1974. It’s between 1968 and 1974”—too late, Marks argued, to explain what al-Gharbi wants it to explain. (When I asked al-Gharbi about this, he acknowledged the mistake on the date but insisted the point still held: “The thing that precipitated the massive unrest in the 1960s was the changing of draft laws in 1965,” he said. “A subsequent financial crisis made it tough for elites to get jobs, ramping things up further.” He argued it was all the same crisis: an expanding elite “growing concerned that the lives and livelihoods they’d taken for granted are threatened and may, in fact, be out of reach.”)
Despite such quibbles, al-Gharbi’s framework remains a powerful one. By contrasting these periods, al-Gharbi stressed to me, we can not only understand what is happening now but also get a sense of the shape of wokenesses to come. “The way the conversation often unfolds is just basically saying wokeness is puritanism or religion,” he explained. “They think Puritanism sucks, or religion sucks.” But just saying that “wokeness is bad” is not “super useful.”
Indeed, one of the primary reasons such anti-woke reactions feel so unsatisfactory is that wokeness, not always but consistently, stems from the basic recognition of large-scale problems that really do exist. Occupy Wall Street addressed the staggering rise of inequality in 21st-century American life; Black Lives Matter emerged in response to a spate of reprehensible police and vigilante killings that rightfully shocked the nation’s conscience; #MeToo articulated an ambient sexism that degraded women’s professional lives and made us consider subtler forms of exploitation and abuse. The self-dealing, overreach, and folly that each of these movements begat do not negate the injustices they emerged to address. On the contrary, they make it that much more urgent to deal effectively with these ills.
[Musa al-Gharbi: Police punish the ‘good apples’]
Any critique of progressive illiberalism that positions it as unprecedented or monocausal—downstream of the Civil Rights Act, as some conservatives like to argue—is bound not only to misdiagnose the problem but to produce ineffective or actively counterproductive solutions to it as well. Wokeness is, for al-Gharbi, simply the way in which a specific substratum of elites “engage in power struggles and struggles for status,” he said. “Repealing the Civil Rights Act or dismantling DEI or rolling back Title IX and all of that will not really eliminate wokeness.”
Neither will insisting that its adherents must necessarily operate from a place of bad faith. In fact, al-Gharbi believes it is the very sincerity of their belief in social justice that keeps symbolic capitalists from understanding their own behavior, and the counterproductive social role they often play. “It’s absolutely possible for someone to sincerely believe something,” al-Gharbi stressed, “but also use it in this instrumental way.”
Having been born into one minority group and converted to another as an adult, al-Gharbi has himself accrued academic pedigree and risen to prominence, in no small part, by critiquing his contemporaries who flourished during the last Great Awokening. He is attempting to outflank them, too, aligning himself even more fully with the have-nots. Yet his work is permeated by a refreshing awareness of these facts. “A core argument of this book is that wokeness has become a key source of cultural capital among contemporary elites—especially among symbolic capitalists,” he concedes. “I am, myself, a symbolic capitalist.”
The educated knowledge workers who populate the Democratic Party need more of this kind of clarity and introspection. Consider recent reports that the Harris campaign declined to appear on Joe Rogan’s podcast in part out of concerns that it would upset progressive staffers, who fussed over language and minuscule infractions while the country lurched toward authoritarianism.
The title of al-Gharbi’s book is drawn from Bruno Latour’s We Have Never Been Modern, which famously argued for a “symmetrical anthropology” that would allow researchers to turn the lens of inquiry upon themselves, subjecting modern man to the same level of analytical rigor that his “primitive” and premodern counterparts received. What is crucial, al-Gharbi insists, “is not what’s in people’s hearts and minds.” Rather, the question must always be: “How is society arranged?” To understand the inequality that plagues us—and then to actually do something about it—we are going to have to factor in ourselves, our allies, and our preferred narratives too. Until that day, as the saying about communism goes, real wokeness has never even been tried.
https://www.theatlantic.com/science/archive/2024/11/trump-cop-china-climate/680611
In what will probably be the warmest year in recorded history, in a month in which all but two U.S. states are in a drought, and on a day when yet another hurricane was forming in the Caribbean, Donald Trump, a climate denier with a thirst for oil drilling, won the American presidency for a second time. And today, delegates from around the world will begin this year’s global UN climate talks, in Baku, Azerbaijan. This UN Conference of the Parties (COP) is meant to decide how much money wealthy, high-emitting nations should channel toward the poorer countries that didn’t cause the warming in the first place, but the Americans—representing the country that currently has the second-highest emissions and is by far the highest historical emitter—can now make no promises that anyone should believe will be kept.
“We know perfectly well [Trump] won’t give another penny to climate finance, and that will neutralize whatever is agreed,” Joanna Depledge, a fellow at the University of Cambridge and an expert on international climate negotiations, told me. Without roughly a trillion dollars a year in assistance, developing nations’ green transitions will not happen fast enough to prevent catastrophic global warming. But wealthy donor countries are more likely to contribute if others do, and if the U.S. isn’t paying in, other large emitters have cover to weaken their own climate-finance commitments.
In an ironic twist for a president-elect who likes to villainize China, Trump may be handing that nation a golden opportunity. China has, historically, worked to block ambitious climate deals, but whoever manages to sort out the question of global climate finance will be lauded as a hero. With the U.S. stepping out of a climate-leadership role, China has the chance—and a few good reasons—to step in and assume it.
The spotlight in Baku will now be on China as the world’s biggest emitter, whether the country likes it or not, Li Shuo, a director at the Asia Society Policy Institute, said in a press call. The Biden administration did manage to nudge China to be more ambitious in some of its climate goals, leading, for example, to a pledge to reduce methane emissions. But the Trump administration will likely shelve ongoing U.S.-China climate conversations and remove, for a second time, the U.S. from the Paris Agreement, which requires participants to commit to specific emissions-reduction goals. Last time around, Trump’s withdrawal made China look good by comparison, without the country necessarily needing to change course or account for its obvious problem areas, like its expanding coal industry. The same will likely happen again, Alex Wang, a law professor at UCLA and an expert on U.S.-China relations, told me.
China is, after all, the leading producer and installer of green energy, but green energy alone is not enough to avoid perilous levels of warming. China likes to emphasize that it’s categorized as a developing country at these gatherings, and it has fought deals that would require it to limit emissions or fork over cash and, by extension, limit its growth. But with the U.S. poised to do nothing constructive, China’s position on climate looks rosy in comparison.
[Read: A tiny petrostate is running the world’s climate talks]
By cutting off its contributions to international climate finance, the U.S. will also give China more room to expand its influence through “green soft power.” China has spent the past five years or so focused on the construction of green infrastructure in Africa, Latin America, and Southeast Asia, Wang said. Tong Zhao, a senior fellow at the Carnegie Endowment for International Peace, told Reuters that China expects to be able to “expand its influence in emerging power vacuums” under a second Trump term. Under Biden, the U.S. was attempting to compete in the green-soft-power arena by setting up programs to help clean-energy transitions in Indonesia or Vietnam, Wang noted. “But now I suspect that those federal efforts will be eliminated.”
[Read: Why Xi wants Trump to win]
Most experts now view the global turn toward solar and other clean energy as self-propelled and inevitable. When Trump first entered office, solar panels and electric vehicles were not hot topics. “Eight years later, it is absolutely clear that China dominates in those areas,” Wang said. China used the first Trump administration to become the biggest clean-tech supplier in the world, by far. The Biden administration tried to catch up in climate tech, primarily through the Inflation Reduction Act, but even now, Li told me, Chinese leaders do not see the U.S. as a clean-tech competitor. “They have not seen the first U.S.-made EV or solar panel installed in Indonesia, right?” he said. “And of course, the U.S. lagging behind might be exacerbated by the Trump administration,” which has promised to repeal the IRA, leaving potentially $80 billion of would-be clean-tech business for other countries—but most prominently China—to scoop up. In all international climate arenas, the U.S. is poised to mostly hurt itself.
[Read: How Trump's America will lose the climate race]
More practically, Baku could give China a chance to negotiate favorable trade deals with the EU, which has just started to impose new carbon-based border tariffs. But none of this guarantees that China will decide to take a decisive role in negotiating a strong climate-finance deal. Climate finance is what could keep the world from tipping into darker and wholly avoidable climate scenarios. But news of Trump’s election is likely to lend COP the air of a collective hangover. EU countries will surely assume a strong leadership posture in the talks, but they don’t have the fiscal or political might to fill the hole the U.S. will leave behind. Without surprise commitments from China and other countries that have historically cooperated only grudgingly, COP could simply fail to deliver a finance deal, or, more likely, turn out a miserably weak one.
The global climate community has been here before, though. The U.S. has a pattern of obstructing the climate negotiations. In 1992, the Rio Treaty was made entirely voluntary at the insistence of President George H. W. Bush. In 1997, the Clinton-Gore administration had no strategy to get the Kyoto Protocol ratified in the Senate; the U.S. has still never ratified it.
But although President George W. Bush’s administration declared Kyoto dead, it in fact laid the groundwork for the Paris Agreement. The Paris Agreement survived the first Trump term and will survive another, Tina Stege, the climate envoy for the Marshall Islands, told me. The last time Trump was elected, the EU, China, and Canada put out a joint negotiating platform to carry on climate discussions without the United States. That largely came to nothing, but the coalition will now have a second chance. And overemphasizing U.S. politics, Stege said, ignores that countries like hers are pressing on with diplomatic agreements that will determine their territories' survival.
Nor is the U.S. defined only by its federal government. Subnationally, a number of organizations cropped up in the U.S. during Trump’s first administration to mobilize governors, mayors, and CEOs to step in on climate diplomacy. These include the U.S. Climate Alliance (a bipartisan coalition of 24 governors) and America Is All In: a coalition of 5,000 mayors, college presidents, health-care executives, and faith leaders, co-chaired by Washington State Governor Jay Inslee and former EPA Administrator Gina McCarthy, among other climate heavy hitters. This time, they won’t be starting from scratch in convincing the rest of the world that at least parts of the U.S. are still committed to fighting climate change.
https://www.theatlantic.com/international/archive/2024/11/us-world-power-over-election/680595
Americans voted for change in this week’s presidential election, and in foreign policy, they’ll certainly get it. Donald Trump has shown disdain for the priorities and precedents that have traditionally guided Washington’s approach to the world. He speaks more fondly of America’s autocratic adversaries than of its democratic allies. He derides “globalism” as a liberal conspiracy against the American people. And he treats international agreements as little more than wastepaper.
At stake is not only the survival of Ukraine and the fate of Gaza, but the entire international system that forms the foundation of American global power. That system is built upon American military might, but more than that, it is rooted in relationships and ideals—nations with shared values coming together under U.S. leadership to deter authoritarian aggression and uphold democracy. The resulting world order may be badly flawed and prone to error, but it has also generally preserved global stability since the end of World War II.
Despite its endurance, this system is fragile. It is sustained by an American promise to hold firm to its commitments and ensure collective defense. Trump threatens that promise. His plan to impose high tariffs on all imports could disrupt the liberal economic order on which many American factories and farmers (and Trump’s billionaire buddies) rely. His apparent willingness to sacrifice Ukraine to Russian President Vladimir Putin in some misguided pursuit of peace will strain the Atlantic alliance and undermine security in Europe. By signaling that he won’t defend Taiwan from a Chinese invasion, he could undercut confidence in the United States throughout Asia and make a regional war more likely.
[From the December 2024 issue: My hope for Palestine]
The American global order could end. This would not be a matter of “American decline.” The U.S. economy will likely remain the world’s largest and most important for the foreseeable future. But if Washington breaks its promises, or even if its allies and enemies believe it has or will—or if it fails to uphold democracy and rule of law at home—the pillars of the American international system will collapse, and the United States will suffer an immeasurable loss of global influence and prestige.
The risk that this will happen has been gathering for some time. George W. Bush’s unilateralist War on Terror strained the international system. So did Trump’s disputes with NATO and other close allies during his first term. But world leaders could write off Washington’s wavering as temporary deviations from what has been a relatively consistent approach to foreign policy over decades. They understand the changeability of American politics. In four years, there will be another election, and a new administration may restore Washington’s usual priorities.
With Trump’s reelection, however, the aberration has become the new normal. The American people have told the world that they no longer wish to support an American-led world order. They have chosen U.S. policy makers who promise to focus on the home front instead of on the troubles of ungrateful allies. Maybe they’ve concluded that the United States has expended too many lives and too much money on fruitless foreign adventures, such as those in Vietnam and Afghanistan. And maybe now America will reassess its priorities in light of new threats, most of all China, and the potential burden of meeting them.
The problem is that if the United States won’t lead the world, some other country will, and a number are already applying for the job. One is Putin’s Russia. Another is the China of Xi Jinping.
China began to assert its global leadership more aggressively during Trump’s first term and has worked ever harder to undermine the American system since—strengthening China’s ties with Russia and other authoritarian states, building a coalition to counterbalance the West, and promoting illiberal principles for a reformed world order. Trump seems to believe that he can keep China in check with his personal charm alone. When asked in a recent interview whether he would intervene militarily if Xi blockaded Taiwan, he responded, “I wouldn’t have to, because he respects me.”
That’s narcissism, not deterrence. More likely, Putin and Xi will take advantage of Trump’s indifference. Once appeased in Ukraine, Putin may very well rebuild his army with the help of China, North Korea, and Iran, and then move on to his next victim—say, Georgia or Poland. Xi could be emboldened to invade Taiwan, or at least spark a crisis over the island to extract concessions from a U.S. president who has already suggested that he won’t fight.
[Read: The case for treating Trump like a normal president]
The result will be not merely a multipolar world. That’s inevitable, whatever Washington does. It will be a global order in which autocrats prey on smaller states that can no longer count on the support of the world’s superpower, regional rivalries erupt into conflict, economic nationalism subverts global trade, and new nuclear threats emerge. This world will not be safe for American democracy or prosperity.
The fate of the world order and U.S. global power may seem of little consequence to Americans struggling to pay their bills. But a world hostile to U.S. interests will constrain American companies, roil international energy markets, and endanger jobs and economic growth. Americans could confront bigger wars that require greater sacrifices (as in 1941).
Perhaps Trump will surprise everyone by pondering his legacy and choosing not to pursue the course he has signaled. But that seems unlikely. His messaging on his foreign-policy priorities has been too consistent for too long. Over the next four years, Americans will have to decide whether they still want the United States to be a great power, and if so, what kind of great power they wish it to be. Americans wanted change. The world may pay the price.
https://www.theatlantic.com/ideas/archive/2024/11/democracy-acemoglu-nobel-prize/680522
Last month, the 2024 Nobel Prize in economics was awarded to three scholars, Daron Acemoglu, Simon Johnson, and James Robinson, for “studies of how institutions are formed and affect prosperity.” Don’t let the dry language fool you: The award generated more controversy than any other in recent memory. One critic wrote that “it exposes how the economics discipline fails to ask critical questions.” Another called the trio’s work “historically inaccurate, if not ideologized.” I confess to emailing a long rant to colleagues in my department.
Why so much drama? In the work that won the Nobel, Acemoglu, Johnson, and Robinson—three highly respected economists almost universally referred to as “AJR” by their peers—argue that nations with democratic institutions have the most economic growth. It’s an appealing thesis that reaffirms the political systems of Western democracies. The problem—not just for the theory, but for the resilience of those political systems—is that it simply isn’t true.
According to AJR, certain kinds of economic institutions—private property, freedom of contract, a strong and impartial legal system, the freedom to form new businesses—are “inclusive,” meaning they allow most people to freely participate in the economy in a way that makes the best use of their skills. These institutions create modern, wealthy economies that are driven forward by technological innovation, not merely propped up by having won the natural-resources lottery.
Economic institutions do not just drop out of the sky. The laws and systems that make an economy run, such as market regulations, must be created and maintained by governments. AJR therefore argue that political institutions dictate economic outcomes. A nation’s political institutions are what determine who can rule, how these rulers are chosen, and how power is distributed across government. The institutions are enshrined in constitutions, electoral rules, and even traditions. In a dictatorship or monarchy, political power is narrowly distributed and relatively unconstrained. In such cases, AJR argue, a small ruling class will tend to use its power to restrict competition and extract wealth for itself. By contrast, if political power is widely distributed across diverse groups in a society, then their common interest in doing business, and competition among them, will result in prosperity-generating economic institutions.
In the United States, for instance, the oil industry would prefer not to have to compete with alternative energy. Old-stock citizens don’t want to compete with recent immigrants. But if the government is given the power to exclude my rivals, that same power could potentially be used against me. So, according to AJR’s thesis, our common desire to maximize our individual liberties and protect our property prevents us from ceding such power to the government. We’d rather keep the competition than risk getting shut out of the game. And that competition forces us to be ever more efficient.
In the end, AJR claim, decentralized democratic systems such as those found in the U.S., Germany, and Switzerland foster economic prosperity by improving a nation’s ability to innovate. Democracies with more centralized power are less productive. (Think of countries such as France, Portugal, and Greece, which have less separation of powers, fewer checks and balances, and relatively weak state and local governments.) Finally, one-party states and authoritarian regimes—those with even more centralized and less competitive political systems—breed stagnation.
[Gisela Salim-Peyer: Why does anyone care about the Nobel Prize?]
The most common critique of the AJR thesis hinges on methodological objections to the way in which they collected and analyzed the data. But you do not need a degree in economics or statistics to be skeptical of their argument. The real world simply provides too many counterexamples.
On the one hand, there are the nondemocratic systems whose economies have done quite well. To take the most obvious, China’s economy has grown fantastically since the late 1970s even as its political system has remained autocratic and repressive. It is now a global competitor or leader in electric vehicles, solar panels, quantum computing, biotechnology, mobile payment systems, artificial intelligence, 5G cellular, and nuclear-fusion research. South Korea and Taiwan are democracies now, but each grew from an agricultural backwater into a technological powerhouse from the ’60s to the ’90s, while they were relatively authoritarian states. The same goes for Japan. After World War II, Japan’s wealth grew dramatically as the country became a world leader in technology and manufacturing. At the time, it was a one-party state, with heavy government intervention in the economy. Elections were held regularly, but only the Liberal Democratic Party won. The party therefore controlled almost every major political institution throughout the country for nearly four decades, resulting in infamous, widespread corruption. Still, the economy flourished.
On the other hand, examples abound of decentralized democracies that don’t experience the kind of knowledge- and technology-driven growth that AJR celebrate. Canada is a relatively wealthy nation, but over the past several decades, without any change to its institutions, it has fallen into a prolonged productivity slump. AJR consider Australia, another decentralized democracy, to be an example of their theory in action, but its wealth comes overwhelmingly from natural resources; it is not particularly good at cutting-edge science and technology. Nor has decentralized democracy propelled Argentina, Brazil, Mexico, and India into the economic stratosphere. Great Britain was the world’s wealthiest and most technologically advanced superpower during the 18th and 19th centuries, but it lost that position even as its political institutions became more democratic and distributed.
A particularly revealing example is Spain. Spain has been institutionally transformed since 1975. It went from 40 years of military dictatorship to a market-oriented, decentralized democracy. Despite this revolution, there has been little relative change in Spain’s national innovation rate. Its economic growth per capita has averaged less than 1.5 percent per year since the dictatorship ended. That’s far worse than Ireland (a unitary democracy), Singapore (a “partly free” one-party democracy, according to Freedom House), and Vietnam (a communist dictatorship), each of which leveled up its economic competitiveness during the same time period.
AJR have responded to these critiques, often in point-by-point fashion. They have argued, for example, that China’s growth will be unsustainable in the long term if its political institutions don’t change. India, meanwhile, might be a decentralized democracy, but it is “highly patrimonial,” which “militates against the provision of public goods.” Arguments such as these are unfalsifiable: One can always explain away seemingly inconsistent results by pointing to some overlooked characteristic of a given country. And if a culture of patrimony has defeated otherwise-healthy institutions in India, then maybe institutions aren’t so fundamental after all.
If political institutions don’t explain economic outcomes, what does? My colleagues who study innovation and economic growth point to many possible answers, including culture, ideology, individual leadership, and geography. In my own research, I have found that countries with technology-driven, high-growth economies have one thing in common: a powerful sense of external threat. Some fear invasion; others worry about being cut off from a vital economic input, such as energy, food, or investment capital. Taiwan’s very existence is threatened by China, as was Israel’s after the Arab states began to unite against it. These two countries needed to build high-tech defense industries at home and earn money to purchase advanced weapons systems produced abroad.
Stagnant economies, by contrast, tend to be more focused on internal divisions. They are less concerned about military attacks or imports of essential goods and suffer more from deep conflicts over class, race, geography, or religion. If a nation’s internal threats are perceived to outweigh its external threats, then its people will fear the costs, risks, and redistribution of building a competitive economy. The pie is safe, so they fight over the relative size of their slice. (If I’m right, then Donald Trump’s focus on persecuting “the enemy from within” bodes ill for America’s economic prospects should he be reelected this week.) But if a nation’s sense of external threat is greater than its sense of internal threat, then it tends to invest in innovation. The costs and risks are worth it. The pie is in danger, so people cooperate to defend it.

The question of why some countries thrive and others stagnate isn’t just an academic one. In the late 1980s and early ’90s, under pressure from the West, Russia created a parliament and privatized state assets; the West declared victory. The U.S. invaded Iraq in 2003, set up a parliament and a stock market, and then announced “Mission accomplished.” The promised economic miracles never materialized.
[Brian Klaas: The dictator myth that refuses to die]
Western democracies too often assume that setting up the right institutions will bring about the desired outcomes. It doesn’t. A society uses institutions to accomplish its goals. But the goals come first, and they are determined by fights over wealth, power, resources, ideas, identities, culture, history, race, religion, and status.
How did such a demonstrably incorrect thesis win the Nobel Prize? In part, I suspect, because it tells the story that many educated Westerners want to hear about democracy. We want to live in a world where democracy cures all. But if democratic nations overpromise what democracy can achieve, they risk delegitimizing it. In Russia, China, and much of the Middle East, democracy is widely seen as dysfunctional, partly because it has not delivered the promised economic prosperity.
Ambitious politicians recognize that institutions are tools, not causal forces. In different hands, the same tools will achieve different ends. Men such as Hitler and Mussolini understood this. They exploited fundamental political divides to undermine both democracy and markets. These lessons are not lost on modern leaders such as Xi Jinping, Vladimir Putin, Recep Tayyip Erdoğan, Viktor Orbán, Narendra Modi, and even some here in the United States. We therefore ignore the more fundamental political forces at our peril. So do Nobel Prize winners.