The Celebrity Look-Alike Contest Boom

The Atlantic

www.theatlantic.com/technology/archive/2024/11/celebrity-look-alike-contest-boom/680742

The fad began with a Timothée Chalamet look-alike contest in New York City on a beautiful day last month. Thousands of people came and caused a ruckus. At least one of the Timothées was among the four people arrested by New York City police. Eventually, the real Timothée Chalamet showed up to take pictures with fans. The event, which was organized by a popular YouTuber who had recently received some attention for eating a tub of cheeseballs in a public park, captured lightning in a bottle. It didn’t even matter that the winner didn’t look much like the actor, or that the prize was only $50.

In the weeks since, similar look-alike contests have sprung up all over the country, organized by different people for their own strange reasons. There was a Zayn Malik look-alike contest in Brooklyn, a Dev Patel look-alike contest in San Francisco, and a particularly rowdy Jeremy Allen White look-alike contest in Chicago. Harry Styles look-alikes gathered in London, Paul Mescal look-alikes in Dublin. Zendaya look-alikes competed in Oakland, and a “Zendaya’s two co-stars from Challengers” look-alike contest will be held in Los Angeles on Sunday. As I write this, I have been alerted to plans for a Jack Schlossberg look-alike contest to be held in Washington, D.C., the same day. (Schlossberg is John F. Kennedy’s only grandson; he works at Vogue and was also profiled by Vogue this year.)

These contests evidently provide some thrill that people are finding irresistible at this specific moment in time. What is it? The chance to win some viral fame or even just positive online attention is surely part of it, but those returns are diminishing. The more contests there are, the less novel each one is, and the less likely it is to be worth the hassle. That Chalamet showed up to his look-alike contest was magic—he’s also the only celebrity to attend one of these contests so far. Yet the contests continue.

Celebrities have a mystical quality that’s undeniable, and it is okay to want to be in touch with the sublime. Still, some observers sense something a bit sinister behind the playfulness of contest after contest, advertised with poster after poster on telephone pole after telephone pole. The playwright Jeremy O. Harris wrote on X that the contests are “Great Depression era coded,” seeming to note desperation and a certain manic optimism in these events. The comparison is not quite right—although the people at these contests may not all have jobs, they don’t seem to be starving (one of the contests promised only two packs of cigarettes and a MetroCard as a prize)—but I understand what he’s getting at. Clearly, the look-alike competitions do not exist in a vacuum.

The startling multiplication of the contests reminds me of the summer of 2020, when otherwise rational-seeming people suggested that the FBI was planting caches of fireworks in various American cities as part of a convoluted psyop. There were just too many fireworks going off for anything else to make sense! So people said. With hindsight, it’s easy to recognize that theory as an expression of extreme anxiety brought on by the early months of the coronavirus pandemic. At the time, some were also feeling heightened distrust of law enforcement, which had in some places reacted to Black Lives Matter protests with violence.

Today’s internet-y stunts are just silly events, but people are looking for greater meaning in them. Over the past few weeks, although some have grown a bit weary of the contests, a consensus has also formed that they are a net good because they are bringing people out of their houses and into “third spaces” (public parks) and fraternity (“THE PEOPLE LONG FOR COMMUNITY”). This too carries a whiff of desperation, as though people are intentionally putting on a brave face and shoving forward symbols of our collective creativity and togetherness.

I think the reason is obvious. The look-alike contests, notably, started at the end of October. The first one took place on the same day as a Donald Trump campaign event at Madison Square Garden, which featured many gleefully racist speeches and was reasonably compared by many to a Nazi rally. The photos from the contests maybe serve as small reassurance that cities, many of which shifted dramatically rightward in the recent presidential election, are still the places that we want to believe they are—the closest approximation of America’s utopian experiment, where people of all different origins and experiences live together in relative peace and harmony and, importantly, good fun. At least most of the time.

Cher Has No Time for Nostalgia

The Atlantic

www.theatlantic.com/culture/archive/2024/11/cher-memoir-review/680726

File this under something that should have been self-evident: When it came time for the artist known as Cher to finish her memoir, she discovered she had too much material. Where to even begin? Decades before Madonna had reinventions and Taylor Swift had eras, Cher had comebacks—triumphs over decline in which she’d reemerge stronger, shinier, and more resolute than ever. “It’s a thousand times harder to come back than to become,” she writes in the first volume of her autobiography, titled—naturally—Cher. And yet something in her soul seems to always relish the challenge. A walking, singing eye roll, Cher has never met an obstacle without theatrically raising a middle finger. Consider the gown she wore to present at the Academy Awards in 1986 after having been snubbed for her performance in Peter Bogdanovich’s Mask: the cobwebbed, midsection-baring, black sequined supervillainess outfit that became known as her fuck the Oscars dress. Radiantly moody, she glowered her way right into awards-show history.

But much of that later timeline is for the second volume, supposedly arriving next year. Cher, which documents the four decades between her birth, in 1946, and the start of her serious acting career, in 1980, is concerned with the essentials: where she came from, who she is, all the incidents that helped her become one of music’s most indelible mononyms. I guarantee that, as you read, you’ll be able to conjure the sound of her voice in your mind, velvety and sonorous. (“You couldn't tell who was singing the baritone parts,” The New York Times noted in 1988 about “I Got You, Babe,” her duet with Sonny Bono, “but you had the disturbing feeling that it probably wasn't Sonny.”) And likely her face, too: her doll-like features, sphinxlike smile, and black, black hair. More than anything, though, Cher has come to stand for a brassy, strutting kind of survival over the years, and on this front, her memoir is awash in insight and rich in details.

Cher is a bracing read, peppered with caustic quips and self-effacing anecdotes, but fundamentally frank. This, you might agree, is no moment for nostalgia. (She does not—forgive the cheap gag—actually want to turn back time.) “Ours was a sad, strange story of Southern folk coming from nothing and carving out a life after the Great Depression,” Cher writes. “It wasn’t pretty and it was never easy … Resilience is in my DNA.” Her grandmother was 12 years old when she became pregnant with Cher’s mother, Jackie Jean; her grandfather Roy was a baker’s assistant turned bootlegger who beat his new wife, made his daughter sing for pennies on top of the bars he’d drink at, and once tried to murder both his children by leaving the gas stove on. For much of Cher’s infancy—she was born Cheryl Sarkisian but changed her name in 1978—she was raised by nuns, after her father abandoned her 20-year-old mother. Later, her mother, who had a muted acting career, cycled through seven or eight husbands and two illegal abortions that almost killed her. Although Jackie was a talented performer and luminously beautiful, “my mom missed out on several major acting roles because she refused to sleep with men who promised her a break,” Cher notes. The stepfather who was kindest to young Cher was also a nasty drunk, to the point where, even now, “I still can’t stand the sound of a belt coming out of pant loops.”

From early childhood, Cher was a dynamo—singing perpetually into a hairbrush, dancing around the house, and peeing her pants during a screening of Dumbo rather than miss any of the movie. She dreamed of being a star, and, less conventionally, of discovering a cure for polio. (“When Jonas Salk invented a vaccine, I was so pissed off,” she writes.) Because of her mother’s erratic relationships, she moved constantly, all over the country. By 15, she was living in Los Angeles, where she recounts being leered at by Telly Savalas in a photographer’s studio and spending a wild night or two with Warren Beatty. At 16, she met the man who’d become her partner in all senses of the word: a divorced, charming, slightly squirrelly 27-year-old named Sonny Bono. “He liked that I was quirky and nonjudgmental,” Cher writes. “I liked that he was funny and different. He was a grown-up without being too grown up, and I was a sixteen-year-old lying about my age.” Their relationship was platonic at first—when she found herself homeless, she moved in with him, the pair sleeping in twin beds next to each other like characters in a 1950s sitcom. One day, he kissed her, and that was that.

If Cher’s early life is a Steinbeckian saga of grim endurance, her life with Bono is a volatile scrapbook of life in 20th-century entertainment. Thanks to Bono’s connections with Phil Spector, she became a singer, performing backing vocals on the Righteous Brothers’ “You’ve Lost That Lovin’ Feelin’.” When Cher and Bono formed a duo and became wildly famous in 1965 with “I Got You, Babe,” the American musical establishment initially deemed her too outré in her bell bottoms and furs, and then—as the sexual revolution and rock music caught fire—too square. In her first flush of fame, the recently widowed Jackie Kennedy requested that Sonny & Cher perform for a private dinner party in New York. The fashion editor Diana Vreeland had Cher photographed for Vogue. At a party in his hotel suite, Salvador Dalí explained to her that an ornamental fish she was admiring was actually a vibrator. (“I couldn’t drop that fish fast enough.”) Having entrusted all the financial details of their partnership to Bono, she was stunned when he revealed that they owed hundreds of thousands in back taxes, right as their musical success was stalling.

[Read: What Madonna knows]

“Remembrance of things past is not necessarily the remembrance of things as they were,” Marcel Proust declared in In Search of Lost Time. Show-business memoirs can be gritty—Al Pacino’s Sonny Boy recounts a similarly bleak childhood—but I’m hard-pressed to think of another celebrity author so insistent on dispensing with rose-tinted reminiscences. Cher wants you to know that for most people—and absolutely for most women—the 20th century was no cakewalk. She loved Bono, and is the first to admit how enchanting their dynamic could be. But the partner she describes was controlling, vengeful (he reportedly burned her tennis clothes after he saw her talking to another man), and shockingly callous. When she left him, she discovered that her contract was one of “involuntary servitude”—he owned 95 percent of a company called Cher Enterprises, of which she was an employee who never received a paycheck. (His lawyer owned the other 5 percent.) Their divorce was finalized in 1975, a year or so after women were granted the right to apply for credit cards in their own names.

Promoting her book, Cher told CBS Sunday Morning, “I didn’t want to give information, ’cause you could go to Wikipedia [for that]. I just wanted to tell stories.” And she does, but in a form that can’t help doubling as a broader history—an account of all the things women have suffered through (casting couches, financial ruin, humiliating public scrutiny) and fought for (authority over their own bodies). Unlike her mother, Cher was, via carefully coded language, offered a legal abortion in her doctor’s office in 1975, during a period when her life was in flux. (Her second husband, the musician Gregory Allman, was addicted to heroin and had deserted her; she was about to return to work on her CBS variety show, also titled Cher.) “I needed to be at work on Monday,” she remembers. “I needed to be singing and dancing. I had a child, mother, and sister to take care of. I knew I had to make a choice, and I knew what it was. It made it harder that I didn’t have Gregory to talk to about it, but I made my decision and I was so grateful to my doctor’s compassion for giving me one.” (Cher and Bono's son, Chaz Bono, had been born in 1969. By 1976, Cher and Allman had reconciled, and Cher gave birth to Elijah Blue Allman.)

Gratitude. Compassion. Choice. What is resilience reliant on if not all three? We have to wait for book two for Cher’s account of her ups and downs in the ’80s and ’90s—her new acting career, her Best Actress Oscar for Moonstruck, her turn to infomercials for income after a severe bout of chronic fatigue syndrome, her auto-tuned path with “Believe” to one of the best-selling pop singles of all time. But in Cher, she offers a persuasive, wry, rousing account of what made her, and what she was able to make in turn. “I’ve always thought that whether you get a break or not is purely down to luck,” she writes, adding, “These were the key moments that changed my luck.” But that read of things understates her sheer force of will—her outright refusal, as with the Oscars dress, to ever be counted out.

Is Wokeness One Big Power Grab?

The Atlantic

www.theatlantic.com/ideas/archive/2024/11/musa-al-gharbi-wokeness-elite/680347

In his 2023 Netflix comedy special, Selective Outrage, Chris Rock identified one of the core contradictions of the social-justice era: “Everybody’s full of shit,” Rock said, including in the category of “everybody” people who type “woke” tweets “on a phone made by child slaves.”

I was reminded of that acerbic routine while reading Musa al-Gharbi’s new book, We Have Never Been Woke. Al-Gharbi, a 41-year-old sociologist at Stony Brook University, opens with the political disillusionment he experienced when he moved from Arizona to New York. He was immediately struck by the “racialized caste system” that everyone in the big liberal city seems to take “as natural”: “You have disposable servants who will clean your house, watch your kids, walk your dogs, deliver prepared meals to you.” At the push of a button, people—mostly hugely underpaid immigrants and people of color—will do your shopping and drive you wherever you want to go.

He contrasts that with the “podunk” working-class environment he’d left behind, where “the person buying a pair of shoes and the person selling them are likely to be the same race—white—and the socioeconomic gaps between the buyer and the seller are likely to be much smaller.” He continues: “Even the most sexist or bigoted rich white person in many other contexts wouldn’t be able to exploit women and minorities at the level the typical liberal professional in a city like Seattle, San Francisco, or Chicago does in their day-to-day lives. The infrastructure simply isn’t there.” The Americans who take the most advantage of exploited workers, he argues, are the same Democratic-voting professionals in progressive bastions who most “conspicuously lament inequality.”

[Read: The blindness of elites]

Al-Gharbi sees the reelection of Donald Trump as a reflection of Americans’ resentment toward elites and the “rapid shift in discourse and norms around ‘identity’ issues” that he refers to as the “Great Awokening.” To understand what’s happening to American politics, he told me, we shouldn’t look to the particulars of the election—“say, the attributes of Harris, how she ran her campaign, inflation worries, and so on”—but rather to this broader backlash. All of the signs were there for elites to see if only they’d bothered to look.

One question We Have Never Been Woke sets out to answer is why elites are so very blind, including to their own hypocrisy. The answer al-Gharbi proposes is at once devastatingly simple and reaffirmed everywhere one turns: Fooled by superficial markers of their own identity differences—racial, sexual, and otherwise—elites fail to see themselves for what they truly are.

“When people say things about elites, they usually focus their attention on cisgender heterosexual white men” who are “able-bodied and neurotypical,” al-Gharbi told me, in one of our conversations this fall. Most elites are white, of course, but far from all. And elites today, he added, also “increasingly identify as something like disabled or neurodivergent, LGBTQ.” If you “exclude all of those people from analysis, then you’re just left with this really tiny and misleading picture of who the elites are, who benefits from the social order, how they benefit.”

Sociologists who have studied nonwhite elites in the past have tended to analyze them mainly in the contexts of the marginalized groups from which they came. E. Franklin Frazier’s 1955 classic, Black Bourgeoisie, for example, spotlighted the hypocrisy and alienation of relatively prosperous Black Americans who found themselves doubly estranged: from the white upper classes they emulated as well as from the Black communities they’d left behind. By analyzing nonwhites and other minorities as elites among their peers, al-Gharbi is doing something different. “Elites from other groups are often passed over in silence or are explicitly exempted from critique (and even celebrated!),” he writes. And yet, “behaviors, lifestyles, and relationships that are exploitative, condescending, or exclusionary do not somehow become morally noble or neutral when performed by members of historically marginalized or disadvantaged groups.”

When al-Gharbi uses the word elite, he is talking about the group to which he belongs: the “symbolic capitalists”—broadly speaking, the various winners of the knowledge economy who do not work with their hands and who produce and manipulate “data, rhetoric, social perceptions and relations, organizational structures and operations, art and entertainment, traditions and innovations.” These are the people who set the country’s norms through their dominance of the “symbolic economy,” which consists of media, academic, cultural, technological, legal, nonprofit, consulting, and financial institutions.  

Although symbolic capitalists are not exactly the same as capitalist capitalists, or the rest of the upper class that does not need to work for its income, neither are they—as graduate students at Columbia and Yale can be so eager to suggest—“the genuinely marginalized and disadvantaged.” The theorist Richard Florida has written about a group he calls the “creative class,” which represents 30 percent of the total U.S. workforce, and which overlaps significantly with al-Gharbi’s symbolic capitalists. Using survey data from 2017, Florida calculated that members of that creative class earned twice as much over the course of the year as members of the working class—an average of $82,333 versus $41,776, respectively.

Symbolic capitalists aren’t a monolith, but it is no secret that their ruling ideology is the constellation of views and attitudes that have come to be known as “wokeness,” which al-Gharbi defines as beliefs about social justice that “inform how mainstream symbolic capitalists understand and pursue their interests—creating highly novel forms of competition and legitimation.”

Al-Gharbi’s own path is emblematic of the randomness and possibility of membership in this class. The son of military families on both sides, one Black and one white, he attended community college for six years, “taking classes off and on while working,” he told me. There he was lucky to meet a talented professor, who “basically took me under his wing and helped me do something different,” al-Gharbi said. Together, they focused on private lessons in Latin, philosophy, and classics—subjects not always emphasized in community college.

Around that time he was also going on what he calls “this whole religious journey”: “I initially tried to be a Catholic priest, and then I became an atheist for a while, but I had this problem. I rationally convinced myself that religion was bullshit and there is no God, but I couldn’t make myself feel it.” Then he read the Quran and “became convinced that it was a prophetic work. And so I was like, Well, if I believe that Muhammad is a prophet and I believe in God, that’s the two big things. So maybe I am a Muslim.” Soon after, he changed his name. Then, just when he was getting ready to transfer out of community college, his twin brother, Christian, was killed on deployment in Afghanistan. He chose to go somewhere close to his grieving family, the University of Arizona, to finish his degree in Near-Eastern studies and philosophy.

The same dispassionate analysis that he applies to his own life’s progress he brings to bear on America’s trends, especially the Great Awokening. He traces that widespread and sudden movement in attitudes not to the death of Trayvon Martin or Michael Brown, nor to Black Lives Matter or the #MeToo movement, nor to the election of Donald Trump, but to September 2011 and the Occupy Wall Street movement that emerged from the ashes of the financial crisis.

“In reality, Occupy was not class oriented,” he argues. By focusing its critique on the top 1 percent of households, which were overwhelmingly white, and ignoring the immense privilege of the more diverse symbolic capitalists just beneath them, the movement, “if anything, helped obscure important class differences and the actual causes of social stratification.” This paved the way for “elites who hail from historically underrepresented populations … to exempt themselves from responsibility for social problems and try to deflect blame onto others.”

[Read: The 9.9 percent is the new American aristocracy]

Al-Gharbi is neither an adherent of wokeism nor an anti-woke scold. He would like to both stem the progressive excesses of the summer of 2020, a moment when white liberals “tended to perceive much more racism against minorities than most minorities, themselves, reported experiencing,” and see substantive social justice be achieved for everyone, irrespective of whether they hail from a historically disadvantaged identity group or not. The first step, he argues, is to dispel the notion that the Great Awokening was “some kind of unprecedented new thing.”

Awokenings, in al-Gharbi’s telling, are struggles for power and status in which symbolic capitalists, often instinctively and even subconsciously, leverage social-justice discourse not on behalf of the marginalized but in service of their own labor security, political influence, and social prestige. He does not see this as inherently nefarious—indeed, like Tocqueville and many others before him, he recognizes that motivated self-interest can be the most powerful engine for the common good. Al-Gharbi argues that our current Awokening, which peaked in 2021 and is now winding down, is really the fourth such movement in the history of the United States.

The first coincided with the Great Depression, when suddenly “many who had taken for granted a position among the elite, who had felt more or less entitled to a secure, respected, and well-paying professional job, found themselves facing deeply uncertain futures.”

The next would take place in the 1960s, once the radicals of the ’30s were firmly ensconced within the bourgeoisie. “The driver was not the Vietnam War itself,” al-Gharbi stresses. That had been going on for years without protest. Nor was the impetus the civil-rights movement, gay liberation, women’s liberation, or any such cause. “Instead, middle-class students became radical precisely when their plans to leave the fighting to minorities and the poor by enrolling in college and waiting things out began to fall through,” he argues. “It was at that point that college students suddenly embraced anti-war activism, the Black Power movement, feminism, postcolonial struggles, gay rights, and environmentalism in immense numbers,” appropriating those causes for their own gain.

If this sounds familiar, it should. The third Awokening was smaller and shorter than the others, stretching from the late ’80s to the early ’90s, and repurposing and popularizing the Marxist term political correctness. Its main legacy was to set the stage for the fourth—and present—Awokening, which has been fueled by what the scholar Peter Turchin has termed “elite overproduction”: Quite simply, America creates too many highly educated, highly aspirational young people, and not enough high-status, well-paid jobs for them to do. The result, al-Gharbi writes, is that “frustrated symbolic capitalists and elite aspirants [seek] to indict the system that failed them—and also the elites that did manage to flourish—by attempting to align themselves with the genuinely marginalized and disadvantaged.” It is one of the better and more concise descriptions of the so-called cancel culture that has defined and bedeviled the past decade of American institutional life. (As Hannah Arendt observed in The Origins of Totalitarianism, political purges often serve as jobs programs.)  

The book is a necessary corrective to the hackneyed discourse around wealth and privilege that has obtained since 2008. At the same time, al-Gharbi’s focus on symbolic capitalists leaves many levers of power unexamined. Whenever I’m in the company of capitalist capitalists, I’m reminded of the stark limitations of the symbolic variety. Think of how easily Elon Musk purchased and then destroyed that vanity fair of knowledge workers formerly known as Twitter. While some self-important clusters of them decamped to Threads or Bluesky to post their complaints, Musk helped Trump win the election. His PAC donated $200 million to the campaign, while Musk served as Trump’s hype man at rallies and on X. Trump has since announced that Musk will be part of the administration itself, co-leading the ominously named Department of Government Efficiency.

Al-Gharbi’s four Great Awokenings framework can sometimes feel too neat. In a review of We Have Never Been Woke in The Wall Street Journal, Jonathan Marks points out a small error in the book. Al-Gharbi relies on research by Richard Freeman to prove that a bust in the labor market for college graduates ignited the second Awokening. But al-Gharbi gets the date wrong: “Freeman’s comparison isn’t between 1958 and 1974. It’s between 1968 and 1974”—too late, Marks argued, to explain what al-Gharbi wants it to explain. (When I asked al-Gharbi about this, he acknowledged the mistake on the date but insisted the point still held: “The thing that precipitated the massive unrest in the 1960s was the changing of draft laws in 1965,” he said. “A subsequent financial crisis made it tough for elites to get jobs, ramping things up further.” He argued it was all the same crisis: an expanding elite “growing concerned that the lives and livelihoods they’d taken for granted are threatened and may, in fact, be out of reach.”)

Despite such quibbles, al-Gharbi’s framework remains a powerful one. By contrasting these periods, al-Gharbi stressed to me, we can not only understand what is happening now but also get a sense of the shape of wokenesses to come. As he sees it, “the way the conversation often unfolds is just basically saying wokeness is puritanism or religion.” Critics “think Puritanism sucks, or religion sucks,” he continued. But just saying that “wokeness is bad” is not “super useful.”

Indeed, one of the primary reasons such anti-woke reactions feel so unsatisfactory is that wokeness, not always but consistently, stems from the basic recognition of large-scale problems that really do exist. Occupy Wall Street addressed the staggering rise of inequality in 21st-century American life; Black Lives Matter emerged in response to a spate of reprehensible police and vigilante killings that rightfully shocked the nation’s conscience; #MeToo articulated an ambient sexism that degraded women’s professional lives and made us consider subtler forms of exploitation and abuse. The self-dealing, overreach, and folly that each of these movements begat do not absolve the injustices they emerged to address. On the contrary, they make it that much more urgent to deal effectively with these ills.

[Musa al-Gharbi: Police punish the ‘good apples’]

Any critique of progressive illiberalism that positions the latter as unprecedented or monocausal—downstream of the Civil Rights Act, as some conservatives like to argue—is bound not only to misdiagnose the problem but to produce ineffective or actively counterproductive solutions to it as well. Wokeness is, for al-Gharbi, simply the way in which a specific substratum of elites “engage in power struggles and struggles for status,” he said. “Repealing the Civil Rights Act or dismantling DEI or rolling back Title IX and all of that will not really eliminate wokeness.”

Neither will insisting that its adherents must necessarily operate from a place of bad faith. In fact, al-Gharbi believes it is the very sincerity of their belief in social justice that keeps symbolic capitalists from understanding their own behavior, and the counterproductive social role they often play. “It’s absolutely possible for someone to sincerely believe something,” al-Gharbi stressed, “but also use it in this instrumental way.”

Having been born into one minority group and converted to another as an adult, al-Gharbi has himself accrued academic pedigree and risen to prominence, in no small part, by critiquing his contemporaries who flourished during the last Great Awokening. He is attempting to outflank them, too, aligning himself even more fully with the have-nots. Yet his work is permeated by a refreshing awareness of these facts. “A core argument of this book is that wokeness has become a key source of cultural capital among contemporary elites—especially among symbolic capitalists,” he concedes. “I am, myself, a symbolic capitalist.”

The educated knowledge workers who populate the Democratic Party need more of this kind of clarity and introspection. Consider recent reports that the Harris campaign declined to appear on Joe Rogan’s podcast in part out of concerns that it would upset progressive staffers, who fussed over language and minuscule infractions while the country lurched toward authoritarianism.

Al-Gharbi’s book’s title is drawn from Bruno Latour’s We Have Never Been Modern, which famously argued for a “symmetrical anthropology” that would allow researchers to turn the lens of inquiry upon themselves, subjecting modern man to the same level of analytical rigor that his “primitive” and premodern counterparts received. What is crucial, al-Gharbi insists, “is not what’s in people’s hearts and minds.” Rather the question must always be: “How is society arranged?” To understand the inequality that plagues us—and then to actually do something about it—we are going to have to factor in ourselves, our allies, and our preferred narratives too. Until that day, as the saying about communism goes, real wokeness has never even been tried.