Itemoids

United States

How Russia outmaneuvered the US in Africa

CNN

www.cnn.com › 2023 › 01 › 31 › opinions › russia-africa-wagner-lavrov-putin-us-davis › index.html

Russia seems to be outmaneuvering the United States in Africa. In recent days, Russian Foreign Minister Sergei Lavrov underscored that stark reality as he wined and dined his way through a tour of four African capitals.

The Internet Loves an Extremophile

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 01 › internet-youtube-podcast-guru-influencers-andrew-tate › 672867

On YouTube, a British influencer named Tom Torero was once the master of “daygame”—a form of pick-up artistry in which men approach women on the street. “You’ll need to desensitise yourself to randomly chatting up hot girls sober during the day,” Torero wrote in his 2018 pamphlet, Beginner’s Guide to Daygame. “This takes a few months of going out 3-5 times a week and talking to 10 girls during each session.”

Torero promised that his London Daygame Model—its five stages were open, stack, vibe, invest, and close—could turn any nervous man into a prolific seducer. This made him a hero to thousands of young men, some of whom I interviewed when making my recent BBC podcast series, The New Gurus. One fan described him to me as  “a free spirit who tried to help people,” and “a shy, anxious guy who reinvented himself as an adventurer.” To outsiders, though, daygame can seem unpleasantly clinical, with its references to “high-value girls,” and even coercive: It includes strategies for overcoming “LMR,” which stands for “last-minute resistance.” In November 2021, Newsweek revealed that Torero was secretly recording his dates—including the sex—and sharing the audio with paying subscribers to his website. Torero took down his YouTube channel, although he had already stopped posting regularly.

This was the narrative I had expected to unravel—how a quiet, nerdy schoolteacher from Wales had built a devoted following rooted in the backlash to feminism. Instead, I found a more surprising story: Tom Torero was what I’ve taken to calling an “extremophile,” after the organisms that carve out an ecological niche in deserts, deep-ocean trenches, or highly acidic lakes. He was attracted to extremes. Even while working in an elementary school, he was doing bungee jumps in Switzerland.

As churchgoing declines in the United States and Britain, people are turning instead to internet gurus, and some personality types are particularly suited to thriving in this attention economy. Look at the online preachers of seduction, productivity, wellness, cryptocurrency, and the rest, and you will find extremophiles everywhere, filling online spaces with a cacophony of certainty. Added to this, the algorithms governing social media reward strong views, provocative claims, and divisive rhetoric. The internet is built to enable extremophiles.

In his daygame videos and self-published books, Tom recounted a familiar manosphere backstory of being bullied by his male peers and friend-zoned by girls. But that wasn’t the whole picture. While doing my research, I received a message from Tom’s ex-wife. (In the podcast, we called her Elizabeth, a pseudonym, because she feared reprisals from his fans.) Elizabeth said she had been at university with Tom Ralis—his birth name—at the turn of the century. They’d met in the choir. He was “quite tall, and quite gawky … he had a kind of lopsided grin and he was sort of cheery and chirpy and wanted to make people laugh,” she told me. Elizabeth was a music student, and she was—unusual for Britain—a follower of the Greek Orthodox faith. How funny, Tom had said. He was interested in that religion too. But he didn’t expect to become her boyfriend. He was happy just to be friends.

[Read: To learn about the far right, start with the ‘manosphere’]

When Elizabeth’s father had a car accident, though, Tom started love bombing her. He turned up at her room in college with tea bags and biscuits, and told her that he did in fact want to date her. This proposal came with an implicit threat: “If I wouldn’t be with him, he would disappear,” she told me. “And the way that he talked about it … there was a kind of threat of suicide, that he would kill himself if I wouldn’t be with him.”

Confused, worried, and under pressure, Elizabeth said she “let him take over.” She began to date Tom, and they got married while still at university. Then, she recounted, they moved to a Greek island, where Elizabeth taught English, and Tom, who had started dressing all in black, went on a pilgrimage to Mount Athos—an Orthodox monastery that bans women and even female animals to maintain its purity. When he returned, Elizabeth said, Tom announced that he wanted to become a monk.

I was surprised by this revelation: The man who became famous for teaching seduction had considered a vow of celibacy? But to Elizabeth, the announcement made perfect sense. When she first met Tom, he was a biology student who “hero-worshipped” the geneticist and atheist Richard Dawkins, she said, before he became “disillusioned with science and rationalism.” The common thread between all of these different Toms—Ralis and Torero; ardent atheist, wannabe monk, and YouTube pick-up artist—was a psychological need, a desire to be respected, to be listened to, to be a preacher. It was the role he wanted. The subject matter that he preached about came second.

[Read: Am I being love bombed? Are you?]

Not every internet guru follows this pattern. Some influencers have developed a genuine interest in a single topic and decided to make it into a career. But many other corners of the internet are full of serial enthusiasts who have pinballed from one ideology to another, believing in each one deeply as they go. These flexible evangelists are perfectly suited to becoming online gurus. They believe, and they need to preach—and because of the lack of gatekeeping on social media, the most talented talkers can easily find an audience online.

Andrew Tate is another extremophile. The misogynist influencer, a former kickboxer and reality-show contestant, used to describe himself as an atheist, but he announced last year that he had converted to Islam because—as one interviewer, the British rapper Zuby, summarized Tate’s view—“Christianity is kinda cucked.” Once Tate decided that God exists—which he had deduced because evil exists, and therefore so must its opposite—it was important to him to find the religion he deemed the most hard-core. (After all, a man who keeps swords in his house could not have become a mild-mannered Episcopalian.) On the other side of the gender divide, Mikhaila Peterson, a second-generation influencer who became known for advocating a “lion diet” as a cure for immune conditions, revealed in 2021 that she had found God through taking psychedelics. She now talks about religion healing her soul with the same intensity that she speaks about her all-meat diet healing her body.

Shortly after Tom Ralis returned from Mount Athos, Elizabeth escaped the Greek island, and their marriage. When they divorced in 2006, YouTube was in its infancy. Throughout the 2010s, she would search for him online occasionally, and she watched him develop his daygame model. It was like the love-bombing technique he had used on her but condensed from several months into a single date. In December 2021, she discovered from a text message sent by a mutual friend that Tom had taken his own life. He had often spoken of his experience with depression, but his death still shocked her. In April last year, several of his online friends organized a tribute in London, and talked about Torero’s effect on their lives. He had successfully become the secular online version of a preacher—a YouTube guru.

Tom Torero wanted to be an authority figure, and he found the cultural script that best fulfilled his needs. On my journey through the gurusphere, I encountered many stories like his. Take Maajid Nawaz, whom The New York Times anointed a member of the “Intellectual Dark Web” in 2018. Before becoming famous as a heterodox public intellectual, Nawaz had been jailed in Egypt for four years in the early 2000s for being a member of the Islamist group Hizb-ut-Tahrir. After renouncing that ideology, he became an antiextremism adviser to then-Conservative Prime Minister David Cameron, and at the same time stood as a candidate for Britain’s centrist party, the Liberal Democrats. Having failed in politics, Nawaz became a talk-radio host and was radicalized again, this time into COVID denialism. He left the broadcaster LBC in January 2022 after claiming that mandatory vaccination was “a global palace coup” by “fascists who seek the New World Order.”

[Cynthia Miller-Idriss: Extremism has spread into the mainstream]

Nawaz is, I would argue, another extremophile. This 2015 description of him by The Guardian could just as easily apply to Tom Torero: “Nawaz’s powers of verbal persuasion are something even his detractors concede. There’s a strong line to take in every answer. But equally, there’s very little sense of being open to persuasion himself.” Unlike most of us, with our needling doubts and fumbling hesitation, extremophiles are fervent in whatever their current belief is. And they want to tell other people about it.

For this reason, extremophiles have always made particularly good op-ed columnists—and now podcasters and YouTubers. The Hitchens brothers are a traditional example: Christopher was a Trotskyist as a young man, yet he became a supporter of the ultimate establishment project, the Iraq War. Peter moved from socialism to social conservatism, and has used his Mail on Sunday column to oppose strict COVID policies. Their analogue in the social-media age is James Lindsay. He believes that America is under threat from a Marxist-pedophile alliance, and he frequently collaborates with the Christian Nationalist Michael O’Fallon. But Lindsay first entered public life in the 2010s, writing books in support of New Atheism. At that time, he saw himself on the left. Although his middle name is Stephen, he told me that he wrote his atheist books as “James A. Lindsay” to deflect any backlash from the conservative community where he lived. As far as he is concerned, he has always been a rebel against the prevailing political climate.

Not everyone with an internet following is an extremophile. Someone like Russell Brand, a left-wing British comedian and actor now dabbling in anti-vax rhetoric and conspiracy theories about shadowy elites “concretizing global power,” strikes me as having a different psychological makeup. He is merely a heat-seeking missile for attention. His mirror image on the right is Dave Rubin, a gay man who has built a fan base among social conservatives opposed to homosexuality, as well as a Trumpist who—sensing the wind changing—recently boasted about attending the inauguration of Florida Governor Ron DeSantis.

Extremophiles are more like the sociologist Eric Hoffer’s “true believers,” the people who fuel mass movements. “The opposite of the religious fanatic is not the fanatical atheist but the gentle cynic who cares not whether there is a God or not,” Hoffer wrote in 1951. Hoffer’s formulation reminded me of a friend telling me about a mutual acquaintance who had been in two cults. I felt like Oscar Wilde’s Lady Bracknell: To be in one cult may be regarded as a misfortune; to join two looks like carelessness. Or think about the Mitford sisters, the quintessential English aristocrats of the early 20th century. As children, Unity was a fascist, and Decca was a communist. Their childhood sitting room was divided down the middle; one side had copies of Der Stürmer and Mein Kampf; the other had hammers and sickles. The only point of political agreement between the two girls was that the mere conservatives and liberals who visited the house were boring.

My journey reporting on the gurusphere has led me to confront my own extremophile tendencies. After being raised Catholic, I became interested in New Atheism in the 2000s, because it was a countercultural phenomenon. Like pretty much everyone else, I would argue that my political beliefs are all carefully derived from first principles. But the ones that I choose to write about publicly are clearly influenced by my own self-image as an outsider and a contrarian. Being self-aware about that helps me remember that my fear of normiedom has to be kept in check, because the conventional wisdom is often right.

Researchers of extremism are now studying its psychological causes as keenly as they are its political ones. “Psychological distress—defined as a sense of meaninglessness that stems from anxious uncertainty—stimulates adherence to extreme ideologies,” wrote the authors of a 2019 paper on the topic. Many people become radicalized through “a quest for significance—the need to feel important and respected by supporting a meaningful cause.” The COVID pandemic was so radicalizing because one single highly conspicuous issue presented itself at exactly the same time that many people were bored, lonely, and anxious. Cults usually try to isolate their followers from their social-support networks; during the pandemic, people did that all by themselves.

The extremophile model helps us make sense of political journeys that are otherwise baffling to us, like the monastery-to-pick-up-artist pipeline. We might be tempted to ask: Who was the real Tom Torero—atheist bro, aspirant monk, or master seducer? The answer is: all of them. He was a true believer, just not a monogamous one.

Are American Men Finally Rejecting Workism?

The Atlantic

www.theatlantic.com › newsletters › archive › 2023 › 01 › american-rich-men-work-less-hours-workism › 672895

This is Work in Progress, a newsletter by Derek Thompson about work, technology, and how to solve some of America’s biggest problems. Sign up here to get it every week.

One of the weirdest economic stories of the past half century is what happened to rich Americans—and especially rich American men—at work.

In general, poor people work more than wealthy people. This story is consistent across countries (for example, people in Cambodia work much more than people in Switzerland) and across time (for example, Germans in the 1950s worked almost twice as much as they do today).

But starting in the 1980s in the United States, this saga reversed itself. The highest-earning Americans worked longer and longer hours, in defiance of expectations or common sense. The members of this group, who could have bought anything they wanted with their wealth, bought more work. Specifically, from 1980 to 2005, the richest 10 percent of married men increased their work hours by more than any other group of married men: about five hours a week, or 250 hours a year.

In 2019, I called this phenomenon “workism.” In a time of declining religiosity, rich Americans seemed to turn to their career to fill the spiritual vacuum at the center of their life. For better or (very often) for worse, their desk had become their altar.

Since then, the concept of workism has been attached to a range of cultural and political phenomena, including declining fertility trends in the West. I’ve blamed workism for U.S. policies that resist national parental and sick leave because of an elite preference for maximizing the public’s attachment to the labor force.

Then the pandemic happened. I didn’t know how the forcible end of white-collar commutes and the demise of the default office would change affluent American attitudes. I assumed that remote work would make certain aspects of workism even more insidious. Researchers at Microsoft found that the boomlet in online meetings was pushing work into odd hours of the week, leading to more “just finishing up on email!” late nights, and Saturday mornings that felt like mini-Mondays. Working on our computer was always a “leaky” affair; with working from home and COVID, I feared the leak would become a flood.

But I was wrong. This year, Washington University researchers concluded that, since 2019, rich Americans have worked less. And less, and less. In a full reversal of the past 50 years, the highest-educated, highest-earning, and longest-working men reduced their working hours the most during the pandemic. According to the paper, the highest-earning 10 percent of men worked 77 fewer hours in 2022 than that top decile did in 2019—or 1.5 hours less each week. The top-earning women cut back by 29 hours. Notably, despite this reduction, rich people still work longer hours overall.

This analysis may have been thrown off by untrustworthy survey responses received during the chaos of the pandemic. But according to The Wall Street Journal, separate data from the Census Bureau back up that conclusion. From 2019 to 2021, married men reduced their workweek by a little more than an hour. Unmarried men had no similar decline.

So why are rich married men suddenly—and finally—reducing their working hours, by an unusual degree? Yongseok Shin, an economist at Washington University and a co-author of the paper, told me that he had “no doubt that this was a voluntary choice.” When I asked him if perhaps rich married men had worked less in dual-earner households to help with kids during the early pandemic period, he told me that their working hours continued falling in 2022, “long after the worst periods of school closures and issues with child-care centers.”

The title of the new paper is a bit misleading: “Where Are the Workers? From Great Resignation to Quiet Quitting.” The authors make frequent references to quiet quitting, the notion that workers in 2022 suddenly decided to reduce their collective ambition and effort. But their analysis doesn’t actually find anything like that. In the past three years, the median worker hardly reduced his or her hours. All of the decline in hours worked happened among the highest-earning Americans, with the longest workweeks. Is that an outbreak of quiet quitting? I’d say no. It’s more like the fever of workism is finally breaking among the most workaholic Americans.

“I think the pandemic has clearly reduced workaholism,” Shin told me. “And by the way, I think that’s a very positive thing for this country.”

I’m inclined to agree. In the years since I wrote the workism essay, I’ve toggled between two forms of writer’s guilt. Some days, I worry that I went too hard on people who are devoted to their job. If people can find solace and structure and a sense of control in their labor, who am I to tell them that they are suffering from an invisible misery by worshipping a false and marketized god?

But on other days, I think I wasn’t hard enough on workism, given how deeply it has insinuated itself into American values. The New York Times and Atlantic writer David Brooks has distinguished between what he calls “résumé virtues” and “eulogy virtues.” Résumé virtues are what people bring to the marketplace: Are they clever, devoted, and ambitious employees? Eulogy virtues are what they bring to relationships not governed by the market: Are they kind, honest, and faithful partners and friends?

Americans should prioritize eulogy virtues. But by our own testimony, we strongly prefer résumé virtues for ourselves and especially for our children. This year, Pew Research Center asked American parents: What accomplishments or values are most important for your children as they become adults? Nearly nine in 10 parents named financial security or “jobs or careers [our children] enjoy” as their top value. That was four times more than the share of parents who said it was important for their children to get married or have children; it was even significantly higher than the percentage of parents who said it’s extremely important for their kids to be “honest,” “ethical,” “ambitious,” or “accepting of people who are different.” Despite large differences among ethnicities in some categories, the primacy of career success was one virtue that cut across all groups.

I can’t read those survey results without thinking about the fact that teenage anxiety has been steadily rising for the past decade. Commentators sometimes blame a technological cocktail of smartphone use and social media for the psychological anguish of American youth. But perhaps a latent variable is the reverberation of workism in the next generation. These surveys suggest that everything society ought to consider bigger than work—family, faith, love, relationships, ethics, kindness—turns out to be secondary.

The message from American parents, in a century of economic instability, seems to be Your career is up here, and everything else is down there. Is there any scenario in which this is good for us? People can control their character in a way that they can’t control their lifetime earnings. In the ocean of the labor market, we’re all minnows, often powerless to shape our own destiny. It can’t be healthy for a society to convince its young people that professional success, the outcome of a faceless market, matters more to life than values such as human decency, which require only our own adherence.

I don’t know what will happen to workism in the next decade, but if rich American men are beginning to ease up on the idea that careerism is the tentpole of identity, the benefits could be immense—for their generation and the ones to come.

Office hours are back! Join Derek Thompson and special guests for conversations about the future of work, technology, and culture. The next session will be February 6. Register here and watch a recording anytime on The Atlantic’s YouTube channel.

Did George Washington Burn New York?

The Atlantic

www.theatlantic.com › magazine › archive › 2023 › 03 › george-washington-burn-new-york-great-fire-1776 › 672780

On July 9, 1776, General George Washington amassed his soldiers in New York City. They would soon face one of the largest amphibious invasions yet seen. If the British took the city, they’d secure a strategic harbor on the Atlantic Coast from which they could disrupt the rebels’ seaborne trade. Washington thus judged New York “a Post of infinite importance” and believed the coming days could “determine the fate of America.” To prepare, he wanted his men to hear the just-issued Declaration of Independence read aloud. This, he hoped, might “serve as a fresh incentive.”

But stirring principles weren’t enough. By the end of August, the British had routed Washington’s forces on Long Island and were preparing to storm Manhattan. The outlook was “truly distressing,” he confessed. Unable to hold the city—unable even to beat back disorder and desertion among his own dispirited men—Washington abandoned it. One of his officers ruefully wished that the retreat could be “blotted out of the annals of America.”

As if to underscore the loss, a little past midnight five days after the redcoats took New York on September 15, a terrible fire broke out. It consumed somewhere between a sixth and a third of the city, leaving about a fifth of its residents homeless. The conflagration could be seen from New Haven, 70 miles away.

New York’s double tragedy—first invaded, then incinerated—meant a stumbling start for the new republic. Yet Washington wasn’t wholly displeased. “Had I been left to the dictates of my own judgment,” he confided to his cousin, “New York should have been laid in Ashes before I quitted it.” Indeed, he’d sought permission to burn it. But Congress refused, which Washington regarded as a grievous error. Happily, he noted, God or “some good honest Fellow” had torched the city anyway, spoiling the redcoats’ valuable war prize.

For more than 15 years, the historian Benjamin L. Carp of Brooklyn College has wondered who that “honest fellow” might have been. Now, in The Great New York Fire of 1776: A Lost Story of the American Revolution, he cogently lays out his findings. Revolutionaries almost certainly set New York aflame intentionally, Carp argues, and they quite possibly acted on instructions. Sifting through the evidence, he asks a disturbing question: Did George Washington order New York to be burned to the ground?

The idea of Washington as an arsonist may seem far-fetched. Popular histories of the American Revolution treat the “glorious cause” as different from other revolutions. Whereas the French, Haitian, Russian, and Chinese revolutions involved mass violence against civilians, this one—the story goes—was fought with restraint and honor.

But a revolution is not a dinner party, as Mao Zedong observed. Alongside the parade-ground battles ran a “grim civil war,” the historian Alan Taylor writes, in which “a plundered farm was a more common experience than a glorious and victorious charge.” Yankees harassed, tortured, and summarily executed the enemies of their cause. The term lynch appears to have entered the language from Colonel Charles Lynch of Virginia, who served rough justice to Loyalists.

Burning towns was, of course, a more serious transgression. “It is a Method of conducting War long since become disreputable among civilized Nations,” John Adams wrote. The Dutch jurist Hugo Grotius, whose writings influenced European warfare, forbade killing women and children, and judged unnecessary violence in seizing towns to be “totally repugnant to every principle of Christianity and justice.”

Still, in the thick of war, the torch was hard to resist, and in North America, it was nearly impossible. Although Britain, facing a timber famine, had long since replaced its wooden buildings with brick and stone ones, the new United States was awash in wood. Its immense forests were, to British visitors, astonishing. And its ramshackle wooden towns were tinderboxes, needing only sparks to ignite.

On the eve of the Revolution, the rebel Joseph Warren gave a speech in a Boston church condemning the British military. Vexed British officers cried out “Oh! fie! Oh! fie!” That sounded enough like “fire” to send the crowd of 5,000 sprinting for the doors, leaping out windows, and fleeing down the streets. They knew all too well how combustible their city was.

The British knew it too, which raised the tantalizing possibility of quashing the rebellion by burning rebel towns. Although some officers considered such tactics criminal, others didn’t share their compunctions. At the 1775 Battle of Bunker Hill, they burned Charlestown, outside Boston, so thoroughly that “scarcely one stone remaineth upon another,” Abigail Adams wrote. The Royal Navy then set fire to more than 400 buildings in Portland, Maine (known then as Falmouth). On the first day of 1776, it set fires in Norfolk, Virginia; the city burned for three days and lost nearly 900 buildings.

Thomas Paine’s Common Sense appeared just days after Norfolk’s immolation. In it, Paine noted the “precariousness with which all American property is possessed” and railed against Britain’s reckless use of fire. As Paine appreciated, torched towns made the case for revolution pointedly. “A few more of such flaming Arguments as were exhibited at Falmouth and Norfolk” and that case would be undeniable, Washington agreed. The Declaration of Independence condemned the King for having “burnt our towns.”

In Norfolk, however, the King had help. After the British lit the fires, rebel Virginia soldiers kept them going, first targeting Loyalist homes but ultimately kindling a general inferno. “Keep up the Jigg,” they cried as the buildings burned. From a certain angle, this made sense: The fire would deny the Royal Navy a port, and the British would take the blame. In early February a revolutionary commander, Colonel Robert Howe, finished the job by burning 416 remaining structures. The city is “entirely destroyed,” he wrote privately. “Thank God for that.”

A year later, the Virginia legislature commissioned an investigation, which found that “very few of the houses were destroyed by the enemy”—only 19 in the New Year’s Day fire—whereas the rebels, including Howe, had burned more than 1,000. That investigation’s report went unpublished for six decades, though, and even then, in 1836, it was tucked quietly into the appendix of a legislative journal. Historians didn’t understand who torched Norfolk until the 20th century.

This was presumably by design: The Revolution required seeing the British as incendiaries and the colonists as their victims. Washington hoped that Norfolk’s ashes would “unite the whole Country in one indissoluble Band.”

Carp believes that what happened in Norfolk happened in New York. But how to square that with Washington’s renowned sense of propriety? The general detested marauding indiscipline among his men. Toward enemy prisoners, he advocated “Gentleness even to Forbearance,” in line with the “Duties of Humanity & Kindness.” And he deemed British-set fires “Savage Cruelties” perpetrated “in Contempt of every Principle of Humanity.” Is it thinkable that he disobeyed orders and set a city full of civilians aflame?

It becomes more thinkable if you look at another side of the war, Carp notes. In popular memory, the Revolutionary War was between colonists and redcoats, with some French and Hessians pitching in. But this version leaves out the many Native nations that also fought, mostly alongside the British. The Declaration of Independence, after charging the King with arson, indicted him for unleashing “merciless Indian Savages, whose known rule of warfare is an undistinguished destruction of all ages, sexes and conditions.”

[From the May 2022 issue: Daniel Immerwahr reviews a new history of World War II]

This accusation—that Indigenous people fought unfairly—haunted discussions of war tactics. Redcoat attacks on American towns fed the revolutionary spirit precisely because they delegitimized the British empire, whose methods, John Adams wrote, were “more abominable than those which are practiced by the Savage Indians.”

Perhaps, but Adams’s compatriots, at least when fighting Indians, weren’t exactly paragons of enlightened warfare. A month after the Declaration of Independence complained about burned towns and merciless savages, the revolutionaries launched a 5,500-man incendiary expedition against the British-allied Cherokees, targeting not warriors but homes and food. “I have now burnt down every town and destroyed all the corn,” one commander reported.

This was hitherto the “largest military operation ever conducted in the Lower South,” according to the historian John Grenier. Yet it’s easily overshadowed in popular accounts by more famous encounters. The Pulitzer Prize–winning writer Rick Atkinson, in his painstakingly detailed, 800-page military history of the war’s first two years, The British Are Coming, spends just a paragraph on it. The Cherokee campaign was, Atkinson writes, a mere “postscript” to Britain’s short and unsuccessful siege of Charleston (even though, by Atkinson’s own numbers, it killed roughly 10 times as many as the Charleston siege did).

But the Cherokee campaign was important, not only for what it did to the Cherokees but for what it revealed about the revolutionaries. Washington brandished it as proof of how far his men were willing to go. The Cherokees had been “foolish” to support the British, he wrote to the Wolastoqiyik and Passamaquoddy peoples, and the result was that “our Warriors went into their Country, burnt their Houses, destroyed their corn and obliged them to sue for peace.” Other tribes should take heed, Washington warned, and “never let the King’s wicked Counselors turn your hearts against me.”

Indigenous people did turn their hearts against him, however, and the fighting that followed scorched the frontier. In one of the war’s most consequential campaigns, Washington ordered General John Sullivan in 1779 to “lay waste all the settlements” of the British-aligned Haudenosaunees in New York, ensuring that their lands were “not merely overrun but destroyed.” Sullivan complied. “Forty of their towns have been reduced to ashes—some of them large and commodious,” Washington observed. He commended Sullivan’s troops for a “perseverance and valor that do them the highest honor.”

It’s hard, looking from Indian Country, to see Washington—or any of the revolutionaries—as particularly restrained. In the 1750s, the Senecas had given him the name “Conotocarious,” meaning “town taker” or “town destroyer,” after the title they’d bestowed on his Indian-fighting great-grandfather. Washington had occasionally signed his name “Conotocarious” as a young man, but he fully earned it destroying towns during the Revolutionary War. “To this day,” the Seneca chief Cornplanter told him in 1790, “when that name is heard, our women look behind them and turn pale, and our children cling close to the neck of their mothers.”

Carp acknowledges but doesn’t linger over what the revolutionaries did on the frontier. As he shows, there’s enough evidence from Manhattan itself to conclude that the New York conflagration was intentional.

To start, this was perhaps the least surprising fire in American history. Rumors swirled through the streets that it would happen, and Washington’s generals talked openly of the possibility. The president pro tempore of New York’s legislature obligingly informed Washington that his colleagues would “chearfully submit to the fatal Necessity” of destroying New York if required. The fire chief buried his valuables in anticipation.

When the expected fire broke out, it seemed to do so everywhere simultaneously. Those watching from afar “saw the fire ignite in three, four, five, or six places at once,” Carp notes. He includes a map showing 15 distinct “ignition points,” where observers saw fires start or found suspicious caches of combustibles. The fire could have begun in just one place and spread by wind-borne embers, but to those on the scene it appeared to be the work of many hands.

As the fire raged, witnesses saw rebels carrying torches, transporting combustibles, and cutting the handles of fire buckets. Some offenders allegedly confessed on the spot. But, as often happens with arson, the evidence vanished in the smoke. The British summarily executed some suspects during the fire, others fled, and those taken into custody all denied involvement.

Months elapsed before the British secured their first major confession. They caught a Yankee spy, Abraham Patten, who’d been plotting to torch British-held New Brunswick. On the gallows, Patten confessed, not only to the New Brunswick scheme but also to having been a principal in the conspiracy to burn New York. “I die for liberty,” he declared, “and do it gladly, because my cause is just.”

[Amy Zegart: George Washington was a master of deception]

After Patten’s execution, Washington wrote to John Hancock, the president of the Continental Congress. Patten had “conducted himself with great fidelity to our cause rendering Services,” Washington felt, and his family “well deserves” compensation. But, Washington added, considering the nature of Patten’s work, a “private donation” would be preferable to a “public act of generosity.” He’d made a similar suggestion when proposing burning New York. Washington had clarified that, if Congress agreed to pursue arson, its assent should be kept a “profound secret.”

It’s possible, given Carp’s circumstantial evidence, that New York radicals conspired to incinerate the city without telling the rebel command. Or perhaps Washington knew they would and feigned ignorance. Yet, for Carp, Patten’s confession and Washington’s insistence on paying Patten’s widow under the table amount to “a compelling suggestion that Washington and Congress secretly endorsed the burning of New York.”

Whoever burned the city, the act set the tone for what followed. As the war progressed, the British incinerated towns around New York and in the southern countryside. The rebels, for their part, fought fire with fire—or tried to. In 1778, Commodore John Paul Jones attacked an English port hoping to set it aflame, but he managed to burn only a single ship. Other attempts to send incendiaries to Great Britain were similarly ineffectual. British cities were too fireproof and too far for the revolutionaries to reach with their torches.

Vengeful Yankees had to settle for targets closer at hand: Native towns. In theory they were attacking Britain’s allies, but lines blurred. Pennsylvania militiamen searching for hostile Lenapes in 1782 instead fell on a village of pacifist Christian Indians, slaughtering 96 and burning it to the ground. If against the British the war was fought at least ostensibly by conventional means, against Indigenous people it was “total war,” the historian Colin G. Calloway has written.

That war continued well past the peace treaty signed in Paris—with no American Indians present—on September 3, 1783. Andrew Jackson’s arson-heavy campaigns against Native adversaries helped propel him to the presidency. Burning Indigenous lands was also key to William Henry Harrison’s election, in 1840. He won the White House on the slogan “Tippecanoe and Tyler Too”: Tyler was his running mate; “Tippecanoe” referred to the time in 1811 when Harrison’s troops had attacked an Indigenous confederacy and incinerated its capital.

Native Americans deserved such treatment, settlers insisted, because they always fought mercilessly, whereas white Americans did so only when provoked. Crucial to this understanding was a vision of the Revolution as a decorous affair, with Washington, venerated for his rectitude and restraint, at its head.

The legend of the pristine Revolution, however, is hard to sustain. The rebels lived in a combustible land, and they burned it readily, torching towns and targeting civilians. Like all revolutions, theirs rested on big ideas and bold deeds. But, like all revolutions, it also rested on furtive acts—and a thick bed of ashes.

This article appears in the March 2023 print edition with the headline “Did George Washington Burn New York?”

Florida Has a Right to Destroy Its Universities

The Atlantic

www.theatlantic.com › newsletters › archive › 2023 › 01 › florida-desantis-universities › 672898

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Elections have consequences. Florida’s governor has decided to root out wrong-think at one of Florida’s public colleges, and his harebrained meddling will likely harm the school, but he has every right to do it.

But first, here are three new stories from The Atlantic.

Republicans’ 2024 magical thinking
March 2023 cover story: We’ve lost the plot.
Montana’s Black mayor

Florida’s Soviet Commissars

Florida’s governor, Ron DeSantis, has set out to ruin one of Florida’s public colleges. He’s appointed several board members to the ideologically progressive New College of Florida with, apparently, a mandate to somehow rebuild it and thus save it from its dreaded wokeification. Helpfully for the cause of screwing up a college, most of the new overseers aren’t from Florida and don’t live there; one of them, in fact, is Christopher Rufo, a young man from the Manhattan Institute who has no actual experience in higher education but does have a genuine talent for rhetoric that he seems to have gained at the Soviet Higher Institute of Pedagogy somewhere in Moscow or Leningrad circa 1970.

Bristling at criticism from the Harvard professor Steven Pinker, Rufo fired back on social media. “We’re in charge now,” he tweeted, adding that his goal was “constitutionally-mandated democratic governance, to correct the ideological corruption of *public universities.*”

As they would have said during those old Party meetings: The comrade’s remarks about implementing the just and constitutional demands of the People to improve ideological work in our educational collectives and remove corruption from the ranks of our teaching cadres were met with prolonged, stormy applause.

Rufo is part of a new generation of young right-wing activists who have managed to turn trolling into a career. Good for him, I guess, but these self-imagined champions of a new freedom are every bit as dogmatic as the supposed leftist authoritarians they think they’re opposing. Their demands for ideological purity are part of an ongoing hustle meant to convince ordinary Americans that the many institutions of the United States, from the FBI in Washington down to a college in Sarasota, are somehow all scheming against them.

But Rufo is absolutely right about one thing: If Ron DeSantis wants to put him in charge of a “top-down restructuring” of a Florida college, the governor has every right to do it.

Elections have consequences. If the people of Florida, through their electoral choices, want to wreck one of their own colleges, it is within the state’s legitimate power to do so. In fact, Florida could decide tomorrow to amend its own constitution and abolish state universities entirely. There’s no national right to a college education, and if Florida wants to unleash a battalion of Guy Montags on its own state colleges and their libraries—well, that’s up to the voters.

But something more important is going on here. At this point in any discussion of college education, we are all supposed to acknowledge that colleges have, in fact, become ridiculously liberal. There’s some truth to that charge; I included some stories of campus boobery when I wrote about the role of colleges in America some years back. And only a few weeks ago, I joined the many people blasting Hamline University for going off the rails and violating basic principles of academic freedom while infantilizing and overprotecting students.

Fine, so stipulated: Many colleges do silly things and have silly professors saying silly things.

But the Sovietization of the New College isn’t about any of that. Something has changed on the American right, which is now seized with a hostility toward higher education that is driven by cultural resentment, and not by “critical race theory” or any of the other terms that most Americans don’t even understand. College among conservatives has become a kind of shorthand for identifying with all kinds of populist grievances, a ploy used even by Republicans with Ivy League educations as a means of cozying up to the party’s non-college-educated and resentful base.

GOP attitudes about education have changed fast. As recently as 2015, most Republicans, by a wide margin, thought of universities as a positive influence on the United States. Four years later, those numbers flipped, and nearly 60 percent of Republicans saw universities as having a negative impact on the country.

It doesn’t take a lot of sleuthing to realize that those four years tracked with the rise of Donald Trump and a movement whose populist catechism includes seething anger at “the elites,” a class that no longer means “people with money and power”—after all, Republicans have gobs of both—but rather “those bookish snobs who look down on our True Real-American Values.” The Republican message, aided by the usual hypocrites in the right-wing entertainment ecosystem (such as Tucker Carlson, a prep-school product who told kids to drop out of college but asked Hunter Biden for help getting his own son into Georgetown), is that colleges are grabbing red-blooded American kids and replacing them with Woke Communist Pod People.

This is a completely bizarre line of attack: It posits that a graduate student making a pittance grading exams is more “elite” than a rich restaurant owner. But it works like a charm, in part because how Americans measure their success (and their relative status) has shifted from the simple metric of wealth to less tangible characteristics about education and lifestyle. Our national culture, for both better and worse, has arguably become more of a monoculture, even in rural areas. And many Americans, now living in a hyperconnected world, are more aware of cultural differences and the criticism of others. Those self-defined “real Americans” partake in that same overall national culture, of course, but they nonetheless engage in harsh judgment of their fellow citizens that is at least as venomous as what they imagine is being directed by “the elites” back at them.

Which brings us back to DeSantis—a graduate, he would apparently like you to forget, of Harvard and Yale. DeSantis is now a “populist,” much like Trump (Penn), Ted Cruz (Princeton and Harvard), Josh Hawley (Stanford and Yale), and Elise Stefanik (Harvard and the Ferengi  Diplomatic Academy). He has tasked Rufo (Georgetown and Harvard) to “remake” a school meant for the sons and daughters of Florida’s taxpayers not so that he can offer more opportunity to the people of his state, but so that he can run for president as just one of the regular folks whom reporters flock to interview in diners across the mountains and plains of a great nation.

Look, I live in New England surrounded by excellent public and private institutions, and I candidly admit that I couldn’t care less what kind of damage Florida does to its own schools. If Florida parents really don’t want Ron DeSantis appointing ideological commissars to annoy deans and department chairs, then they should head to the ballot box and fix it. But in the meantime, faux populists, the opportunists and hucksters who infest the modern GOP, are going to undermine education for the people who need it the most: the youngsters who rely on public education. And that’s a tragedy that will extend far beyond whatever becomes of the careers of Ron DeSantis or Christopher Rufo.

Related:

How Ivy League elites turned against democracy
The professors silenced by Ron DeSantis’s anti-critical-race-theory legislation

Today’s News

A sixth Memphis police officer has been suspended from the force during the investigation of Tyre Nichols’s death.
The Manhattan District Attorney’s Office is starting to present evidence to a grand jury in its criminal investigation into Donald Trump. The evidence focuses on Trump’s role in paying hush money to an adult-film star during his 2016 campaign.
The Ukrainian air force warned that it would not be able to defend against Iranian ballistic missiles, should Russia obtain them.

Dispatches

Up for Debate: Conor Friedersdorf collects reader perspectives on how to improve policing.
Famous People: Lizzie and Kaitlyn attend a party with a very specific heart- and belly-warming theme.
The Wonder Reader: Isabel Fattal explores how coffee became capitalism’s favorite drug.

Explore all of our newsletters here.

Evening Read

Quentin Tarantino and Uma Thurman at an HBO Films pre-Golden Globes party at the Chateau Marmont in Los Angeles, California (Jeff Kravitz / FilmMagic / Getty)

The Luxury Dilemma

By Xochitl Gonzalez

Behind vine-covered walls on a modest hill overlooking Sunset Boulevard sits the decidedly immodest Chateau Marmont. The hotel was inspired by a French Gothic castle and, at 93, it is easily the oldest thing in Los Angeles that’s still considered sexy.

As a born-and-raised New Yorker without a driver’s license, I found the hotel the perfect place to park myself for a day of meetings in the era before Ubers and WeWorks and Soho Houses. I used to go there in the 2000s, back when I was a wedding planner. It was like a celebrity safari; stars would walk by, within arm’s reach. You could “do Los Angeles” without ever needing to move. I never could have afforded a room there, but I knew by reputation that at night it offered entertainment of a different sort: luxury and licentiousness and debauchery, unbounded by any rules.

In more recent years, I’ve returned to Los Angeles in a different career—as a screenwriter traveling on someone else’s dime. Naturally, I didn’t want to just take meetings at the Chateau; I wanted to stay there, to be a fly on the wall where the wild things were. Only I couldn’t.

I was told, in early 2021, that the hotel was not taking any new bookings.

Read the full article.

More From The Atlantic

SNL is excelling in one particular way.
Photos: the snow monkeys of Nagano
Dear Therapist: Can I cut my mom off from my children if she won’t seek therapy?

Culture Break

Mia Goth and Alexander Skarsgård sit together in "Infinity Pool" (Neon Films)

Read. “Poem Beginning With a Sentence From My Last Will & Testament,” by Donald Platt.

“Lucy, when I die, / I want you to scatter one-third of my ashes among the sand dunes / of Virginia Beach.”

Watch. Infinity Pool, in theaters, is a gory, existential horror film with a premise deliciously nasty enough to keep you invested—even if it can’t quite keep up with its initial hook.

Play our daily crossword.

P.S.

I usually take this final word in the Daily to direct you toward something fun or interesting, often derived from my admittedly oddball taste in pop culture. Today, I’m going to ask for your indulgence as I offer you something that I wrote yesterday in our Ideas section.

Some years ago, I wrote about the young losers and misfits among us who suddenly explode and commit mass murder. Even before the recent shootings in California (which actually are outliers in the general pattern of attacks by younger men), I’d decided to revisit this question. I wanted to think more about why America—and, yes, other nations as well—has produced so many lost young men who turn to performative and spectacular acts of murder or terrorism. I think the growth of narcissism is one of the answers, but I discuss it all at more length in this article, which I cannot say is pleasant reading but, I hope, offers a path toward more productive discussions about how to prevent such tragedies.

— Tom

Isabel Fattal contributed to this newsletter.

Airplane Toilets Could Catch the Next COVID Variant

The Atlantic

www.theatlantic.com › health › archive › 2023 › 01 › cdc-test-airplane-bathroom-wastewater-covid-tracking › 672893

Airplane bathrooms are not most people’s idea of a good time. They’re barely big enough to turn around in. Their doors stick, like they’re trying to trap you in place. That’s to say nothing of the smell. But to the CDC, those same bathrooms might be a data gold mine.

This month, the agency has been speaking with Concentric, the public-health and biosecurity arm of the biotech company Ginkgo Bioworks, about screening airplane wastewater for COVID-19 at airports around the country. Although plane-wastewater testing had been in the works already (a pilot program at John F. Kennedy International Airport, in New York City, concluded last summer), concerns about a new variant arising in China after the end of its “zero COVID” policies acted as a “catalyst” for the project, Matt McKnight, Ginkgo’s general manager for biosecurity, told me. According to Ginkgo, even airport administrators are getting excited. “There have been a couple of airports who have actually reached out to the CDC to ask to be part of the program,” Laura Bronner, Ginkgo’s vice president of commercial strategies, told me.

Airplane-wastewater testing is poised to revolutionize how we track the coronavirus’s continued mutations around the world, along with other common viruses such as flu and RSV—and public-health threats that scientists don’t even know about yet. Unlike sewer-wide surveillance, which shows us how diseases are spreading among large communities, airplane surveillance is precisely targeted to catch new variants entering the country from abroad. And unlike with PCR testing, passengers don’t have to individually opt in. (The results remain anonymous either way.) McKnight compares the technique to radar: Instead of responding to an attack after it’s unfolded, America can get advance warning about new threats before they cause problems. As we enter an era in which most people don’t center their lives on avoiding COVID-19, our best contribution to public health might be using a toilet at 30,000 feet.

Fundamentally, wastewater testing on airplanes is a smaller-scale version of the surveillance that has been taking place at municipal water networks since early 2020: Researchers perform genetic testing on sewage samples to determine how much coronavirus is present, and which variants are included. But adapting the methodology to planes will require researchers to get creative. For one thing, airplane wastewater has a higher solid-to-liquid ratio. Municipal sewage draws from bathing, cooking, washing clothes, and other activities, whereas airplane sewage is “mainly coming from the toilet,” says Kata Farkas, a microbiologist at Bangor University. For a recent study tracking COVID-19 at U.K. airports, Farkas and her colleagues had to adjust their analytical methods, tweaking the chemicals and lab techniques used to isolate the coronavirus from plane sewage.

Researchers also need to select flights carefully to make sure the data they gather are worth the effort of collecting them. To put it bluntly, not everyone poops on the plane—and if the total number of sampled passengers is very small, the analysis isn’t likely to return much useful data. “The number of conversations we’ve had about how to inconspicuously know how many people on a flight have gone into a lavatory is hysterical,” says Casandra Philipson, who leads the Concentric bioinformatics program. (Concentric later clarified that they do not have plans to actually monitor passengers’ bathroom use.) Researchers ended up settling on an easier metric: Longer flights tend to have more bathroom use and should therefore be the focus of wastewater testing. (Philipson and her colleagues also work with the CDC to test flights from countries where the government is particularly interested in identifying new variants.)

[Read: Are our immune systems stuck in 2020?]

Beyond those technical challenges, scientists face the daunting task of collaborating with airports and airlines—large companies that aren’t used to participating in public-health surveillance. “It is a tricky environment to work in,” says Jordan Schmidt, the director of product applications at LuminUltra, a Canadian biotech company that tests wastewater at Toronto Pearson Airport. Strict security and complex bureaucracies in air travel can make collecting samples from individual planes difficult, he told me. Instead, LuminUltra samples from airport terminals and from trucks that pull sewage out of multiple planes, so the company doesn’t need to get buy-in from airlines.

Airplane surveillance seeks to track new variants, not individual passengers: Researchers are not contact-tracing exactly which person brought a particular virus strain into the country. For that reason, companies such as Concentric aren’t planning to alert passengers that COVID-19 was found on their flight, much as some of us might appreciate that warning. Testing airplane sewage can identify variants from around the world, but it won’t necessarily tell us about new surges in the city where those planes land.

Airplane-wastewater testing offers several advantages for epidemiologists. In general, testing sewage is “dramatically cheaper” and “dramatically less invasive” than nose-swab testing each individual person in a town or on a plane, says Rob Knight, a medical engineering professor at UC San Diego who leads the university’s wastewater-surveillance program. Earlier this month, a landmark report from the National Academies of Sciences, Engineering, and Medicine (which Knight co-authored) highlighted international airports as ideal places to seek out new coronavirus variants and other pathogens. “You’re going to capture people who are traveling from other parts of the world where they might be bringing new variants,” Knight told me. And catching those new variants early is key to updating our vaccines and treatments to ensure that they continue to work well against COVID-19. Collecting more data from people traveling within the country could be useful too, Knight said, since variants can evolve at home as easily as abroad. (XBB.1.5, the latest variant dominating COVID-19 spread in the U.S., is thought to have originated in the American Northeast.) To this end, he told me, the CDC should consider monitoring large train stations or seaports too.

[Read: The COVID data that are actually useful now]

When wastewater testing first took off during the pandemic, the focus was mostly on municipal facilities, because they could provide data for an entire city or county at once. But scientists have since realized that a more specific view of our waste can be helpful, especially in settings that are crucial for informing public-health actions. For example, at NYC Health + Hospitals, the city’s public health-care system, wastewater data help administrators “see 10 to 14 days in advance if there are any upticks” in coronavirus, flu, or mpox, Leopolda Silvera, Health + Hospitals’ global-health deputy, told me. Administrators use the data in decisions about safety measures and where to send resources, Silvera said: If one hospital’s sewage indicates an upcoming spike in COVID-19 cases, additional staff can be added to its emergency department.

Schools are another obvious target for small-scale wastewater testing. In San Diego, Rebecca Fielding-Miller directed a two-year surveillance program for elementary schools. It specifically focused on underserved communities, including refugees and low-income workers who were hesitant to seek out PCR testing. Regular wastewater testing picked up asymptomatic cases with high accuracy, providing school staff and parents with “up to the minute” information about COVID-19 spread in their buildings, Fielding-Miller told me. This school year, however, funding for the program ran out.

Even neighborhood-level surveillance, while not as granular as sampling at a plane, hospital, or school, can provide more useful data than city-wide testing. In Boston, “we really wanted hyperlocal surveillance” to inform placements of the city’s vaccine clinics, testing sites, and other public-health services, says Kathryn Hall, the deputy commissioner at the city’s public-health agency. She and her colleagues identified 11 manhole covers that provide “good coverage” of specific neighborhoods and could be tested without too much disruption to traffic. When a testing site lights up with high COVID-19 numbers, Hall’s colleagues reach out to community organizations such as health centers and senior-living facilities. “We make sure they have access to boosters, they have access to PPE, they understand what’s going on,” Hall told me. In the nearby city of Revere, a similar program run by the company CIC Health showed an uptick in RSV in neighborhood wastewater before the virus started making headlines. CIC shared the news with day-care centers and helped them respond to the surge with educational information and PPE.

[Read: Whatever happened to toilet plumes?]

According to wastewater experts, hyperlocal programs can’t usher in a future of disease omnipotence all by themselves. Colleen Naughton, an environmental-engineering professor at UC Merced who runs the COVIDPoops19 dashboard, told me she would like to see communities with no wastewater surveillance get resources to set it up before more funding goes into testing individual buildings or manhole covers. The recent National Academies report presents a future of wastewater surveillance that includes both broad monitoring across the country and testing targeted to places where new health threats might emerge or where certain communities need local information to stay safe.

This future will require sustained federal funding beyond the current COVID-19 emergency, which is set to expire if the Biden administration does not renew it in April. The United States needs “better and more technology, with a funding model that supports its development,” in order for wastewater’s true potential to be realized, Knight said. Airplane toilets may very well be the best first step toward that comprehensive sewage-surveillance future.