The Death of American Exceptionalism

The Atlantic

www.theatlantic.com/ideas/archive/2024/10/youth-democracy-united-states-unique/680344

The prevalence of positive illusions is one of the most well-established findings in psychology. Most people have an exaggerated view of their own abilities and expect that more good things—and fewer bad things—will happen to them than is likely.

Despite being unrealistic, such beliefs have benefits: Overly positive people are happier, cope better with adversity, and think they have more control over their life. Believing that things are a little better than they actually are may be necessary for robust mental health.

In a similar way, many citizens hold overly positive, but possibly necessary, beliefs about their country. A sense of national pride can foster community and bring people together, and it’s often a sign of a thriving democracy. In the United States, one source of patriotism is American exceptionalism—the idea that the U.S. is a unique, and uniquely superior, nation. With its origin as a democracy in a world of kingdoms and its emphasis on freedom and opportunity, this narrative goes, the American system is out of the ordinary.

Among the young, that belief is rapidly dying. Since 1976, a large nationally representative survey has asked U.S. high-school seniors, 17 and 18 years old, whether they agree that “Despite its many faults, our system of doing things is still the best in the world”: a fairly succinct summary of American exceptionalism. In the early 1980s, 67 percent of high-school seniors agreed that the U.S. system was the best. By 2022, only 27 percent did. Thus, only one out of four American teens now agrees that their country is exceptional.

[Read: 20-somethings are in trouble]

The decline appears to be mostly untethered to national events. Belief in American exceptionalism went down during the Great Recession of the late 2000s, and also during the economically prosperous years of the 2010s. It declined when the U.S. was at war and also when it was at peace. It declined as income inequality grew rapidly, from 1980 to 2000, and also as inequality moderated after 2000.

The idea is now particularly unpopular among liberal teens. As recently as the late 1990s, a majority of liberal teens agreed that the U.S. system was the best. By 2021–22, that share had shrunk to 14 percent—only one out of seven. (Belief in American exceptionalism has declined among conservative teens as well, but much less so: 47 percent of conservative teens believed in the idea in 2021–22.)

Even the belief that the founding of the United States was a positive development seems to be on the way out: A recent poll conducted by the Democracy Fund asked Americans if the Founders are “better described as villains” or “as heroes.” Four out of 10 Gen Zers chose “villains,” compared with only one in 10 Boomers. If your country’s Founders are the bad guys instead of the good guys, it becomes much harder to believe that its system is the best in the world—or even worth defending. (Ideas about America are hardly the only beliefs that have bent toward pessimism among American youth in the past two decades. In early 2002, for instance, 23 percent of high-school seniors agreed with the statement “When I think about all the terrible things that have been happening, it is hard for me to hold out much hope for the world.” In early 2019, 40 percent agreed.)

Dour views of the nation’s status and possibilities may shape its future. Gen Z may be disillusioned, but it is not, by and large, nihilistic: Today’s young adults are also more interested in taking action than previous generations. From 2014 to 2021–22, an increasing number of high-school seniors agreed that protesting and voting could have “a major impact on how things are run in this country.” Voter turnout among young adults has been higher among Gen Z than previous generations at the same age, and political protests appear to have become more frequent in the eight or so years since Gen Z arrived on college campuses.

That, of course, could yield positive changes. One of the most important American ideals, arguably, is that the American project is unfinished, and that society can be made better, generation by generation. Throughout U.S. history, discontent and even righteous anger have often been important correctives to overly broad or unthinking sentiments about the country’s goodness, which, when unchallenged, can perpetuate injustices.

But many of Gen Z’s members seem convinced that radical change is necessary—to the model of government, to the economy, to the culture. In a 2020 poll I analyzed for my book Generations, three out of four American Gen Zers—more than any other generation—agreed that “significant changes” were needed to the government’s “fundamental design and structure.” Nearly two-thirds believed that America was not “a fair society,” again a higher rate than older adults. In a 2018 Gallup poll, more 18-to-29-year-olds had a positive view of socialism (51 percent) than of capitalism (45 percent). Some of the ideals, and idealism, that were commonly accepted in previous generations seem to have a looser hold over young adults today.

Why has Gen Z turned so definitively toward disillusionment and away from seeing their country as superior?

One reason may be their mental health: Twice as many teens and young adults are depressed as in the early 2010s. This is a tragedy—and it’s likely to have wide-reaching effects. Depression isn’t just about emotions; it’s also about cognition. By definition, depressed people see the world in a more negative light. They are less likely to see the positive, including in their country. Increases in depression are larger among liberals, consistent with the larger decline in their belief in American exceptionalism.

Changes in news consumption may also play a part. When newspapers were read on paper, all of the news—positive and negative—was printed together. Now negative news is king. Negative articles are almost twice as likely to be shared on social media as positive articles. Social-media algorithms push angry and divisive content. With Gen Z getting most of its information online, it is viewing the country through a negatively skewed funhouse mirror.

A third reason may lie in the shifts in high-school American-history curricula. Some—typically liberal—states now spend more time than they once did on the more deplorable facts of the nation’s history, such as the internment of Japanese Americans during World War II, the massacre of Native Americans, and the Founders’ ownership of slaves. That coverage lays out facts students need to know, but, especially if these events are emphasized more than the country’s more noble endeavors, it may also undermine feelings of national pride.

[Read: Are Gen Z men and women really drifting apart?]

Finally, Gen Z’s facility with social media may itself be coloring the generation’s views. Gen Z has learned that making a problem look as big and awful as possible is a highly effective way of getting traction on social media. Many problems are often portrayed as profound and systemic, fixable only by fundamental rethinks and institutional purges. It makes everything seem worse than it is.

My worry, as a social psychologist who has studied all of the living American generations, is that these various forces—and the pessimism they have generated—could move Gen Z to change systems that are not necessarily broken. That’s especially relevant as this generation comes of age and rises toward political power. Despite the common perception that the system is “rigged” and young people will never attain the wealth Boomers did, for instance, the Federal Reserve Bank of St. Louis recently found that Millennial and Gen Z young adults actually have 25 percent more wealth than Boomers did at the same age. Inflation-adjusted median incomes for American young adults are at all-time highs, and poverty rates for children and younger adults are lower than they were in the early 2000s. The social-media-driven negativity machine may have prevented Gen Z—and all of us—from seeing the good news.

Just as the positive views we have about our individual selves may be exaggerated, the idea that the United States is uniquely superior is also, at least in part, an overly optimistic illusion we tell ourselves as a country. But like our positive self-illusions, patriotism also has its benefits, including a more satisfied citizenry and more political stability. With Gen Z unconvinced of the country’s exceptionalism and willing to take action, the U.S. may, in the coming decades, witness an era of extraordinary political change.

The End of Parallel Parking

The Atlantic

www.theatlantic.com/technology/archive/2024/10/end-of-parallel-parking-robotaxi/680276

For decades, my dad has been saying that he doesn’t want to hear a word about self-driving cars until they exist fully and completely. Until he can go to sleep behind the wheel (if there is a wheel) in his driveway in western New York State and wake up on vacation in Florida (or wherever), what is the point?

Driverless cars have long supposedly been right around the corner. Elon Musk once said that fully self-driving cars would be ready by 2019. Ford planned to do it by 2021. The self-driving car is simultaneously a pipe dream and sort of, kind of the reality of many Americans. Waymo, a robotaxi company owned by Alphabet, is now providing 100,000 rides a week across a handful of U.S. cities. Just last week, Tesla announced its own robotaxi, the Cybercab, in dramatic fashion. Still, the fact remains: If you are in the driver’s seat of a car and out on the road nearly anywhere in America, you are responsible for the car, and you have to pay attention. My dad’s self-driving fantasy likely remains far away.

But driving is already changing. Normal cars—cars that are not considered fancy or experimental and strange—now come with advanced autonomous features. Some can park themselves. You can ask your electric Hummer to “crab walk” into or out of a tight corner that you can’t navigate yourself. It seems that if you are on a bad date and happen to be sitting on a restaurant patio not too far from where you parked your Hyundai Tucson SEL, you can press a button to make it pull up beside you on the street, getaway-car style. It’s still hard to imagine a time when no one needs to drive themselves anywhere, but that’s not the case with parallel parking. We might be a generation away from new drivers who never learn to parallel park at all.

It makes sense that the task would be innovated away. Parallel parking is a source of anxiety and humiliation: David Letterman once pranked a bunch of teenagers by asking them to try to parallel park in Midtown Manhattan, which went just as hilariously poorly as you might expect. Parallel parking isn’t as dangerous as, say, merging onto the highway or navigating a roundabout, but it’s a big source of fear for drivers—hence a Volkswagen ad campaign in which the company made posters for a fake horror movie called The Parallel Park. And then it’s a source of pride. Perfectly executed parking jobs are worthy of photographs and public bragging. My first parallel park in Brooklyn on the day I moved there at 21 was flawless. I didn’t know about alternate-side parking, so I ultimately was ticketed and compelled to pay $45 for the memory, but it was worth it.

Whether or not you live in a place where you have to parallel park often, you should know how to do it. At some point, you will at least need to be able to handle a car and its angles and blind spots and existence in physical space well enough to do something like it. But this is an “eat your vegetables” thing to say. So, I thought, the best people to look at in order to guess how long we have until parallel parking is an extinct art might be the people who don’t already have a driver’s license. According to some reports, Gen Z doesn’t want to learn how to drive—“I’ll call an Uber or 911,” one young woman told The Washington Post. Those who do want to learn have to do so in a weird transitional moment in which we are still pretending that parallel parking is something a human must do, even though it isn’t, a lot of the time.

I talked with some longtime driving-school instructors who spoke about self-parking features the way that high-school English teachers talk about ChatGPT. The kids are relying on them to their detriment, and it’s hard to get them to form good habits, said Brian Posada, an instructor at the Chicago-based Entourage Driving School (not named after the HBO show, he said). “I’ve got some students who are really rich,” he told me. As soon as they get their permits, their parents buy them Teslas or other fancy cars that can self-park. Even if he teaches them how to parallel park properly, they will not practice in their own time. “They get lazy,” he told me.

Parallel parking isn’t part of the driver’s-license exam in California, though Mike Thomas still teaches it at his AllGood Driving School. His existential dread is that he will one day be less like an educator and more like the person who teaches you how to use your iPhone. He tells teens not to rely on the newfangled tools or else they will not really know how to drive, but he doesn’t know whether they actually buy in. “It’s hard to get into the minds of teenagers,” he said. “You’d be amazed at how good teenagers are at telling people what they want to hear.” Both instructors told me, more or less, that although they can teach any teen to parallel park, they have little faith that these new drivers will keep up the skill or that they will try on their own.

Teens are betting, maybe correctly, that they soon may never have to parallel park at all. Already, if you live in Austin or San Francisco and want to avoid parallel parking downtown, you can order an Uber and be picked up by a driverless Waymo. But autonomous parking is much simpler to pull off than fully autonomous driving. When I pushed Greg Stevens, the former chief engineer of driver-assistance features at Ford, to give me an estimate of when nobody will have to drive themselves anywhere anymore, he would not say 2035 or 2050 or anything else. He said he would not guess.

“The horizon keeps receding,” he told me. Stevens now leads research at the University of Michigan’s Mcity, a huge testing facility for autonomous and semiautonomous vehicles. Most driving, he said—99.9 percent—is “really boring and repetitive and easy to automate.” But in the final 0.1 percent, there are edge cases: “things that happen that are very rare, but when they happen they’re very significant.” That’s a teen whipping an egg at your windshield, a mattress falling off the back of a truck, a weird patch of gravel, or whatever else. “Those are hard to encapsulate completely,” he said, “because there’s an infinite number of those types of scenarios that could happen.”

In many ways, people are still resisting the end of driving. One guy in Manhattan is agitating for a constitutional amendment guaranteeing human beings the “right to drive,” if they so choose, in our autonomous-vehicle future. It can be hard to predict whether people will want to use new features, Stevens told me: Some cars can now change lanes for you, if you let them, which people are scared to do. Most can try to keep you in your lane, but some people hate this a lot. And for now, self-driving cars are just not that much more pleasant to use than regular cars. On the highway, the car tracks your gaze and head position to make sure your eyes stay on the road the entire time—arguably more depressing and mind-numbing than regular highway driving.

Many people don’t want a self-parking car, which is why Ford has recently paused plans to put the feature in all new vehicles. I hate driving because it’s dangerous, but I am good at parallel parking, and I’m not ready to see it go. It’s the only aspect of operating a vehicle for which I have any talent. I don’t want to ease into a tight spot without the thrill of feeling competent. Parallel parking is arguably the hardest part of driving, but succeeding at it is the most gratifying.

If parallel parking persists for the simple reason that Americans don’t want to give it up, fully self-driving cars may have little hope. A country in which nobody has to change lanes on a six-lane highway or park on their own is a better country, objectively. I also spoke with Nicholas Giudice, a spatial-computing professor at the University of Maine who is working on autonomous vehicles with respect to “driving-limited populations” such as people with visual impairments or older adults. Giudice is legally blind and can’t currently drive a car. He said he would get in the first totally self-driving car anybody offered him: “If you tell me there’s one outside of my lab, I’ll hop into it now.”

Conventional parallel parking—sweating, straining, tapping the bumper of the car in front of you, finally getting the angle right on the 40th try—won’t have to disappear, but it could become part of a subculture one day, Giudice said. There will be driving clubs or special recreational driving tracks. Maybe there will be certain lanes on the highway where it would be allowed, at least for a while. “You can’t have 95 percent autonomous vehicles and a couple of yahoos driving around manually,” he said. “It will just be too dangerous.”

Am I a yahoo for still wanting to parallel park? I can mollify myself with a fantasy of parallel parking as not a chore but a fun little game to play in a closed environment. I can picture it next to the mini-golf and the batting cages at one of those multipurpose “family fun” centers. There’s one near my parents’ house where you can already ride a fake motorcycle and shoot a fake gun. My dad could drive me there with his feet up and a ball game on.

We’re Still Living in a Fight Club World

The Atlantic

www.theatlantic.com/culture/archive/2024/10/fight-club-25th-anniversary/680231

Fight Club, David Fincher’s arch 1999 study of disaffected men, presents male rage as a subculture. The layered neo-noir film, adapted from Chuck Palahniuk’s novel of the same name, offers angry young men rituals, language, and an origin story for their fury. “We’re a generation of men raised by women,” pronounces Tyler Durden, a peacock of a character played by Brad Pitt. The line is true of all generations, but Tyler, a soap salesman who becomes the spiritual leader of these aggrieved dudes, delivers it as a revelation.

Though the film addresses the woes of Gen X, in the 25 years since it was released to polarized reviews and low ticket sales, Fight Club has burrowed deeply into American culture. Its dialects of secrecy (“The first rule of Fight Club is you do not talk about Fight Club”) and insult (“You are not a beautiful or unique snowflake”) have seeped into casual conversation and politics. Pitt’s sculpted physique remains a fitness ideal, and his virile performance is worshiped by pickup artists and incels. And of course, actual fight clubs have sprung up, stateside and across the world.

Fight Club’s insights about the consequences of men rallying around resentment remain apt today, a period in which Donald Trump’s grievance politics and the growing swamp of the manosphere are shaping American masculinity. Amid its frenzied storytelling, the film offers a cogent theory of modern masculinity: Men suck at communicating. We see this idea most clearly in the constant evasions of the unnamed Narrator, an insomniac office worker played by a haggard and numbed Edward Norton; a cipher throughout the film, he eagerly adopts Tyler’s macho swagger to avoid facing his insecurities. The famous twist, that he and Tyler are one and the same—and that Pitt’s character is a mirage—is the culmination of his deception. The Narrator is so unused to expressing himself that he doesn’t even recognize his own desires and fantasies. He has to sell himself his own anger.

The film quickly establishes the Narrator’s emotional reticence. Prone to digression and omission, the Narrator is elusive despite his constant chattering. His wry descriptions of his IKEA furniture, business travel, and chronic sleep deprivation establish the detached mood of the film, which presents late-20th-century America as an immersive infomercial. His irony-tinged voiceover, which Fincher pairs with images inspired by commercials and music videos, is more performance than disclosure. The capitalist fog of the Narrator’s life is so thick that he struggles to tell his own story, channel surfing through his memories.

In the beginning, the Narrator briefly escapes his insomnia by attending gatherings of people with terminal and debilitating illnesses. Always bearing a name tag with an alias, an early indicator of his evasive nature, he keeps mum as he sits among people with testicular cancer, sickle-cell anemia, and brain parasites. His silence makes them assume he’s at death’s door and shower him with affection—which helps him get the best sleep of his life. This holds him over until he realizes Marla, a fellow attendee played by a quirky and gothic Helena Bonham Carter, is also a phony leeching off the unwell, a discovery that breaks his morbid simulation of intimacy. He confronts her and learns she, too, is lonely and depressed, but decides to push her away rather than bond over their mutual ennui. When they exchange numbers to divide up the meetings so they never see each other, the Narrator tellingly does not share his name. He fears vulnerability.

[Read: TV’s best new show is a study of masculinity in crisis]

The Narrator seems to open up when he befriends Tyler, whom Pitt plays as a dotty sage. They first meet on a flight and reconnect after the Narrator loses his painstakingly furnished condo and a cherished wardrobe of DKNY and Calvin Klein duds to a freak explosion. Tyler’s garish outfits and lucid maxims (“The things you own end up owning you”; “self-improvement is masturbation”) cut through the dreary consumerist haze of the Narrator’s life and encourage him to let go, live a little, start over. Key to Tyler’s wisdom is violence, which becomes the pair’s lingua franca after they slug each other outside of a bar. They are so smitten after that first bout that the Narrator moves into Tyler’s decrepit house, trading a bourgeois life for monkish minimalism. That this apparent enlightenment leads to bloody basement fistfights is among the film’s core ironies.

Fight Club, as the two deem it once other men begin to join their weekly bare-chested scraps, is supposed to offer catharsis and connection. It’s meant to free participants from the monotony and humiliation of wage work. But it worsens their marginalization. Instead of learning to express and maybe resolve their anguish, they revel in it, beating each other senseless and flaunting their scars and bruises to the uninitiated; they graduate to juvenile acts of vandalism and, ultimately, terrorism. To its detriment, Fight Club is a fraternity of silence: With its rigid rules and subterranean locations, it constricts its members’ ability to express themselves.

The Narrator’s realization that he and Tyler are the same person—even though the connection is right in front of him—changes how he sees Fight Club. There are hints throughout the film, from moments in which Tyler blips into a scene before he’s introduced, to winking lines of dialogue like, “The liberator who destroyed my property has realigned my perceptions,” which Tyler, the culprit, tells the Narrator to say to a detective investigating the condo explosion.

The biggest tells are Tyler’s insistence that the Narrator never let Marla know about Fight Club, and the fact that she and Tyler never appear in the same room. She seems to threaten Tyler’s flashy machismo. She’s not closed off, like the men of the story. She says outright when she’s flustered, or happy, or aroused—an openness that’s anathema to the stoic Fight Club code. When the Narrator “kills” Tyler by shooting himself in the mouth, the target is very intentional. “You met me at a very strange time in my life,” he tells Marla when she sees the wound. He’s smiling, though, happy to, finally, be speaking for himself.

Despite the Narrator’s tragic arc, the allure of Fight Club for many of its male viewers has always seemed more rooted in its gauzy depiction of bros letting loose than the pitfalls of emotional repression. When the film first came out, both positive and negative reviews focused on its violence: One critic described the film as “dangerous” because of the “extremely seductive” Fight Club scenes; another called it “nasty, impossible to turn away from.” In a pan, Roger Ebert called it “the most frankly and cheerfully fascist big-star movie since Death Wish … macho porn.”

That reception is inseparable from major events of 1999. The film came out months after the Columbine school shootings and the disastrous Woodstock ’99 festival, two high-profile instances of male violence. That year, entertainment became a scapegoat for America’s “culture of violence,” as then-President Bill Clinton frequently described it. The other—and often unstated—reason that the film seems to have made some critics tug their collars is that the Fight Club participants are mostly white. Their open bloodlust, shaved heads, and clandestine rituals evoke many strains of white supremacy, from neo-Nazis to skinheads to frat houses to citizens’ councils. The film certainly plays with fire.

That laddish appeal is misdirection, though. Fincher makes clear that this loser subculture is self-destructive and uncool. The bouts are brutish and styleless. The movie doesn’t offer the feats of wonder of sports or martial-arts films, where characters use techniques and disciplines to unlock their potential. Nor does it offer the adrenaline rush of action cinema. The story spends more time in Tyler’s house than in the ring—a domesticity suggested by the Narrator when he winkingly notes that outside of Fight Club, “We were Ozzie and Harriet.” That cohabitation heightens the irony of men never learning to speak up or adopt a language other than violence.

Most of the film’s odes to brotherhood and spiritual awakening are mocking in this way. One of Tyler’s best (and least quoted) lines from Fight Club lays out his dopey masculine idyll:

In the world I see, you’re stalking elk through the damp canyon forest around the ruins of Rockefeller Center. You’ll wear leather clothes that will last you the rest of your life. You’ll climb the wrist-thick kudzu vines that wrap the Sears Tower. And when you look down, you’ll see tiny figures pounding corn, laying strips of venison in the empty carpool lane of some abandoned superhighway.

Ah yes, corn, hunting, leather, ruins, skyscrapers—now, that’s manhood!

Why do so many men embrace these tired scripts and props? One of the strengths of Fight Club is that it rejects the idea that men are pathologically distressed and inclined toward violence. Although the Narrator does technically commit self-harm throughout the film, he’s never diagnosed with anything other than insomnia. As real as his alienation is, the implication is that he chooses to withdraw into himself and push away the people who might care for him. Fight Club is his man cave.

[Read: The changing sound of male rage in rock music]

Underscoring his willed isolation is the fact that Fight Club intentionally seems to take place nowhere. Though it was clearly filmed in Los Angeles, the addresses shown on documents in the movie are obviously bogus, listing a six-digit zip code or “Bradford, UN” as their city and state. The name of the local police department is simply “Police Department”; likewise, a regional bus line is just titled “Direct Bus.” This ambient obscurity suggests masculinity is less a rulebook and more a state of mind, a mood, a feeling. The Narrator finds a more benign form of connection by the end, clasping Marla’s hand in the final scene. But his wayward journey to that moment is hilarious and telling. Unlike Tyler, Marla was there the whole time.

Fight Club is at heart a dry roast of masculinity, a burlesque of the models and habits with which men define and often destroy themselves to avoid emoting or being vulnerable. The film, like The Matrix, another 1999 bugbear, might be forever doomed to be misread, but it still resonates. The movie understands both the appeal of male angst and the hollowness of building a life around it. There’s a whole spectrum of other emotions, a wide range of activities beyond trading blows, and far more versions of manhood than “alpha,” “beta,” or “sigma.” Feeling distressed? It’s okay, dude; we can talk about it.

Why Kamala Harris Went on Call Her Daddy

The Atlantic

www.theatlantic.com/politics/archive/2024/10/kamala-harris-call-her-daddy-podcast/680181

Very few podcasters would apologize to their fans for clogging up their feeds by interviewing a presidential candidate. But Alex Cooper—the host of a podcast variously described as “raunchy,” “sex-positive,” “mega-popular,” and “the most-listened-to podcast by women”—is an exception. “Daddy Gang,” she began her latest episode, “as you know, I do not usually discuss politics, or have politicians on this show, because I want Call Her Daddy to be a place where everyone feels comfortable tuning in.”

Her guest was Kamala Harris, and Cooper had decided to speak with the Democratic nominee because “overall, my focus is women and the day-to-day issues that we face.” Their 40-minute conversation covered Harris’s upbringing, the rollback of abortion rights, the high cost of housing, and Republican attacks on “childless cat ladies.” This wasn’t a hard-hitting accountability interview, but it did contain a substantive policy discussion—not that you would guess from some of the more overheated right-wing attacks, which seemed to think the pair were braiding each other’s hair. After a summer of largely avoiding interviews with mainstream news outlets, the Harris campaign—like Donald Trump’s—is seeking out friendly podcasters that are popular with normie audiences. As a journalist, I wish both campaigns were doing more tough interviews. But as a pragmatist, I realize that hard-news shows do not command the audiences they once did. Also, most Americans who consume a lot of news already know how they’re going to vote. Nailing down undecided voters—including those who don’t currently plan to cast a ballot—is vital. And if that means going on podcasts hosted by YouTube pranksters turned wrestlers (as Trump did) or ones with past episodes like “Threesomes, Toxic Men and OnlyFans” (as Harris did), so be it.

If you haven’t heard of Call Her Daddy, please accept my condolences for being old, or male, or otherwise uncool. (I was in the first group until I binge-listened in preparation for the Harris interview.) The show had the second-biggest audience among podcasts on Spotify last year, after The Joe Rogan Experience. Recent guests include Miley Cyrus, Avril Lavigne, Katy Perry, and Simone Biles. Young women love “Father Cooper” and listen to what she says.

[Read: Kamala Harris’s biggest advantage]

That Cooper chose to begin with an apology is interesting—not least because it suggests that Team Harris courted her, rather than the other way around. In February, Cooper told The New York Times that she had resisted overtures from the White House to have Joe Biden as a guest. “Go on CNN, go on Fox,” she said. “You want to talk about your sex life, Joe?”

Although Harris didn’t talk about hers, she did talk about tampons, agreeing with Cooper that many of the male politicians who make abortion laws seem to have only the sketchiest understanding of female biology. In fact, this campaign has featured 100 percent more tampons than I expected, because the online right has been trying to make the nickname “Tampon Tim” happen for Harris’s running mate, Tim Walz. (As governor of Minnesota, he signed a law that would provide free menstrual products in both boys’ and girls’ school bathrooms.)

Harris also spoke about how she was the first vice president to visit a reproductive-health clinic, allowing her to argue that Republican abortion restrictions, by forcing those clinics to shut down, also limit women’s access to Pap smears, contraception, and breast-cancer screenings. She discussed the death of Amber Thurman, who developed blood poisoning after having to leave Georgia to seek an abortion shortly after a state law tightly restricting the procedure took effect. Republican proponents of that law had claimed that terminations could be permitted to save the life of the mother, Harris said, anger creeping into her voice: “You know what that means, in practical terms? She’s almost dead before you decide to give her care.” Whoever coached Harris out of being the word-salad-monger of the 2019 Democratic primary, or the snippy flubber of her disastrous 2021 interview with Lester Holt, deserves a raise.

The people criticizing Harris’s Call Her Daddy appearance have claimed that it was demeaning and unserious—or, at best, pointless. Young women are deemed to be in the tank for the Democrats already—the gender gap in this election is real. But Cooper reaches an audience that does not follow politics closely, and her own background is more small-c conservative than you might imagine from the podcast’s empowered-raunch vibe. She was raised Catholic, in Pennsylvania, and her story follows a familiar pattern for Gen Z and Millennials: After spending her 20s keeping “dick appointments,” as she has put it, she met a film producer who later proposed by turning their house into a scavenger hunt full of moments from their relationship, and the couple had a big white wedding in Mexico.

Call Her Daddy, which began as part of the notoriously fratty Barstool Sports network, has mellowed along with Cooper. Its listeners are neither anarchist feminists nor aspiring tradwives, but the great middle of American Gen Z straight(ish) women, who think sex before marriage is fun but also dream of settling down with Mr. Right. This group definitely leans Democrat, but Cooper’s Barstool connection means there will be conservatives listening too, as well as many women who might not vote at all. The Republicans are struggling with this group of voters, seeing them as more radical than they really are, while some evangelical leaders even hope the abortion bans will be a disincentive to premarital sex. But most young women intuitively understand that their sexual and economic freedoms are linked: They make their own money, so they can date who they want.

Cooper’s apology also intrigued me because she followed it up with some self-deprecating pablum about her unfitness to ask questions about fracking and border control. Trump has just completed his own podcast tour, talking with influencers, such as Logan Paul, Lex Fridman, and Theo Von, who are popular with young men. Let me shock you: These guys did not seem worried about their knowledge of the Middle East or the finer points of drug policy. But women are not supposed to get above themselves, even though the entire interview-podcast circuit runs on feigned expertise and overly confident opinions. Cooper’s self-deprecation is a reminder of why Harris has tried to downplay the historic possibility of being the first female president—because she knows that many voters still find female ambition unsettling.

[Read: What the Kamala Harris doubters don’t understand]

Still, this interview is the most barbed I’ve seen Harris allow herself to be on the topic of her own ambition. Cooper asked her about Arkansas Governor Sarah Huckabee Sanders’s comments that “Kamala Harris doesn’t have anything keeping her humble,” because she doesn’t have biological children. How did that make the vice president feel? “I don’t think [Sanders] understands that there are a whole lot of women out here who, one, are not aspiring to be humble,” Harris replied. Also, she went on, “a whole lot of women out here … have a lot of love in their life, family in their life, and children in their life, and I think it’s really important for women to lift each other up.” Pressed on J. D. Vance’s claim that the Democrats were dominated by “childless cat ladies,” Harris said: “I just think it’s mean.”

Trump’s continued electoral success has inspired many pundits to claim that there are no longer any standards of decency in American public life—and that politicians can therefore say what they like. In reality, parts of Harris’s story are likely to resonate with voters. Harris’s stepchildren came up in the interview Harris did last week on All the Smoke, a sports podcast hosted by two former NBA players. “I love those children—they are my children,” Harris said of her husband’s kids, adding that she had worked hard not to undermine their mother. One of the hosts, Matt Barnes, sympathetically noted that he is a stepfather to three children. At a time when the GOP really wants to be talking about the economy and the border, the attack line about Harris’s family life is what’s coming through on podcasts for Gen Z women and (predominantly male) sports fans.

My hunch is that lots of parents do secretly think it’s weird not to want kids, but they also know people who have been devastated by infertility, and so find it graceless to imply that nonparents are hollow droids. And in any case, Harris has a ready answer to the implicit charge of being a heartless shrew—on Call Her Daddy, she once again talked about how her stepkids call her “Momala.”

As the campaign enters its last month, Harris is taking on more interviews and public appearances. This week, she has a Univision town hall, and will be on The Late Show With Stephen Colbert, The View, Howard Stern’s radio program, and 60 Minutes. In other words, after targeting Gen Z women, she’s turning to the other key parts of the Democratic base: Hispanic voters, coastal liberals, suburban women, sexually liberated Boomers, and people who care about foreign policy. It’s a smart tactic—and the mirror image of the campaign choices that Trump made months ago.

What’s the Appeal of Indie Rock’s New Golden Boy?

The Atlantic

www.theatlantic.com › culture › archive › 2024 › 10 › mj-lenderman-indie-rock › 680107

The great musical mystery of the year, for me at least, has been all the hype around a 25-year-old singer-songwriter named MJ Lenderman. He is “often described—accurately—as the next great hope for indie rock,” The New Yorker’s Amanda Petrusich wrote recently. I like Lenderman, but his pleasant, country-inflected new album, Manning Fireworks, certainly doesn’t scream next anything. It almost could have been released in 1975, or 1994, or 2003.

Petrusich’s article made something click for me, though. She defined indie rock as “however one might now refer to scrappy, dissonant, guitar-based music that’s unconcerned, both sonically and spiritually, with whatever is steering the Zeitgeist.” She then said Manning Fireworks “could have been released in 1975, or 1994, or 2003” … but in a good way.

Such is the manner in which Lenderman has generally been praised: as a restorer, a throwback, a reassuring archetype. The North Carolina native plays guitar and sings backup in the genre-bending band Wednesday, but his solo music—laid-back, witty, tuneful while noisy—seems designed to trigger déjà vu. He fits in a clear lineage stretching back through mysterious slackers such as Mac DeMarco, Pavement, and R.E.M. to the Boomer goddaddies of wry disaffection: Neil Young, Bob Dylan, the Velvet Underground. In a glowing review of Lenderman’s new album, the critic Steven Hyden wrote, “As a young, curly-haired brunet dude, he made exactly the kind of music you would expect from a young, curly-haired brunet dude.” Patterson Hood of Drive-By Truckers told Rolling Stone, “He checks all the boxes for me.”

This is going to sound earnest in that intolerably Millennial way, but: Isn’t box-checking not very indie? When I first dabbled in hipsterdom, in the early 2000s, Wilco was defacing folk guitars with electronic chaos, Animal Collective was inventing barbershop psychedelia, and Joanna Newsom was writing supernatural symphonies on her harp. What united these artists wasn’t commercial independence—some were on major labels—but rather their belief that authenticity arose from experimentation. Whereas normie genres such as country and mainstream rock seemed to be chasing faded glory, acclaimed indie acts honored their influences by pushing their ideas further: Think of Sonic Youth intensifying John Cale’s drones to screeching frequencies, Modest Mouse’s yelpy profundity emerging from the Pixies’ yelpy absurdity, and so on.

For more than a decade now, though, that sense of forward movement has been harder to detect—because it’s not been quite as rewarded as it once was. When Spotify came to America in 2011, it decimated the profitability of recordings and overwhelmed the public with choice. It also empowered listeners in ways that eroded the importance of music critics, record stores, and real-life scenes. Tidy narratives of progress—always somewhat fictive, useful to journalists and publicists more than to consumers and artists—started to degrade. Prestige, based on a few pundits’ idea of boundary-pushing genius, stopped paying the bills like it once did (because people stopped shelling out for buzzy music without hearing it first). Die-hard fandom became crucial (the trendy phrase for this is parasocial relationship). This confluence of factors influenced indie rock much as it influenced the mainstream: by making identity more important.

The most discussed indie-rockers of the past decade were thus singer-songwriters with strong points of view, such as Mitski, Waxahatchee, Soccer Mommy, and Bartees Strange. The breakout bands tended to be glorified solo projects (Japanese Breakfast, Tame Impala, the War on Drugs) or, in the case of Haim, a sisterly trio ripe for stanning. As the media caught up to the internet’s amplification of long-marginalized voices, issues of race, gender, and sexuality became more explicit in the critical conversation. All of these new stars were serious talents, and all of them did, in various small ways, innovate; the layered and whispery vocal style of Phoebe Bridgers, for example, has proved influential. But in general, the progression of indie in the streaming era can be tracked less through sound than through the question of who’s singing and what they’re singing about.

Of course, indie rock—like any musical tradition—has always been rooted in questions of identity. It’s just that in the past, the default identity tended to be a white guy who’s only comfortable revealing himself through cryptic poetry, buried under aural distortion. Stephen Malkmus and Jeff Tweedy absolutely wrote about their own maleness, but most listeners and critics didn’t focus on that. Now, when identity has moved from cultural subtext to text—and indie rock has come to seem more like a settled language of self-expression than an unruly journey into the unknown—the next big thing seems oddly familiar: a man, in a once-male-dominated genre, singing about being a man.

The cover of Lenderman’s 2021 album, Ghost of Your Guitar Solo, features a photo of a naked guy holding a cat, framed by happy-faced stars and moons. It was a fitting statement of winsome, self-exposing masculinity—of a bro who knows he’s babygirl.

Stylistically, the cover also conveyed his musical approach: concise, funny, building layers of meaning through simple juxtapositions. Much of Lenderman’s early work made him out to be a lo-fi magpie, pairing wonky riffs with understated punch lines delivered in a flat, vaguely fearful drawl. On Guitar Solo’s “I Ate Too Much at the Fair,” Lenderman encapsulated an entire relationship—who cares for whom, who spends and who saves—in one couplet: “I ate too much at the fair / Despite what you said.” Gobs of reverb, with sweetness at the edges, conveyed his lovelorn bloat.

That album and his breakthrough follow-up, 2022’s Boat Songs, felt rooted in what you might call the “woke first person,” situating individual desires with an anxious nod to the society around him. In one song, he fantasized about becoming a Catholic priest so he wouldn’t have to worry about girls anymore. Another, the rollicking “Hangover Game,” used an anecdote about Michael Jordan to probe his own drinking habit. I always laugh at “Inappropriate,” whose noodling organ sounds like the Doors being recorded from the other side of a wall:

Accidentally saw your mother
Sleepin’
She looked so peaceful and disgusting

It felt inappropriate
To catch her like that
I never want to see her sleep again

Manning Fireworks, his new album, shifts the perspective a bit: He now often seems to be singing about other guys. Lenderman told The New York Times that some of the album’s lyrics were inspired by misogynistic podcasters such as Andrew Tate, who preach an alpha, acquisitive view of how men should behave. The album is at its best when it links sorrow and pigheadedness, suggesting that the contemporary Problem With Men has something to do with the heartbreak and impotence that rockers like Lenderman have long plumbed (he sings tenderly of one character “punching holes in the hotel room”). At times, though, Lenderman is as predictable as a political cartoonist, employing glib ironies to mock smartwatches and guys who rent Ferraris after a breakup.

These themes are modern—listen closely, and the album actually couldn’t have come out in 1975, 1994, or 2003—but the album’s sound is not. Lenderman is now making blast-at-a-barbecue Americana, bedecked in pedal steel and tragic-hero guitar solos. Some elements hit the ear as unexpected: doomy riffing in “Wristwatch,” drifting clarinet in “You Don’t Know the Shape I’m In,” the rumbling uplift of “On My Knees.” Yet fundamentally, the album feels unmoored, assembled through reference points. Although the music scans as the work of a full band, it makes sense that Lenderman played most of the instruments: This is one rock geek’s modest vision, unimpeded. Lenderman’s skills aren’t debatable, and when I watch videos of him performing with his heavy-lidded eyes and boyish smirk, I get why people are obsessed. But if this is the next great hope for indie rock, then indie rock is becoming a costume closet.

Luckily, other contenders exist for that title, and one of them is Lenderman’s own band, Wednesday, a quintet founded in 2017. When I first listened to the group’s 2023 album, Rat Saw God, I felt a rush of recognition—not for any particular sound, but for the way Wednesday took for granted that its job was to break ground. The songs blended noise-rock and country into gnarled, surprising shapes. The lead singer Karly Hartzman—Lenderman’s now-ex-girlfriend—told tales of small-town life through sweet warbles and harsh screams. All five of the band members at the time were credited as songwriters, and all of the album’s songs seemed like they could have arisen only through a collision of creative minds.

Wednesday is part of a fascinating trend sweeping through Gen Z rock: a revival of shoegaze. The subgenre originated in the late ’80s as bands such as My Bloody Valentine blanketed concert venues in slow-churning guitar squall while staring down at their effects pedals. The new incarnation—check out the fearsome young trio Julie—draws not just from traditional shoegazers but also from heavy metal, emo, and even electronica. The trend can probably be attributed to TikTok’s demand for sounds that make banal images seem profound. But another reason might be a latent hunger for rock that’s abstracted, collaborative, and sensation-first. Shoegaze is, after all, a term for subsuming individual personalities into pure sound.

[Read: How indie rock changed the world]

Even outside of that fad, to my ear, many of the most exciting things happening in 2020s indie are bands. Recent consensus-masterwork albums have come from Dry Cleaning and Wet Leg, whose spoken-sung vocals enmesh with spry, unpredictable post-punk; Turnstile, a hard-core act that veers into dance music and power pop; and Big Thief, whose ornate folk jams radiate sci-fi eeriness. The state of the music industry—especially after the dangers and disruptions of COVID-19—is broadly discouraging of bands: Groups are just more expensive and harder to market than solitary figures. But if indie rock means anything, it means trying to carve out a refuge from the forces shaping the mainstream.

And make no mistake: If indie mostly defines itself around solo stars, pop will devour its last shred of differentiation. The streaming years have seen tremendous evolution in the sound of mass-market music, in part because identity-based imperatives have pushed the world’s biggest entertainers to act more underground. Inspired by the alt-mainstream bridge-builder Lana Del Rey, Taylor Swift and her protégés have started to employ indie-rock producers to furnish them with classic signifiers of authenticity. Listening to recent pop is like playing record-snob bingo, trying to identify the musical touchstones used to illustrate the singer’s confessional zingers. Much the same thing can be said of Manning Fireworks—and it’s likely no coincidence that Lenderman is getting memed in the same manner as a pop girlie.

Time for a confession that will make me sound like a parasocial hypocrite: I’m worried about Lenderman’s breakup. He and Hartzman dated for years, and many of their songs chronicle their love. But they split recently (and—here’s more lore—moved out of the Asheville property where they and some other cool musicians lived). The breakup is apparently amicable: Lenderman is still in Wednesday, and the two just performed together on The Tonight Show. Still, with all the fame building around his solo career, it’s natural to wonder about the band’s fate. Speaking about Wednesday’s future, Hartzman recently told Rolling Stone, “There has to be a lot of change.” That’s scary as a fan—but then again, change is what a fan of music like this should want.