How Much Should You Really Spend on a House?

The Atlantic

www.theatlantic.com/ideas/archive/2023/07/new-housing-affordability-crisis/674679

At the familiar, treacherous hour of 3 a.m., I wake up in a cold sweat, my heart galloping in my chest. I drink some water and take half an Ambien. Then I turn to a sacred document that comforts me in uncertain times. I’ve read it so often, I can practically recite it from memory: “No more than 28 percent of the borrower’s gross monthly income should be spent on housing costs,” says the article from Rocket Mortgage.

When I get these panic attacks, it’s often because a house has finally come up for sale in the neighborhood to which my partner and I are hoping to move. If we bid way over asking price, we could probably get it. But my nocturnal anxiety attaches itself to one question: Can we afford it? The Rocket Mortgage article can’t answer this question, but rereading it soothes me, its precise-sounding percentages sliding beneath my thumb like worry beads.

Versions of the “how much house can you afford?” article get published every few months, and they all tend to include the same few estimates. In addition to the 28 percent rule, there’s a different rule that says all your debts—including, most notably, a mortgage and student loans—shouldn’t exceed 35 percent of your income. (This means that if your mortgage is your only debt, your housing costs alone could conceivably eat up more than a third of your pay.) Under this rule, someone making $60,000 a year with no existing debts could afford a mortgage of $1,750 a month, which currently equates to a home priced at about $250,000. Yet another rule, from the financial guru Dave Ramsey, recommends spending no more than 25 percent of your take-home pay on your mortgage. I’m not sure houses that cheap exist anymore: With interest rates at their highest point in recent memory, houses that were once within reach now cost hundreds more a month.
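Those percentage rules are simple arithmetic, and the $60,000 example can be checked with the standard fixed-rate amortization formula. Here is a minimal sketch; the 7 percent rate and 30-year term are assumptions chosen for illustration, not figures from any lender:

```python
def max_monthly_payment(gross_annual, pct):
    """Monthly housing budget under a percent-of-gross-income rule."""
    return gross_annual / 12 * pct

def loan_amount(monthly_payment, annual_rate=0.07, years=30):
    """Principal that a fixed monthly payment supports, via the standard
    amortization formula: P = M * (1 - (1 + r)**-n) / r."""
    r = annual_rate / 12
    n = years * 12
    return monthly_payment * (1 - (1 + r) ** -n) / r

budget = max_monthly_payment(60_000, 0.35)  # the 35 percent rule, no other debts
print(round(budget))               # 1750
print(round(loan_amount(budget)))  # roughly 263,000
```

The raw formula comes out slightly above the article’s $250,000 figure because a real house payment also folds in property taxes and insurance, which shrink the share left for principal and interest.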

[Read: How a new jobless era will transform America]

Like many timeworn texts, mortgage-advice articles offer more parable than prescription. The numbers above are all wildly different; the discrepancy between them can represent thousands of dollars a month. Most of them don’t take into account things like 401(k) contributions or taxes, which can be high for people, like me, who are occasionally self-employed. And they don’t factor in other expenses, such as food and child care, that have shot up with inflation. At some of those percentages, my partner and I could afford a large house, but not a child to populate it with. Or we could have a nice kitchen, but would have to stop eating. They’re all significantly higher than what we spend on our current home, about 20 percent of our take-home pay.

I worry about being locked into a huge monthly payment—one that would make us unable to afford child care, or to weather a job loss or a downturn in the housing market. Throughout our home-buying “journey,” such as it is, various people have implied to us that real estate is a good investment, so we shouldn’t stress too much about buying an expensive house. But also that homes, through their upkeep, repairs, and various other vicissitudes, often end up costing more than you bargained for, so you should bargain conservatively. Which is right?

I interviewed nine real-estate experts to help me understand why the numbers vary so much and, I hoped, help me figure out the right one to use for myself. They confirmed that, yes, the mortgage-affordability numbers are all different, and though some lenders use them to approve mortgages, they are basically guesstimates. “To some extent, they’re plucked out of the air,” Robert Van Order, an economics professor at George Washington University, told me. “A lot of these numbers are pretty arbitrary,” added Edward Seiler, associate vice president of housing economics at the Mortgage Bankers Association. “It’s just based on people staring at data and thinking, What are the tipping points that force people into delinquency?” If the percentages don’t seem ironclad, it’s because they aren’t.

Okay, I said, then how much should a responsible person pay for their house? Forget the otherworldly figure that optimistic lenders might approve you for; how much should you actually spend? The experts seemed confused by this premise. “What does the word responsible even mean?” mused Morris Davis, a real-estate professor at Rutgers.

Spending more than 30 percent of your income on housing means you’re “cost burdened,” according to the federal government. After sufficient badgering, most of the experts coughed up this 30 percent figure, or about a third of your income, as a safe limit for your housing costs. But lenders sometimes approve buyers for house prices higher than that, and higher than they can realistically afford, explains Daryl Fairweather, the chief economist of Redfin. Instead of relying on calculators, Fairweather recommends that people comb through their accounts, add up all their expenses, and consider their housing budget to be whatever’s left over. (This is good advice, but it’s trickier to follow if you plan to change careers or have kids.) Another expert offered an interesting alternative: Look for a house that costs no more than two and a half times your annual income, which should help you fall below the 28 percent rule with an easier mental calculation.
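Fairweather’s leftover-budget method and the 2.5-times-income shortcut each reduce to a line of arithmetic. A sketch, with expense figures that are hypothetical and invented purely for illustration:

```python
def leftover_housing_budget(monthly_take_home, monthly_expenses):
    """Housing budget = whatever is left after every other expense."""
    return monthly_take_home - sum(monthly_expenses.values())

def max_price_by_income_multiple(gross_annual, multiple=2.5):
    """Cap the purchase price at a multiple of annual income."""
    return gross_annual * multiple

# Hypothetical monthly expenses for a household taking home $6,000 a month.
expenses = {"food": 800, "childcare": 1500, "transport": 400, "other": 900}
print(leftover_housing_budget(6_000, expenses))  # 2400
print(max_price_by_income_multiple(100_000))     # 250000.0
```

Note that the two methods can disagree: the leftover approach tracks a household’s actual obligations, while the income multiple ignores them entirely, which is why it only approximates the 28 percent rule.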

Some, though, suggested that in this bonkers housing market, people should simply move somewhere cheaper so they don’t have to think as much about affordability. “There are these deeper questions out there that people aren’t asking, which is why don’t people want to move to lower-cost places?” Davis said. (Of course, during the coronavirus pandemic many people did move to cheaper places, like Austin and Miami, and subsequently made those places much more expensive.)

“Where are you based?” Davis asked me.

I told him the name of the Northern Virginia exurb where I live.

He pointed out that, even here, more than 25 miles from D.C., home prices are high. “Why?” he asked. “[That’s] kind of far from D.C. … You’re a writer; you could be anywhere.”

“This is the whole impetus for the story!” I shot back. We were, in fact, trying to relocate to Florida, a generally less expensive state where houses still cost more than we expected. “I’m trying to move!”

Despite hearing the 30 percent figure from many of the experts I talked with, I was surprised to learn that most current homeowners actually spend much less on their housing. So do most renters. The median homeowner with a mortgage spends 16 percent of their gross income on their house payment, including taxes and insurance. That number is higher—24 percent—for low-income households, but it’s still less than 30 percent. Renters spend an average of 26 percent of their income on housing. In other words, if you take the mortgage calculators at their word and spend 28 percent, you’re paying much more for a house than the average American does.

But in today’s market, it’s extremely difficult to buy a house for just 16 percent of your income—or 28, or 30. The average new homebuyer today, according to Zillow, will spend 34 percent of their income on housing—the highest amount since 2004, which is as far back as Zillow’s data goes. That’s if they have a 20 percent down payment. If they don’t, the cost burden will be even higher. Prices are still high because housing stock is so low: With mortgage rates at about 7 percent, people who locked in 3-ish-percent rates a few years ago aren’t moving. “It is clear that affordability has become the No. 1 challenge for new buyers and renters in the housing market today,” says Orphe Divounguy, a senior economist at Zillow.

The gravest danger of spending too much on a house is that, in the event of a personal or global catastrophe, you won’t be able to keep paying for it. That risk is, admittedly, modest: Even homeowners who spend up to 38 percent of their pay on their mortgage don’t tend to default, especially if they have good credit and put down a large down payment, according to research by Davis. But a large monthly payment could nevertheless prevent you from saving for retirement, maintaining an emergency fund, visiting far-flung family, or having as many kids as you want. It could keep you from indulging in the many pleasures of life that aren’t a house.

This possibility, of being “house broke,” can gnaw away at your thoughts as you click through DocuSign screens of enormous numbers. John Grable, a financial-planning professor at the University of Georgia, told me that during the post-2008 housing collapse, he and his wife lost six figures on their house. “It’s ingrained in my mind, in my being, not to lose money like that again,” he told me. I also remember those years. I remember my friends graduating from college and working at pizza places, my first boss’s voice trembling as he laid me off, people walking around with broken teeth because they didn’t have insurance. The Great Recession ingrained itself in my memory, too. I don’t know if I’ll ever be able to extract it.

Owning a home is generally considered financially smarter than renting, and that’s still true for many people. Unlike with a rental, the day you buy your home, your house payment is the largest it will ever be, assuming you have a fixed-rate mortgage. Your income will likely rise in time, but your mortgage won’t. Homeownership is still a major engine of wealth generation: When you sell a home, you might make a little money, but leaving an apartment, you definitely won’t. And of course, if you stay in your home all 30 years of the mortgage, you’ll own it free and clear. This is why many people, when they first buy a home, stretch financially, says Mike Loftin, the CEO of Homewise, a New Mexico–based organization that helps low-income people buy houses. Most of his customers, he says, spend more than 30 percent of their income on housing.

[Annie Lowrey: Everything is about the housing market]

But the staggering cost of homeownership today is making renting look less bad by comparison. Right now, for a typical home, owning costs about 25 percent more a month than renting, and there are only four metro areas where buying is currently cheaper than renting, according to Redfin: Detroit, Philadelphia, Cleveland, and Houston. And because interest rates are so high, today’s homebuyers are not building equity at the same clip that someone with a 3 percent rate would be. Especially if you don’t plan to stay in one city for at least three years, renting is reasonable. It’s just, by its nature, less permanent. “If you’re paying 500 bucks a month on your guest-house rent, like, yeah, don’t give it up,” Loftin says. “But be prepared that when the landlord’s kid moves back to Santa Fe, they’re gonna be moving in.”

The reason it’s so hard to get a straight answer on this—how much to spend? To buy or to rent?—is that buying a home is not purely rational. It’s also emotional, evoking feelings of stability and community, and potentially, of stasis and strain. The concept of “a home” can be comforting for some and smothering for others. As our physical selves take up residence, so do our hopes for the progression of our careers and the growth of our families. There’s not one right amount to spend, because there’s not one right future for everyone.

In fact, some experts suggested a different kind of mortgage calculator, what’s known as the “eight-hour rule”: “Don’t do anything where you’re not going to be able to sleep at night,” Davis said. By that measure, I might not buy a home for a while.

The Future of the “Great Resignation”

The Atlantic

www.theatlantic.com/newsletters/archive/2023/07/the-future-of-the-great-resignation/674669

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

The latest jobs data give a mixed picture of the economy—and raise questions about how America’s workers will fare.

First, here are three new stories from The Atlantic:

Joe Scarborough: “America is doing just fine.”
The summer of Kramercore
The writers who went undercover to show America its ugly side

Losing Ground?

In the spring of 2021, I traveled to Pennsylvania to attend a graduation. Driving around the area, I was struck by all the signs in diner and fast-food storefronts seeking workers. As I recall, the signs had a desperate tone, advertising bonuses and high wages to anyone willing to work. I was witnessing in real time a fascinating economic moment: Low-wage workers were in high demand, and that meant they were gaining leverage.

The signs I saw in Pennsylvania were emblematic of what was happening across the economy. Restaurants are a “microcosm” of the Great Resignation, the pattern that took off in 2021 in which workers quit their jobs to seek higher wages and better benefits, Nick Bunker, an economist at Indeed’s Hiring Lab, told me. That spring, as freshly vaccinated Americans went out to spend their stimulus checks, they frequented restaurants. Demand for services soared, and so in turn did the demand for service workers. Businesses had to compete for staff. And when workers saw that they could find better wages and conditions elsewhere, many quit their jobs in favor of new ones.

The latest jobs data suggest that workers might be losing some of this power. The economy added about 209,000 jobs in June, according to data from the Bureau of Labor Statistics released last week. It was the 30th consecutive month of job gains, but gains were at their lowest rate since the streak began. “The picture that emerged was a mixed one,” Julia Pollak, the chief economist at ZipRecruiter, told me. “Workers are still in the driver’s seat in many industries, other than tech, but they are losing leverage.” However, she added, the job market is “still more favorable to workers than before the pandemic.”

What’s happening in hospitality, a sector that includes restaurants and bars, tells us a lot about the job market more broadly. That was true in 2021, Bunker told me, and it’s still true now. Looking at the behavior of the hospitality sector in last week’s report, Bunker noted, we can see that “the labor market is moderating but still strong.”

As the job market softens somewhat, workers may be losing some of the leverage they gained when the market was tighter. As Ben Casselman reported in The New York Times last week, “The rate at which workers voluntarily quit their jobs has fallen sharply in recent months—though it edged up in May—and is only modestly above where it was before the pandemic disrupted the U.S. labor market.” When workers quit jobs, it reflects their confidence that they can find another, better job. Casselman reported that hourly earnings for hotel and restaurant workers rose 28 percent from the end of 2020 to the end of 2022, which was faster than the rates of both inflation and overall wage growth. But now, after surging in late 2021 and early 2022, growth for low-wage workers has slowed, and fewer workers in the hospitality industry are separating from their jobs now compared with the same period last year.

This slowing wage growth could be seen as a sign that workers are losing ground. But another possible reason that wage growth has slowed, Bunker explained, is that many workers’ base pay has gone up compared with a couple of years ago. Employers are “giving raises off a wage rate that has risen a lot since the spring of 2021,” Bunker said.

The Fed will be happy to see the job market cooling off, Bunker told me, so we might see fewer interest-rate hikes in the months to come: “Reduced competition for workers is going to reduce wage growth, which is—in the Fed’s view—going to put less pressure on employers to raise prices, so that should bring inflation down.” But after pausing last month, following 10 consecutive rate increases, the Fed is still widely expected to raise rates at its meeting at the end of this month.

The monthly job-openings report tells us more about the recent past than it does about our current reality. The patterns we saw in last week’s numbers contain new information about a moment that’s already slightly dated. And they raise fresh questions about whether the Great Resignation is over. Bunker, for his part, riffed on Mark Twain, saying that in his opinion, “rumors of the Great Resignation’s demise are greatly exaggerated.” But, he added, in a few months, we may be able to say more definitively whether the heyday of the Great Resignation really is behind us.

Related:

The happiest way to change jobs
Low-wage jobs are becoming middle-class jobs.

Today’s News

The Kremlin stated that Wagner chief Yevgeny Prigozhin, whose location remains unclear, met with Vladimir Putin after last month’s failed mutiny.
Joe Biden began his trip to Europe by meeting with U.K. Prime Minister Rishi Sunak to show unity ahead of a NATO summit that will likely be divided over how to support Ukraine.
Dangerous triple-digit heat will affect more than 35 million people in the South and southwestern United States this week.

Dispatches

Famous People: Lizzie and Kaitlyn go to an indie-sleaze 31st-birthday party and learn that the semi-ironic theme party is a delicate art.
Up for Debate: Conor Friedersdorf gathers readers’ thoughts on affirmative action.

Explore all of our newsletters here.

Evening Read

The Secret Power of Menopause

By Liza Mundy

Don’t try to tell this to a mother sitting in the bleachers during a four-hour swim meet; or enduring a birthday party involving toddlers and craft projects; or resting in an armchair on a peaceful evening, savoring the heft of a tiny body and the scent of an infant’s freshly washed hair. Interminable or sweetly languid though they may feel in the moment, the childbearing years are startlingly brief. Fertility, which typically ends in a woman’s mid-40s, occupies less than half of her adult life. And then, if she’s lucky, she has 30 or 40 years in which to do something else.

Most people don’t realize how unusual humans are, in the way that nonreproductive females (how shall I put this?) persist. Females of most other species can bear young until they die, and many do, or at best enjoy a brief respite from breeding before death.

Read the full article.

More From The Atlantic

Open your mind to unicorn meat.
The West is returning priceless African art to a single Nigerian citizen.
The Democrats are now America’s conservative party.
Ron DeSantis’s only hope is to beat Trump from the hard right.

Culture Break

Read. “Refugee Year,” a new poem by Bhion Achimba.

“In the week of power outages, / in the year of hunger, all we had was love, / its fused & infinite grammar, its wet eyes / & tenderness for days.”

Listen. Our kids will not have the childhood of our imaginations. In the latest episode of Radio Atlantic, Hanna Rosin explores how climate change is making summer more dangerous.

Play our daily crossword.

P.S.

Over the weekend, I made a six-foot-long party sub and served it to my friends in Prospect Park. One section was filled with soppressata, capicola, mortadella, and provolone; another with prosciutto and pecorino; and the final one with vegetables and hummus for my vegan pals. This was my third year making this sandwich, and I am delighted to report that inflation does not seem to have affected the price of the six-foot bread, which I purchase each year from a local Italian bakery in Brooklyn. In my biased opinion, making and eating such a sandwich is a perfect midsummer treat!

In case you’re interested in attempting your own version: I got the idea to make the sandwich in the summer of 2021 after reading Gabrielle Hamilton’s “Eat” column in The New York Times Magazine. “What other thing is as reliably cheerful as a sandwich that’s practically the size of an automobile?” she asks. What indeed! Hamilton offers great tips for a meatless version, but if you, like me, are also interested in stuffing the bread with tasty cured meats and cheeses, I recommend this Bon Appetit guide.

— Lora

Katherine Hu contributed to this newsletter.

What’s More ‘Indie Sleaze’ Than Turning 31?

The Atlantic

www.theatlantic.com/newsletters/archive/2023/07/indie-sleaze-birthday-party-famous-people/674659

Sign up for Kaitlyn and Lizzie’s newsletter here.

Kaitlyn: As with any trend, it’s hard to tell whether the “indie sleaze” revival is real or imaginary, happening or being made to happen. There could be venture capital behind it. Or just regular capital. Last year, a photographer who was closely associated with the aesthetic back in the aughts was hired to document a fairly sleazy-looking party at a music venue in downtown indie-rock territory … hosted by, uh, Old Navy! But this newsletter isn’t about cultural criticism. It’s about going to stuff.

Recently, our friend Becca held an indie-sleaze-themed 31st-birthday party for herself in Prospect Heights, right in between Lizzie’s house and my house, which was very convenient. The invite said, “Come with the same energy as if it were 2010 and you were bringing your fake ID to a club on the LES that chloe sevigny went to once.” I didn’t relate to this, as I’m actually a little too young (brag) to have experienced the first indie-sleaze era in person. I only learned about it at the tail end on Tumblr.

Still, I was excited. I love a theme and I appreciate that Becca always provides one. I expected that she would execute it flawlessly, as I know that she went to NYU around the time of the Great Recession.

Lizzie: As far as I can remember, no one really called that whole thing anything the first time around, but now it’s been officially branded by a trend forecaster or whomever, which I guess makes it easier to talk about. Except it doesn’t. Even trying to define what “it” is feels like a losing battle, and that’s why Becca is braver than I am. Is it music? Disco shorts? 2007? 2010? Maybe, and I’m sorry, but … is it Terry Richardson?

Not saying I’m completely disconnected from the gist of whatever it refers to. The prematurely canceled and iconic public-access show New York Noise defined my high-school years, and I still sometimes wonder if Jeffrey Lewis did or did not see Will Oldham on the L train that one time.

Kaitlyn: I don’t get any of those references. Uh-oh!  

Day of the sleaze party, I was at Laundry City sorting out my whites when I had the idea that I should get Becca a copy of the 2009 Tao Lin novel, Shoplifting From American Apparel. I started my machines and then went on an hour-plus journey to five different Brooklyn bookstores, none of which had this historically significant text in stock. I need to work on my impulse control, I think. There was no reason to try five stores—the reference wasn’t even that good (alt-lit being only adjacent to indie sleaze), and a book is not a sexy gift. But once I got started, I was like, “Surely the Barnes & Noble?” I’m full of hope. After my sheepish return to Laundry City to put my long-finished laundry in the dryer, I settled for a gift the neighborhood could provide: a bouquet of Wet n Wild eyeliners and a sandwich bag of loose cigarettes.

When I got home, I changed into a skort and some tall socks and a T-shirt I got on Depop that says, in hot-pink lettering, My boyfriend is literally on stage. It was hard to wear that out on the street for the walk to Lizzie’s house in broad daylight.

Lizzie: Dressing was the most difficult part of the evening. I invited Ashley and Kaitlyn over for some sleaze-style pregaming (pizza, a Finger Lakes fizzy red, ’90s music videos) and to help me pick an outfit. The fit pickings were slim, but we landed on running shorts, a white Hanes T-shirt that I cut giant torso-revealing armholes into, and thigh-high tube socks with black stripes. Good enough if you don’t think too much about it and have no other options.

After we had covered the important topics of the week—HBO’s The Idol, corporate email-tracking systems, the superiority of Marcona almonds—Ash departed my apartment around 9, fully committed to leaving whatever indie sleaze was or is in her past. Kait and I walked our embarrassing outfits over to Becca’s and crossed our fingers that we wouldn’t see anyone on the way there except the rats.

PBR isn’t bad, actually. (Courtesy of Kaitlyn Tiffany)

Kaitlyn: Everyone had a different idea of what “indie sleaze” meant. (In this way, the theme was actually memory …) Luke was wearing a Natty Light snapback. Becca was wearing a skinny scarf and dark eye glitter. There was a Kate Moss–inspired look, and someone in a bikini top had a silver iPod. A bunch of the girls drew X’s on their hands in Sharpie and did that classic MySpace-photo concept where you stick your tongue out and pretend to be lighting it on fire. Remember when Taylor Swift tried an indie-sleaze music video where she had pink hair and skinny jeans and dated a guy who hit someone in the face with a billiards ball? Nobody was dressed as that.

Lizzie: There was a Death Grips album (The Money Store) taped up on the wall. The references must have spanned at least 10 years! But they were 10 years of our somewhat overlapping youths, so no one was going to start a rumble over minutiae.

Kaitlyn: Becca led us to a big bowl of “Jungle Juice.” It tasted like Smarties and the past! We feared it. Nathan showed up a few minutes after we did with Rebecca (a second Becca) and Bayne and two six-packs of Red Stripe. The apartment was crowded and everybody was shiny-faced, dripping sweat (on-theme), so we moved over to the far side of the living room, where the air-conditioning unit was doing its best work. On a side table that was not part of the party decor, a lamp was sitting on top of a stack of books: a David Foster Wallace, a Jonathan Franzen, and my book, by me. Wow! Of course I took a bunch of photos of this, with flash (on-theme), and was pleased even if it was a joke I wasn’t getting.

“People love to be in the kitchen,” Lizzie observed. True, the beating heart of the party was the breakfast bar, which was strewn with PBR cans and half-eaten pieces of pizza—people were laughing across it and touching each other’s arms and stuff, taking selfies, etc. But where we were standing, there was room to dance.

Lizzie: So we danced! Becca’s playlist really had us wiggling. The Strokes, Azealia Banks, MGMT, Spank Rock, Daft Punk. It was the soundtrack to some period of time in the past, probably adhering most closely to the span of Bloomberg’s three terms in office. Is Mike Bloomberg indie sleaze?

Kait left the dance floor to go “mingle,” but Nathan, Rebecca, Bayne, and I continued to groove in a manner that would’ve made Gregg Gillis proud. My bottle of Red Stripe seemed to warm up to hot-tea temperature within seconds, probably because of the heat from my hand as I did electro-clash aerobic exercises next to the glass coffee table. At one point, Nathan returned to the dance floor munching on a pre-eaten piece of pizza that he’d found somewhere. He poured my Jungle Juice into Kait’s abandoned cup and started to chug. But he didn’t get far. Like, not more than a few gulps. That was probably for the best.

If all this talk of Jungle Juice and sweat and bodies is making you wonder what the room smelled like, it was actually quite nice, because Becca had a Balsam Fir Yankee Candle burning. Cory Kennedy x Christmas vibes.

The best song ever made! (Courtesy of Kaitlyn Tiffany)

Kaitlyn: What a beautiful scene. I told Luke I appreciated that his British and American passports were part of the coffee-table-scape, and that I was sorry about Brexit. Wouldn’t it be great for him if he were part of the European Union? I was also sorry about his hometown’s recent news coverage. As you may have read, the stepson of one of the passengers on the destroyed OceanGate submersible attended a Blink-182 concert while his family member was lost at sea and posted about it numerous times in a tasteless fashion. This kid, like Blink-182 and Luke, is from San Diego, and he was wearing a San Diego Padres hat in a photo he shared of himself at the show.

Luke said something inspiring about life’s ups and downs. When the Padres were beating the Dodgers in last year’s playoffs, the fans sang Blink-182’s “All the Small Things” in the pouring rain—a peak. And now, a valley. At this point, the Jungle Juice was doing its work, so, unfortunately, I started screaming about baseball. Mark Canha (a New York Met who inexplicably displays his email address in his Instagram bio) is on the cover of the new issue of The Atlantic!

Lizzie: All night, we had heard talk of a roof. Imagine how cool, literally cool, the roof would be. It was nighttime on the roof. A wide open space, high in the sky. There was probably a breeze up there! We had to see it for ourselves. Instead of taking the elevator, we ran up, like, four flights of stairs. In hindsight, I couldn’t tell you why we did this, but maybe it was to make that first surge of Mother Nature’s cool air feel even better against the skin.

Kaitlyn: There’s nothing like being on a roof after you’ve been sweating. The Manhattan skyline doesn’t get old, and neither do we. We looked at the view and had some typical roof talk—Bayne theorized that women are socially conditioned not to whistle and everyone strongly disagreed with him. After airing out our armpits, we went back downstairs to get a bit more dancing in. Becca declared “Good Girls Go Bad,” by Cobra Starship featuring Gossip Girl’s Leighton Meester, “the best song ever made,” and who could argue with her? Women are socially conditioned to be good and then go bad! [Flipping double middle fingers.]

Lizzie: We left before midnight, and Mariya said, “More like indie snooze,” which was a good burn and a fair point. But there comes a time in every sleazer’s life when the promise of mozzarella sticks and a bedtime weed gummy is more enticing than another round of hot beer and the existential feeling of time’s passage tracked by DFA Records releases. I scrunched my thigh-high tube socks back down to my ankles (disguise mode) and walked home.

Kaitlyn: Obviously, “indie sleaze” as a concept is pretty incoherent. (At the end of the “Good Girls Go Bad” video, Leighton Meester’s character is revealed to be a cop. PBR is owned by a holding company backed by a private-equity firm.) Historians still have no idea whether anything about it was meant to be ironic or if people just said that afterward because they were embarrassed.

Doesn’t matter! In Becca’s hands, a prompt is a prompt and a good excuse for a great night.

How Musk and Biden Are Changing the Media

The Atlantic

www.theatlantic.com/newsletters/archive/2023/07/elon-musk-twitter-biden-journalism/674629

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Elon Musk and Joe Biden are the unlikely tag team changing the way American journalists approach their jobs.

First, here are three new stories from The Atlantic:

The gravitational pull of supervising kids all the time
There’s no such thing as an RFK Jr. voter.
Everyone has “car brain.”

An Unlikely Tag Team

Reporters spend lots of time critiquing the president, so perhaps it’s only fair for Joe Biden to take a turn as a media critic.

During an interview last week with MSNBC’s Nicolle Wallace, Biden recounted a story that a reporter at “a major newspaper” told him. According to Biden, this reporter’s editor told them, “You don’t have a brand yet.”

“They said, ‘Well, I am not an editorial writer,’” Biden continued. “‘But you need a brand so people will watch you, listen to you, because of what they think you’re going to say.’ I just think there’s a lot changing.”

I’m curious from whom Biden heard this, because he speaks on the record to the press less than any president in recent memory—he’s given the fewest interviews and press conferences since Ronald Reagan. But for most reporters today, the dynamic the president is describing will be very familiar. Celebrity reporters have always existed, as Elliot Ackerman’s great recent article on the famed World War II correspondent Ernie Pyle underscored, but over the past 15 years, even cub reporters have felt intense pressure to become public personalities, whether the impetus comes from one’s editors or peers or the marketplace.

Yet as I watched Twitter melt down this weekend, I started to wonder whether that moment might actually be starting to pass—a casualty of the unlikely tag team of Joe Biden and Elon Musk. The two have, respectively, helped kill the demand and the means for journalists to brand themselves.

Donald Trump isn’t responsible for the celebrification of the press, but he supercharged it, especially in political journalism. During his presidency, the American public was more fixated on the news than it had been in decades. Journalists, in turn, became celebrities in their own right: Maggie Haberman of The New York Times became a household name thanks to her perpetual stream of Trump scoops. Jim Acosta of CNN elevated his renown with press-room grandstanding. The TV-retread Tucker Carlson found his moment as Trump’s greatest media apostle. Books about Trump seemed to shoot up the best-seller lists on a weekly basis.

This has all slowed to a crawl in the Biden era. The president has intentionally pursued a strategy of being boring and normal, and the result is much-reduced attention from the press. It’s hard to think of any reporter who has become a new, massive star since 2021. No Biden-book boom has ensued. Readership at news sites dropped after the 2020 election, and so have TV-news audiences. The calmer mood reverses an infamous tweet: The change is good for our country, but this is dull content.

Musk’s purchase and gradual demolition of Twitter is an even bigger part of the equation. Twitter was a branding machine that allowed reporters to make a direct connection with consumers. A clever or funny or piquant or simply hyperactive journalist could bypass the traditional gatekeepers of their outlet and become famous for something other than—or in addition to—whatever appeared under their byline.

Now Twitter is disintegrating for reasons of both ideology and technology. Although it has always been true that Twitter is not real life, the site brought together an unusually wide spectrum of the population, all in one place. Musk was mocked for calling Twitter a “town square,” but he was right. And because so many journalists were on the site, getting big on Twitter was usually enough to get big outside of it. But Musk’s takeover has encouraged the metamorphosis of the site into what my colleague Charlie Warzel has called a “far-right social network.” That drives away centrist and liberal reporters, but more importantly their audiences. Meanwhile, the site is mired in technical chaos much of the time, which is a problem for users of any political persuasion.

What comes after Twitter is a much more fragmented landscape. Many social-media sites command significant audiences, but no single platform can do what Twitter once did. A journalist can make a big bet on one platform, or they can try to hedge and be active on Reddit, YouTube, TikTok, Substack, and, as of this week, Meta’s Threads—give or take a dozen more. But who has the time? And besides, you don’t get the same reach. TikTok and YouTube command enormous but typically niche audiences. Substack grows slowly and seems to mostly reward writers who were already well-known before migrating to the platform, such as Matt Taibbi or Matt Yglesias. As Twitter refugees joined Bluesky this weekend, my following jumped by roughly 20 percent—to 221. Compare that with the nearly 34,000 followers I have on Twitter. (If I have a brand, it’s a boutique label.)

I’ve been working on reducing my own Twitter use, and I have mixed emotions. Not feeling the pressure to be part of the conversation each day has been freeing (of my time, among other things), though I miss the validation of a clever remark getting lots of engagement. I am not so naive as to hope that the era of journalist branding is over, but with a little luck, 2023 might someday look like a turning point on the road to its demise.

Related:

The White House spent four years vilifying journalists. What comes next? (From 2020)
“I was an enemy of the people.”

Today’s News

A suspicious powder was found in the White House while President Biden and his family were at Camp David this past weekend, and tests confirmed it as cocaine.
The world’s hottest day ever was recorded on July 3, a record that was subsequently broken again on the 4th.
Yesterday, a district judge prevented Biden administration officials and certain federal agencies from working with social-media companies to discourage or filter First Amendment–protected speech.

Dispatches

The Weekly Planet: E-bikes are going to keep exploding, Caroline Mimbs Nyce explains. We’re stuck in battery purgatory.
Work in Progress: Leading economists said we’d need higher unemployment to tame inflation, Adam Ozimek writes. Here’s why they were wrong.

Explore all of our newsletters here.

Evening Read


The Great American Eye-Exam Scam

By Yascha Mounk

On a beautiful summer day a few months ago, I walked down to the part of the Connecticut River that separates Vermont from New Hampshire, and rented a kayak. I pushed myself off the dock—and the next thing I remember is being underwater. Somehow, the kayak had capsized as it entered the river. I tried to swim up, toward the light, but found that my own boat blocked my way to safety. Doing my best not to panic, I swam down and away before finally coming up for air a few yards downriver. I clambered onto the dock, relieved to have found safety, but I was disturbed to find that the world was a blur. Could the adrenaline rush have been so strong that it had impaired my vision? No, the answer to the puzzle was far more trivial: I had been wearing glasses—glasses that were now rapidly sinking to the bottom of the Connecticut River.

If the whole experience was, in retrospect, as funny as it was scary, the most annoying consequence was the need to regain the faculty of sight. I did not have any backup glasses or spare contact lenses on hand. The local optometrists did not have open slots for an eye exam. Since the United States requires patients to have a current doctor’s prescription to buy eyewear, I was stuck. In the end, I had to wear my flowery prescription sunglasses—in offices and libraries, inside restaurants and aboard planes—for several days.

Then I went to Lima, Peru, to give a talk. There, I found a storefront optician, told a clerk my strength, and purchased a few months’ worth of contact lenses. Though my Spanish is rudimentary, the transaction took about 10 minutes.

Read the full article.

More From The Atlantic

Anohni’s message: To save the world, we’ll have to forgive ourselves.
A photo appreciation of sharks

Culture Break


Read. “Outdoor Day,” a new poem by Nicolette Polek.

“In elementary school, my mother rides the red bus to ‘defense class.’ / Station one she crosses a brook with knotted rope.”

Listen. A collection of some of June’s most popular Atlantic articles, presented by Hark.

Play our daily crossword.

P.S.

I’m mourning the recent death of the great German free-jazz saxophonist Peter Brötzmann. The usual euphemism is that he’s an acquired taste, but unlike with, say, whiskey or coffee, most people never feel a need to acquire a taste for him. His widest exposure may have been a 2021 cutting contest with Jimmy Fallon, but back in 2001, the saxophonist and former President Bill Clinton told the Oxford American that readers would be surprised to know he was a Brötzmann fan. I emailed Clinton’s spokesperson for comment on the death, but so far I’ve received no response. (If you’re reading this, Mr. President, call me!) The truth is that not all of Brötzmann’s output is difficult listening. This 2022 live performance with the Gnawa master Majid Bekkas and the drummer Hamid Drake is even trancily soothing.

— David

Katherine Hu contributed to this newsletter.

The Hypocrisy of Mandatory Diversity Statements

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 07 › hypocrisy-mandatory-diversity-statements › 674611

John D. Haltigan sued the University of California at Santa Cruz in May. He wants to work there as a professor of psychology. But he alleges that its hiring practices violate the First Amendment by imposing an ideological litmus test on prospective hires: To be considered, an applicant must submit a statement detailing their contributions to diversity, equity, and inclusion.

According to the lawsuit, Haltigan believes in “colorblind inclusivity,” “viewpoint diversity,” and “merit-based evaluation”—all ideas that could lead to a low-scoring statement based on the starting rubric UC Santa Cruz publishes online to help guide prospective applicants.

“To receive a high score under the terms set by the rubric,” the complaint alleges, “an applicant must express agreement with specific socio-political ideas, including the view that treating individuals differently based on their race or sex is desirable.” Thus, the lawsuit argues, Haltigan must express ideas with which he disagrees to have a chance of getting hired.

The lawsuit compares the DEI-statement requirement to Red Scare–era loyalty oaths that asked people to affirm that they were not members of the Communist Party. It calls the statements “a thinly veiled attempt to ensure dogmatic conformity throughout the university system.”

Conor Friedersdorf: The DEI industry needs to check its privilege

UC Santa Cruz’s requirement is part of a larger trend: Almost half of large colleges now include DEI criteria in tenure standards, while the American Enterprise Institute found that 19 percent of academic job postings required DEI statements, which were required more frequently at elite institutions. Still, there is significant opposition to the practice. A 2022 survey of nearly 1,500 U.S. faculty members found that 50 percent of respondents considered the statements “an ideological litmus test that violates academic freedom.” And the Academic Freedom Alliance, a group composed of faculty members with a wide range of political perspectives, argues that diversity statements erase “the distinction between academic expertise and ideological conformity” and create scenarios “inimical to fundamental values that should govern academic life.”

The Haltigan lawsuit—filed by the Pacific Legal Foundation, a right-leaning nonprofit—is the first major free-speech challenge to a public institution that requires these statements. If Haltigan prevails, state institutions may be unable to mandate diversity statements in the future, or may find themselves constrained in how they solicit or assess such statements.

“Taking a principled stand against the use of the DEI rubric in the Academy is crucial for the continued survival of our institutions of higher learning,” he declared in a Substack post earlier this year.

Alternatively, a victory for UC Santa Cruz may entrench the trend of compelling academics to submit DEI statements in institutions that are under the control of the left—and serve as a blueprint for the populist right to impose its own analogous requirements in state college systems it controls. For example, Christopher Rufo of the Manhattan Institute, who was appointed by Governor Ron DeSantis to help overhaul higher education in Florida, advocates replacing diversity, equity, and inclusion with equality, merit, and colorblindness. If California can lawfully force professors to detail their contributions to DEI, Florida can presumably force all of its professors to detail their contributions to EMC. And innovative state legislatures could create any number of new favored-concept triads to impose on professors in their states.

That outcome would balkanize state university systems into factions with competing litmus tests. Higher education as a whole would be better off if a Haltigan victory puts an end to this coercive trend.

The University of California is a fitting place for a test case on diversity statements. It imposed loyalty oaths on faculty members during the Red Scare, birthed a free-speech movement in 1964, was a litigant in the 1977 Supreme Court case that gave rise to the diversity rationale for affirmative action, and in 1996 helped inspire California voters to pass Proposition 209. That voter initiative amended the Golden State’s constitution to ban discrimination or preferential treatment on the basis of race, sex, color, ethnicity, or national origin. In 2020, at the height of the racial reckoning that followed George Floyd’s murder, voters in deep-blue California reaffirmed race neutrality by an even wider margin. This continued to block the UC system’s preferred approach, which was to increase diversity in hiring by considering, not disregarding, applicants’ race. Indeed, the insistence on nondiscrimination by California voters has long been regarded with hostility by many UC system administrators. Rewarding contributions to diversity, equity, and inclusion is partly their attempt to increase racial diversity among professors in a way that does not violate the law.

[Read: The problem with how higher education treats diversity]

The regime these administrators created is a case study in concept creep. Around 2005, the UC system began to change how it evaluated professors. As ever, they would be judged based on teaching, research, and service. But the system-wide personnel manual was updated with a novel provision: Job candidates who showed that they promoted “diversity and equal opportunity” in teaching, research, or service could get credit for doing so. Imagine a job candidate who, for example, did volunteer work mentoring high schoolers in a disadvantaged neighborhood to help prepare them for college. That would presumably benefit the state of California, the UC system by improving its applicant pool, and the teaching skills of the volunteer, who’d gain experience in what helps such students to succeed. Giving positive credit for such activities seemed sensible.

But how much credit?

A 2014 letter from the chair of the Assembly of the UC Academic Senate addressed that question, stating that faculty efforts to promote “equal opportunity and diversity” should be evaluated “on the same basis as other contributions.” They should not, however, be considered “a ‘fourth leg’ of evaluation, in addition to teaching, research, and service.”

If matters stood there, the UC approach to “diversity and equal opportunity” might not face legal challenges. But administrators successfully pushed for a more radical approach. What began as an option to highlight work that advanced “diversity and equal opportunity” morphed over time into mandatory statements on contributions to “diversity, equity, and inclusion.” The shift circa 2018 from the possibility of credit for something to a forced accounting of it was important. So was the shift from the widely shared value of equal opportunity to equity (a contested and controversial concept with no widely agreed-upon meaning) and inclusion. The bundled triad of DEI is typically justified by positing that hiring a racially and ethnically diverse faculty or admitting a diverse student body is not enough—for the institution and everyone in it to thrive, the best approach (in this telling) is to treat some groups differently than others to account for structural disadvantages they suffer and to make sure everyone feels welcome, hence “inclusion.”

That theory of how diversity works is worth taking seriously. Still, it is just a theory. I am a proponent of a diverse University of California, but I believe that its students would better thrive across identity groups in a culture of charity, forbearance, and individualism. A Marxist might regard solidarity as vital. A conservative might emphasize the importance of personal virtue, an appreciation of every institution’s imperfectability, and the assimilation of all students to a culture of rigorous truth-seeking. Many Californians of all identities believe in treating everyone equally regardless of their race or their gender.

UC Santa Cruz has not yet responded to Haltigan’s lawsuit. But its chancellor, Cynthia K. Larive, states on the UC Santa Cruz website that the institution asks for a contributions-to-DEI statement because it is “a Hispanic-Serving” and “Asian American Native American Pacific Islander-Serving Institution” that has “a high proportion of first generation students,” and that it therefore seeks to hire professors “who will contribute to promoting a diverse, equitable, and inclusive environment.” In her telling, the statements help to “assess a candidate’s skills, experience, and ability to contribute to the work they would be doing in supporting our students, staff, and faculty.”

Perhaps the most extreme developments in the UC system’s use of DEI statements are taking place on the Davis, Santa Cruz, Berkeley, and Riverside campuses, where pilot programs treat mandatory diversity statements not as one factor among many in an overall evaluation of candidates, but as a threshold test. In other words, if a group of academics applied for jobs, their DEI statements would be read and scored, and only applicants with the highest DEI statement scores would make it to the next round. The others would never be evaluated on their research, teaching, or service. This is a revolutionary change in how to evaluate professors.

This approach—one that is under direct challenge in the Haltigan lawsuit—was scrutinized in detail by Daniel M. Ortner of the Pacific Legal Foundation in an article for the Catholic University Law Review. When UC Berkeley hired for life-sciences jobs through its pilot program, Ortner reports, 679 qualified applicants were eliminated based on their DEI statements alone. “Seventy-six percent of qualified applicants were rejected without even considering their teaching skills, their publication history, their potential for academic excellence, or their ability to contribute to their field,” he wrote. “As far as the university knew, these applicants could have well been the next Albert Einstein or Jonas Salk, or they might have been outstanding and innovative educators who would make a significant difference in students’ lives.”

At UC Davis, 50 percent of applicants in some searches were disqualified based on their DEI statements alone. Abigail Thompson, then the chair of the mathematics department at UC Davis, dissented from its approach in a 2019 column for the American Mathematics Society newsletter. “Classical liberals aspire to treat every person as a unique individual,” she wrote. “Requiring candidates to believe that people should be treated differently according to their identity is indeed a political test.”

More striking than her argument was the polarized response from other academics, captured by the letters to the editor. Some wrote in agreement and some in substantive disagreement, as is appropriate. But a group letter signed by scores of mathematicians from institutions all over the United States asserted, without evidence, that the American Mathematics Society “harmed the mathematics community, particularly mathematicians from marginalized backgrounds,” merely by airing Thompson’s critique of diversity statements. “We are disappointed by the editorial decision to publish the piece,” they wrote. Mathematicians hold a diversity of views about mandatory DEI statements. But just one faction asserts that others do harm merely by expressing their viewpoint among colleagues. Just one faction openly wanted to deny such dissent a platform. Are members of that progressive faction fair when they score DEI statements that are in tension with their own political beliefs? It is not unreasonable for liberal, conservative, and centrist faculty members to be skeptical. And many are.

A rival group letter decried the “attempt to intimidate the AMS into publishing only articles that hew to a very specific point of view,” adding, “If we allow ourselves to be intimidated into avoiding discussion of how best to achieve diversity, we undermine our attempts to achieve it.”

The most formidable defender of mandatory diversity statements may be Brian Soucek, a law professor at UC Davis. He’s participated in debates organized by FIRE and the Federalist Society (organizations that tend to be more skeptical of DEI) and recently won a UC Davis Chancellor’s Achievement Award for Diversity and Community. In an April 2022 article for the UC Davis Law Review, he acknowledged that “certain types or uses of diversity statements would be indefensible from a constitutional or academic freedom standpoint” but argued that, should a university want to require diversity statements, it can do so in ways that violate neither academic freedom nor the Constitution. He has worked to make UC Davis’s approach to DEI statements more defensible.

Someone evaluating a diversity-statement regime, he suggests, should focus on the following attributes:

Are statements mandated and judged by administrators or faculty? To conserve academic freedom, Soucek believes that evaluations of professors should be left to experts in their field.

Are diversity-statement prompts and rubrics tailored to specific disciplines and even job searches? In his telling, a tailored process is more likely to judge candidates based on actions or viewpoints relevant to the position they seek rather than irrelevant political considerations.

Does the prompt “leave space for contestation outside the statement”? For example, if you ask a candidate to describe their beliefs about “diversity, equity, and inclusion,” you run a greater risk of an impermissible political or ideological test than if you ask them to describe (say) what actions they have taken to help students from marginalized backgrounds to thrive. Applicants could truthfully describe relevant actions they’d taken and still dissent from the wisdom of DEI ideology without contradiction.

Soucek argues that the ability to help diverse students to thrive is directly relevant to a law professor’s core duties, not something irrelevant to legitimate educational or academic objectives. As for concerns that mandatory diversity statements might entrench orthodoxies of thought in academia, or create the perception that political forces or fear of job loss drives academic conclusions, he argues that those concerns, while real, are not unique to diversity statements—they also apply to the research and teaching statements that most job candidates must provide.

“Academic freedom, and the system of peer review that it is built upon, is a fragile business, always susceptible not just to outside interference, but also to corruption from within,” he wrote in his law-review article. But diversity statements strike me as more vulnerable to “corruption from within” than research statements. Although a hiring committee of chemists might or might not do a fair job evaluating the research of applicants, at least committee members credibly possess the expertise to render better judgments than anyone else—they know better than state legislators or DEI administrators or history professors or the public how to assess chemistry research.

[Read: What is faculty diversity worth to a university?]

On what basis can chemistry professors claim equivalent expertise in how best to advance diversity in higher education generally, or even in chemistry specifically? It wouldn’t be shocking if historians or economists or sociologists were better-positioned to understand why a demographic group was underrepresented in chemistry or how best to change that. Most hiring-committee members possess no special expertise in diversity, or equity, or inclusion. Absent empirically grounded expertise, academics are more likely to defer to what’s popular for political or careerist reasons, and even insofar as they are earnest in their judgments about which job candidates would best advance diversity, equity, or inclusion, there is no reason to afford their nonexpert opinions on the matter any more deference than the opinions of anyone else.

Ultimately, Soucek’s idealized regime of mandatory diversity statements—tailored to particular disciplines and judged by faculty members without outside political interference—strikes me as a theoretical improvement on the status quo but, in practice, unrealistic in what it presumes of hiring committees. Meanwhile, most real-world regimes of diversity statements, including those at campuses in the University of California system, lack the sort of safeguards Soucek recommends, and may not assess anything more than the ability to submit an essay that resonates with hiring committees. Whether an applicant’s high-scoring DEI statement actually correlates with better research or teaching outcomes is unclear and largely unstudied.

The costs of mandatory DEI statements are far too high to justify, especially absent evidence that they do significant good. Alas, proponents seem unaware of those costs. Yes, they know that they are imposing a requirement that many colleagues find uncomfortable. But they may be less aware of the message that higher-education institutions send to the public by demanding these statements.

Mandatory DEI statements send the message that professors should be evaluated not only on research and teaching, but on their contributions to improving society. Academics may regret validating that premise in the future, if college administrators or legislators or voters want to judge them based on how they advance a different understanding of social progress, one that departs more from their own—for example, how they’ve contributed to a war effort widely regarded as righteous.

Mandatory DEI statements send the message that it’s okay for academics to chill the speech of colleagues. If half of faculty members believe that diversity statements are ideological litmus tests, fear of failing the test will chill free expression within a large cohort, even if they are wrong. Shouldn’t that alone make the half of academics who support these statements rethink their stance?

Mandatory DEI statements send a message that is anti-pluralistic. I believe that diversity and inclusion are good. I do not think that universities should reward advancing those particular values more than all others. Some aspiring professors are well suited to advancing diversity. Great! The time of others is better spent mitigating climate change, or serving as expert witnesses in trials, or pioneering new treatments for cancer. Insofar as all academics must check a compulsory “advancing DEI” box, many will waste time on work that provides little or no benefit instead of doing kinds of work where they enjoy a comparative advantage in improving the world.

And mandatory DEI statements send the message that viewpoint diversity and dissent are neither valuable nor necessary—that if you’ve identified the right values, a monoculture in support of them is preferable. The scoring rubric for evaluating candidates’ statements that UC Santa Cruz published declares that a superlative statement “discusses diversity, equity, and inclusion as core values of the University that every faculty member should actively contribute to advancing.” Do academics really want to assert that any value should be held by “every” faculty member? Academics who value DEI work should want smart critics of the approach commenting from inside academic institutions to point out flaws and shortcomings that boosters miss.

Demanding that everyone get on board and embrace the same values and social-justice priorities will inevitably narrow the sort of people who apply to work and get hired in higher education.

In that sense, mandatory DEI statements are profoundly anti-diversity. And that strikes me as an especially perilous hypocrisy for academics to indulge at a time of falling popular support for higher education. A society can afford its college professors radical freedom to dissent from social orthodoxies or it can demand conformity, but not both. Academic-freedom advocates can credibly argue that scholars must be free to criticize or even to denigrate God, the nuclear family, America, motherhood, capitalism, Christianity, John Wayne movies, Thanksgiving Day, the military, the police, beer, penetrative sex, and the internal combustion engine—but not if academics are effectively prohibited from criticizing progressivism’s sacred values.

The UC system could advance diversity in research and teaching in lots of uncontroversial ways. Instead, in the name of diversity, the hiring process is being loaded in favor of professors who subscribe to the particular ideology of DEI partisans as if every good hire would see things as they do. I do not want California voters to strip the UC system of more of its ability to self-govern, but if this hypocrisy inspires a reformist ballot initiative, administrators will deserve it, regardless of what the judiciary decides about whether they are violating the First Amendment.

Hip-Hop’s Midlife Slump

The Atlantic

www.theatlantic.com › ideas › archive › 2023 › 07 › hip-hop-mainstream-evolution-puff-daddy-hamptons-white-party › 674575

In the summer of 1998, the line to get into Mecca on a Sunday night might stretch from the entrance to the Tunnel nightclub on Manhattan’s 12th Avenue all the way to the end of the block; hundreds of bodies, clothed and barely clothed in Versace and DKNY and Polo Sport, vibrating with anticipation. Passing cars with their booming stereos, either scoping out the scene or hunting for parking, offered a preview of what was inside: the sounds of Jay-Z and Busta Rhymes and Lil’ Kim. These people weren’t waiting just to listen to music. They were there to be part of it. To be in the room where Biggie Smalls and Mary J. Blige had performed. To be on the dance floor when Funkmaster Flex dropped a bomb on the next summer anthem. They were waiting to be at the center of hip-hop.

What they didn’t realize was that the center of hip-hop had shifted. Relocated not just to another club or another borough, but to a beachfront estate in East Hampton. Although Sundays at the Tunnel would endure for a few more years, nothing in hip-hop, or American culture, would ever be quite the same again.

It’s been 25 years since Sean Combs, then known as Puff Daddy, hosted the first of what would become his annual White Party at his home in the Hamptons. The house was all white and so was the dress code: not a cream frock or beige stripe to be seen. Against the cultural landscape of late-’90s America, the simple fact of a Black music executive coming to the predominantly white Hamptons was presented as a spectacle. That summer, The New York Times reported, “the Harlem-born rap producer and performer had played host at the Bridgehampton polo matches, looking dapper in a seersucker suit and straw boater. The polo-playing swells had invited him and he had agreed, as long as the day could be a benefit for Daddy’s House, a foundation he runs that supports inner-city children.”

To be clear, hip-hop was already a global phenomenon whose booming sales were achieved through crossover appeal to white consumers. Plenty of them were out buying Dr. Dre and Nas CDs. Combs was well known to hip-hop aficionados as an ambitious music mogul—his story of going from a Howard University dropout turned wunderkind intern at Uptown Records to a mega-successful A&R executive there was the kind of thing that made you wonder why you were paying tuition. But to those young white Americans, in 1998, he was just the newest rap sensation to ascend the pop charts. When Combs’s single “Can’t Nobody Hold Me Down” hit No. 1 on the Billboard Hot 100 the year before, it was only the tenth rap track to do so. The genre was still viewed as subversive—“Black music” or “urban music,” music that was made not for the polo-playing swells, but for the inner-city children whom their charity matches benefited.

Hip-hop was born at a birthday party in the Bronx, a neglected part of a neglected city. The music and culture that emerged were shaped by the unique mix of Black and Puerto Rican people pushed, together, to the margins of society. It was our music. I was a Nuyorican girl in Brooklyn in the ’80s and ’90s; hip-hop soundtracked my life. If Casey Kasem was the voice of America, on my radio, Angie Martinez was the voice of New York.

When I went to college in Providence, I realized all that I’d taken for granted. There was no Hot 97 to tune into. There were no car stereos blasting anything, much less the latest Mobb Deep. Hip-hop became a care package or a phone call to your best friend from home: a way to transcend time and space. It also became a way for the few students of color to create community.

You could find us, every Thursday, at Funk Night, dancing to Foxy Brown or Big Pun. Sundays, when the school’s alternative-rock station turned the airwaves over to what the industry termed “Black music,” were a day of revelry. Kids who came back from a trip to New York with bootleg hip-hop mixtapes from Canal Street or off-the-radio recordings from Stretch Armstrong and Bobbito Garcia’s underground show were lauded like pirates returning home with a bounty. We knew that hip-hop was many things, but not static. We understood that it was going to evolve. What we weren’t perhaps ready for was for it to go truly mainstream—to belong to everyone.

The media were quick to anoint Combs a “modern-day Gatsby,” a moniker Combs himself seems to have relished. “Have I read The Great Gatsby?” he said to a reporter in 2001. “I am the Great Gatsby.” It’s an obvious comparison—men of new money and sketchy pasts hosting their way into Long Island polite society—but a lazy one. Fitzgerald’s character used wealth to prove that he could fit into the old-money world. Combs’s White Party showcased his world; he invited his guests to step into his universe and play on his terms. And, in doing so, he shifted the larger culture.

Would frat boys ever have rapped along to Kanye West without the White Party? Would tech bros have bought $1,000 bottles of $40 liquor and drunkenly belted out the lyrics to “Empire State of Mind”? Would Drake have headlined worldwide tours? Would midwestern housewives be posting TikToks of themselves disinfecting countertops to Cardi B songs? It’s hard to imagine that a single party (featuring a Mister Softee truck) could redefine who gets to be a bona fide global pop star, but by all accounts, Puffy was no ordinary host.


The man had a vision. “I wanted to strip away everyone’s image,” Combs told Oprah Winfrey years after the first White Party, “and put us all in the same color, and on the same level.” That the level chosen was a playground for the white and wealthy was no accident. Upon closing a merger of his Bad Boy record label with BMG for a reported $40 million in 1998, he told Newsweek, “I’m trying to go where no young Black man has gone before.”

“It was about being a part of the movement that was a new lifestyle behind hip-hop,” Cheryl Fox told me. Now a photographer, she worked for Puffy’s publicist at the time of the first White Party. The Hamptons, the all-white attire: It was Puffy’s idea. But the white people, she said, were a publicity strategy. “He was doing clubs, and he was doing parties that did not have white people,” she told me. “I brought the worlds together, and then I was like, ‘You got to step out of the music. You can’t just do everything music.’” She meant that he should expand the guest list to include actors and designers and financiers—the kinds of people who were already flocking to the Hamptons.

In the end, “I had the craziest mix,” Combs told Oprah. “Some of my boys from Harlem; Leonardo DiCaprio, after he’d just finished Titanic. I had socialites there and relatives from down south.” Paris Hilton was there. Martha Stewart was there. “People wanted to be down with Puff,” Gwen Niles, a Bad Boy rep at the time, told me about that first party. “People were curious: Who is this rap guy?”

Hip-hop was already popular. The message the party sent was that hip-hop, and the people who made it, were also “safe.”

Rap music was for so long cast by white media as dangerous, the sonic embodiment of lawlessness and violence. This narrative was so sticky that it kept hip-hop confined to the margins of pop culture despite its commercial success.

Hip-hop didn’t always help itself out here. Artists screwed up in the ways artists in all genres do—with drug addictions, outbursts, arrests—but when it came to hip-hop, those transgressions were used to reinforce cultural stereotypes. Misogyny had been embedded in the lyrics of hip-hop nearly since its inception. A heartbreaking 2005 feature by Elizabeth Méndez Berry in Vibe exposed the real-world violence inflicted upon women by some of hip-hop’s most beloved artists, including Biggie Smalls and Big Pun. Homophobia in hip-hop perpetuated anti-queer attitudes, particularly in communities of color. And although lyrical battles have always been a thing, rhetorical fights never needed to become deadly physical ones.

This was the context in which Puffy headed to the Hamptons. Though only 28, he had baggage. While a young executive at Uptown in 1991, he had organized a celebrity basketball game at CUNY’s City College to raise money for AIDS charities. Tickets were oversold, and a stampede left nine people dead and many more injured. The tragedy stayed in the headlines for weeks. (Years later, Puffy would settle civil suits with victims.)

In 1993, Combs launched Bad Boy Records, with a roster of stars such as Biggie. The label met with immediate success, but also controversy, after a shooting involving the California rapper Tupac Shakur embroiled Bad Boy in a contentious battle between East and West. By the spring of 1997, Biggie and Tupac were dead—Biggie gunned down in Los Angeles in what appeared to be retribution for the killing of Tupac the year before. Biggie was shot while stopped at a red light; Combs was in another car in the entourage. (Neither murder has been solved.) That fall, Combs performed “I’ll Be Missing You,” his tribute to Biggie, live at MTV’s Video Music Awards. With a choir in the rafters, Combs danced through his grief. It was a moment of rebirth, of reinvention. Combs and the gospel singers wore white.

To be clear, most of what Puffy was making as an artist and producer in this era was accessible to a white, affluent fan base. These were the kind of tracks that sampled songs your parents would have danced to, spliced and sped up so that you wanted to dance to them now. Outside of “I’ll Be Missing You” and a few songs about heartbreak, many of the lyrics were about getting, having, and spending money.

But Puffy made possible the crossover explosion of more substantial artists such as Lauryn Hill and OutKast and Jay-Z, the first generation of hip-hop superstars.

You could also say that Puffy took a musical neighborhood—one that held history and heritage and layers of meaning—and gentrified it. Cleaned it up for whiter, wealthier patrons to enjoy, people who had no idea of what the “old ’hood” was about. Both things can be true.

The summer of 1998 was also the summer before my last year of college. Up in Providence, a local copycat to Hot 97 had cropped up and gained traction: WWKX, Hot 106, “the Rhythm of Southern New England.” Seemingly overnight, the frat houses added DMX to their rotation. A classmate—a white socialite from the Upper East Side—came back senior year with box braids, describing herself as a real “hip-hop head.” Funk Night became a campus-wide phenomenon, and then it ceased to exist. Nobody needed a hip-hop night when every night was hip-hop night.

In rap, the feeling was “I’m keeping it real. I’m gonna stay on this block,” Jay-Z recounts of this era in the Bad Boy documentary, Can’t Stop, Won’t Stop. “And our feeling was like, Yeah? I’ll see you when I get back.” Emotions around this ran hot at the time—the idea that hip-hop had left its true fans behind. But in the end, more of us were happy to see hip-hop conquer the world than were grouching in the corner about the good ol’ days.

In 2009, Puffy, by then known as Diddy, relocated his White Party to Los Angeles; hip-hop’s new mecca was the land of celebrity. The vibe, according to people who were there, just wasn’t the same. But hip-hop itself was moving on to bigger and bigger arenas. In 2018, hip-hop dominated streaming, and accounted for more than 24 percent of record sales that year. That same year, Eminem headlined Coachella, Drake dominated the Billboard Hot 100 for months, and Kendrick Lamar won a Pulitzer Prize.

Then something shifted again. This year isn’t just the 25th anniversary of the first White Party. It’s the 50th anniversary of hip-hop itself. And although it’s come a long way since Kool Herc deejayed a Bronx basement dance party, the genre appears to be suffering a midlife slump.

For the first time in three decades, no hip-hop single has hit No. 1 yet this year. Record sales are down. According to one senior music executive I spoke with, who asked to remain anonymous because she wasn’t authorized to speak, festivals have been reluctant to book rappers as headliners since 2021. That’s the year that eight people were crushed to death at the Astroworld Festival in Houston; two more died later of their injuries. The performer Travis Scott was accused (fairly or unfairly) of riling up the crowd. (Coachella hasn’t had a true hip-hop headliner since Eminem.)

But the other question is: Which headliners would they even book? Kendrick Lamar is winding down his 2022 tour. Nicki Minaj doesn’t have a new album coming out until the fall. Staple acts such as J. Cole probably won’t release an album this year at all. Megan Thee Stallion, who got shot a few years ago and has been feeling burned out by the industry, is taking a break from music. As the legendary artists Too $hort and E-40 wrote in this magazine, since 2018, violence has ended the life of at least one rapper every year. The careers of Gunna and Young Thug—two major acts on the rise—have stalled while they’ve been caught up in RICO charges in Atlanta. (Perhaps sensing an opportunity, Drake just announced that a new album and tour would be coming soon.)

Recently, The New York Times ran an article about how the Hamptons have lost their cool. Too affluent. Too old. Too out of touch. Maybe hip-hop, for the first time, is suffering from similar doldrums. But obituaries to the genre have been written before. It’s only a matter of time before a new Gatsby shows up, ready to throw a party.

The Best Background-Noise TV

The Atlantic

www.theatlantic.com › newsletters › archive › 2023 › 07 › background-noise-tv-recommendation-inkmaster-zelda › 674594


This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Welcome back to The Daily’s Sunday culture edition, in which one Atlantic writer reveals what’s keeping them entertained.

Today’s special guest is Atlantic contributing writer Ian Bogost, who is also the director of the film-and-media-studies program at Washington University in St. Louis. He’s recently written about how the first year of AI college ended in ruin, and whether Elon Musk and Mark Zuckerberg are jocks or nerds.

Ian is currently struggling to get into a new video game his friends love, learning how to tattoo (sort of) with the help of a reality-TV show, and relishing the complexity of the kids’ show Bluey.

First, here are three Sunday reads from The Atlantic:

“Race neutral” is the new “separate but equal.”

Stop firing your friends.

The comic strip that explains the evolution of American parenting

The Culture Survey: Ian Bogost

The entertainment product my friends are talking about most right now: I run in video-game-design circles, and the biggest recent release in games is The Legend of Zelda: Tears of the Kingdom. This title has two features that really light gamers up: First, it’s a new Zelda game by Nintendo, and that franchise is 37 years old and hugely popular, which makes a lot of people very happy. Second, the new game is absolutely massive, and the player can do all manner of things in it, including constructing elixirs from raw ingredients and fabricating machinery and vehicles.

Unfortunately, the only tears shed in my kingdom are those of boredom. I used to love Zelda, but I just can’t get into these games anymore. Partly it’s because there’s so much lore to keep track of—the creators have done fantasy-narrative somersaults to keep justifying new titles. But partly it’s because the in-game creativity that so many players seem to love leaves me cold. I find it remarkable when people make huge carnival-wheel vehicles to traverse seemingly impassable geology or dog-petting machines to attempt to endear themselves to the in-game pooches. But hell if I want to do this myself.

I think it’s because my work demands creative production. I have to be—I get to be!—creative in my job(s). But that means I absolutely do not want to be creative for my leisure. [Related: Coming of age with The Legend of Zelda]

The television show I’m most enjoying right now: Television used to be different from cinema. It was more ambient, taken in along with breakfast or while vacuuming, pursued as a ritual activity more than a narrative one. I miss that. When we get exhausted by high-quality scripted shows, my wife and I turn to a season of Ink Master, a tattooing-competition show.

This show has been around on various networks since 2012, but I’d never watched it until a couple of years ago. All 14 seasons stream on Paramount Plus. I love reality television, and anyone who claims not to is lying or deluded. But I find special affinity with the shows about creative practice. I don’t want to craft things in video games, but I love watching people perform a craft, especially one I’m not familiar with or adept in.

Lots of shows in this genre are popping up these days. The Great British Baking Show is great but has become a little too wholesome, to the point of being cloying; The Great Pottery Throw Down is a touch too emotionally overwrought for its decidedly mid subject, ceramics; Blown Away, a glassblowing show, is a bit too fine-arts cosmic for dumb television; Forged in Fire (bladesmithing—everything has a reality-competition show) is overly edgelord-creeptastic for me. Ink Master strikes a good balance.

The big problem with these shows is that they never really explain anything. They’ll introduce you to terms of art, but not to technique or style. I guess the producers feel that that would be boring for most viewers—better to court drama between competitors instead. No need for that, though; it’s why we have Selling Sunset. [Related: The Great British Baking Show’s technical challenges are a scourge.]

A quiet song that I love, and a loud song that I love: The quiet song is hard, and I think I know why: Today, people do a lot of ambient listening—headphones while working or studying, whole-house audio in the evenings, a portable speaker on the deck or by the pool. Brian Eno had to coin the term ambient music because the concept of listening to enhance an environmental situation wasn’t codified, despite precedents. Now, thanks to streaming-music services and their playlists, it’s super easy to find enhancements to any mood or vibe. But that also means that individual songs become de-emphasized, for better and worse. My pick for a quiet song is really a pick for a quiet playlist: The Synthwave—Night Drive playlist on Spotify. Put this on in the car next time you need to run to Target or CVS after dark, and it will turn your errand into a moody 1980s vaporwave antihero affair.

The loud song is easier: It’s definitely Metallica, probably “Battery” but maybe “Master of Puppets.” Metallica has enjoyed a bit of a pop-culture revival in recent years, with notable features in shows such as Stranger Things and Billions. But those mainstream resurrections make it easy to forget just how fringe heavy-metal music was in its heyday. If you listened to Metallica or Megadeth or Queensrÿche in the 1980s or early ’90s, you were socially ostracized for it. This was not a polite or accepted thing to do. Glam metal (like Poison) and hard rock (like Guns N’ Roses) somewhat tamed that sentiment, but they did so at a cost—a lost edge. I can’t believe I’m calling Guns N’ Roses more palatable, but isn’t that the truth? It’s revisionist to pretend that heavy metal was just a normal, mainstream thing. I guess it’s good that it became so, but it’s also a little sad to forget the forces that pushed people to enjoy it at the time. [Related: Five lessons in creativity from Metallica]

Something delightful introduced to me by a kid in my life: It’s definitely Bluey, an animated series from Australia about a family of anthropomorphized heeler dogs and their dog friends. The titular Bluey is a blue-heeler girl, and the show follows her antics along with those of her younger sister, Bingo (red heeler), and their parents, Bandit and Chilli.

The show is both charming and problematic, and maybe that’s what makes it such a draw. Bandit can exemplify the best kind of fatherhood, but he can also be kind of an asshole (like when he doesn’t tell Bingo he’s leaving the country for six weeks? And leaving tomorrow?). Bluey is creative but also a bit of a hellion who gets her way even when she doesn’t deserve it, and Bingo is existentially bereft and tragically misunderstood by her parents and sister. It’s refreshing to see such layers of honesty and complexity in a show for very young children, who lead lives far knottier and more layered than adults give them credit for.

A poem, or line of poetry, that I return to: A fragment by the seventh-century-B.C.E. Greek lyric poet Archilochus. Here it is:

εἰμὶ δ’ ἐγὼ θεράπων μὲν Ἐνυαλίοιο ἄνακτος

καὶ Μουσέων ἐρατὸν δῶρον ἐπιστάμενος.

And thank you for giving me a reason to exercise my comparative-literature doctorate by offering this brand-new, translated-just-for–The Atlantic rendition:

I am war’s wingman

And art’s willing puppet.

Here’s a more typical, literal take:

I am a servant of lord Ares,

and of the Muses, familiar with their lovely gift.

That’s all that history preserved of this poem. We don’t know if there was more of it. That’s why classicists call it a fragment.

Some of them have read these lines as striking in their paradox, others as utterly normal—war and poetry were complements for the ancients. Whatever the case, these two lines are burned into my brain for some reason. I think in part because Archilochus was easy and fun to read in Greek, unlike the Homeric epics from a century or so before our man Archie here. But also because here’s this dude from almost 2,700 years ago who feels so contemporary: the mercenary with a soft side, scribbling lines like these about reality and expectation, and others about getting drunk enough to fight, because how else would you find the will to bother? Very relatable. People just aren’t so different now than they ever were, or ever will be.

The Week Ahead

Owner of a Lonely Heart, a memoir by Beth Nguyen that explores the author’s escape from Saigon at the end of the Vietnam War—and the mother she left behind (on sale Monday)

Joy Ride, starring Stephanie Hsu and Ashley Park, a raunchy comedy of self-discovery set against a business trip to Asia (in theaters Wednesday)

Kizazi Moto: Generation Fire, a pan-African sci-fi animated series executive-produced by Peter Ramsey of Spider-Man: Into the Spider-Verse (debuts on Disney+ this Wednesday)

Essay


Dave Grohl’s Monument to Mortality

By Jeffrey Goldberg

Twenty-nine years ago, Dave Grohl, then the drummer for Nirvana, lost his singer, the band’s brilliant and vexed leader, Kurt Cobain. Last year, Grohl, now the leader of Foo Fighters, lost his drummer, the dazzling Taylor Hawkins. And then, a few months later, Grohl’s mother, Virginia, died. She was, among other things, the ne plus ultra of rock moms, a teacher by profession whose support for her charismatic, punk-loving, unscholarly (her gentle word) son was unfaltering and absolute.

One blow, then another. It was all a bit much. Grohl is an unreasonably buoyant person, but it was hard to imagine how he would pull himself out of a trough dug by such concentrated loss.

But he did. And he did so by writing his way out.

Read the full article.

More in Culture

Olivia Rodrigo’s big, bloody return

The juicy secrets of everyday life

It’s hard to be mad at Indiana Jones.

Can movie fans ever have a nice thing?

Short story: “The Posting”

Sara Freeman on how marriages implode

A strikingly honest reality show about sex, money, and health

Thank goodness for Jennifer Lawrence’s R-rated rom-com.

Catch up on The Atlantic

Russia has reached a dead end.

Scientists found ripples in space and time. And you have to buy groceries.

America’s most popular drug has a puzzling side effect. We finally know why.

Photo Album

A person walks through part of the exhibition “You, Me, and the Balloons,” by Yayoi Kusama, at Aviva Studios, in Manchester, England. (Christopher Furlong / Getty)

An Eid al-Adha festival in India, protests in France, and more in our editor’s selection of the week’s best photos.

Katherine Hu contributed to this newsletter.