
Everyone’s Over Instagram

The Atlantic

www.theatlantic.com/technology/archive/2022/11/instagram-tiktok-twitter-social-media-competition/672305

Earlier this fall, while riding the subway, I overheard two friends doing some reconnaissance ahead of a party. They were young and cool—intimidatingly so, dressed in the requisite New York all black, with a dash of Y2K revival—and trying to figure out how to find a mutual acquaintance online.

“Does she have Instagram?” one asked, before adding with a laugh: “Does anybody?”

“I don’t even have it on my phone anymore,” the other confessed.  

Even just a couple of years ago, it would have been unheard-of for these 20-something New Yorkers to shrug off Instagram—a sanctimonious lifestyle choice people would have regretted starting a conversation about at that party they were headed to. But now it’s not so surprising at all. To scroll through Instagram today is to parse a series of sponsored posts from brands, recommended Reels from people you don’t follow, and the occasional picture from a friend that’s finally surfaced after being posted several days ago. It’s not what it used to be.

“Gen Z’s relationship with Instagram is much like millennials’ relationship with Facebook: Begrudgingly necessary,” Casey Lewis, a youth-culture consultant who writes the newsletter After School, told me over email. “They don’t want to be on it, but they feel it’s weird if they’re not.” In fact, a recent Piper Sandler survey found that, of 14,500 teens surveyed across 47 states, only 20 percent named Instagram their favorite social-media platform (TikTok came first, followed by Snapchat).

Simply being on Instagram is a very different thing from actively engaging with it. Participating means throwing pictures into a void, which is why it’s become kind of cringe. To do so earnestly suggests a blithe unawareness of your surroundings, like shouting into the phone in public.  

In other words, Instagram is giving us the ick: that feeling when a romantic partner or crush does something small but noticeable—like wearing a fedora—that immediately turns you off forever.

“People who aren’t influencers only use [Instagram] to watch other people make big announcements,” Lee Tilghman, a former full-time Instagram influencer, told me over the phone. “My close friends who aren’t influencers, they haven’t posted in, like, two years.”

As is always the case, the ick came about quite suddenly—things were going great for Instagram, until they just weren’t. In 2014, the app hit 300 million monthly active users, surpassing Twitter for the first time. The Instagram Stories feature, a direct rip-off of Snapchat, was introduced in August 2016 and outpaced the original just one year later. But although Instagram now has 2 billion monthly users, it faces an existential problem: What happens when the 18-to-29-year-olds who are most likely to use the app, at least in America, age out or go elsewhere? Last year, The New York Times reported that Instagram was privately worried about attracting and retaining the new young users who would sustain its long-term growth, users whose growing shopping potential is catnip to advertisers. TikTok is already more popular among young American teens. Plus, a series of algorithm changes—and some questionable attempts to copy features from other apps—have disenchanted many of the users who are sticking around.

Over the summer, these frustrations boiled over. An update that promised, among other things, algorithmically recommended video content that would fill the entire screen was a bridge too far. Users were fed up with watching the app contort itself into a TikTok copycat that prioritized video and recommended posts over photos from friends. Even celebrities such as Kylie Jenner and Chrissy Teigen spoke up.

“Make Instagram Instagram Again” read a graphic, created by the photographer Tati Bruening, that was shared by Jenner on Instagram Stories and liked by more than 2 million users.

“It’s not just that I suck at making videos,” Teigen wrote on Twitter in a back-and-forth with Instagram head Adam Mosseri. “It’s that I don’t see my actual friend’s posts and they don’t see mine.”

Instagram ultimately walked back some of its more controversial changes—those screen takeovers, for one—but the remaining features that were meant to bolster the platform’s growth may not be paying off. Internal documents obtained by The Wall Street Journal show that Instagram users spend 17.6 million hours a day watching Reels, Instagram’s TikTok knockoff, compared with the 197.8 million hours people spend watching TikTok every day. The documents also revealed that Reels engagement has declined by 13.6 percent in recent months, with most users generating “no engagement whatsoever.” When reached for comment, a spokesperson for Instagram said this report referred to a “moment-in-time snapshot blown out of proportion.” They pointed to Meta’s recent earnings call, where CEO Mark Zuckerberg shared that Reels plays have seen 50 percent growth in the past six months.

Instagram may not be on its deathbed, but its transformation from cool to cringe is a sea change in the social-media universe. The platform was perhaps the most significant among an old generation of popular apps that embodied the original purpose of social media: to connect online with friends and family. Its decline is about not just a loss of relevance, but a capitulation to a new era of “performance” media, in which we create online primarily to reach people we don’t know instead of the people we do. That has broader implications for Instagram’s most significant by-product: influencers.

[Read: The age of social media is ending]

People have found ways to get paid for their content online since long before Instagram existed. But the app certainly led to an aesthetic shift, toward pink background walls and flat-lay photography, and facilitated the rise of the modern content creator. Lavish brand deals, in which an influencer promotes a brand’s product to their audience for a fee, have been known to pay anywhere from $100 to $10,000 per post, depending on the size of the creator’s following and their engagement. Now Tilghman, who became an Instagram influencer in 2015 and at one point had close to 400,000 followers, says she’s seen her rate go down by 80 percent over the past five years. The market’s just oversaturated.

In lieu of Instagram, Tilghman turned to Substack, where she writes the paid publication Pet Hair on Everything. She still posts on Instagram, but now mostly as a way to redirect her 246,000 followers to her writing. The author Jessica DeFino, who joined Instagram in 2018 on the advice of publishing agents, similarly began stepping back from the platform in 2020, feeling overwhelmed by the constant feedback of her following. She has now set up auto-replies to her Instagram DMs: If one of her 59,000 followers sends her a message, they’re met with an invitation to instead reach out to DeFino via email.

Of course, these are bad times for many social-media platforms. Facebook and Snap are struggling, too, to say nothing of Twitter. “At least historically, all social media platforms eventually become irrelevant and obsolete, but I’m optimistic that it won’t always be the case,” Lewis said. “I don’t know that Instagram has what it takes … to maintain relevancy as long as, like, email, but I do think a social media platform could pull this off.”

Transformation is natural for social platforms (just look at Tumblr). Instagram’s fading fortunes might mean not the end of the app, but rather a reappraisal of our relationship to it. LaTonya Yvette, a lifestyle blogger who has been on Instagram for close to 12 years, says these changes have always been part of the deal, and that Instagram’s benefits to her career over the years far outnumber the frustrations.

“I’ve always looked at [Instagram] as an extension of my storytelling,” she told me over email. “Because ultimately it should be … a tool in someone’s artistic, social, political and/or business toolbox, not the only avenue.”

An editor found DeFino through her social-media audience, and DeFino predicts that she’ll return to Instagram to promote her upcoming book this spring.

But would she get back on Instagram as a regular user? Only if she “created a private, personal account — somewhere I could limit my interactions to just family and friends,” she says. “Like what Instagram was in the beginning, I guess.”

That is if, by then, Instagram’s algorithm-driven, recommendation-fueled, shopping-heavy interface would even let her. Ick.

How Should We Deal With High-Profile Anti-Semites?

The Atlantic

www.theatlantic.com/newsletters/archive/2022/11/how-should-we-deal-with-high-profile-anti-semites/672306

This is an edition of Up for Debate, a newsletter by Conor Friedersdorf. On Wednesdays, he rounds up timely conversations and solicits reader responses to one thought-provoking question. Later, he publishes some thoughtful replies. Sign up for the newsletter here.

Question of the Week

What is the best response to anti-Semitism in America?

Send your responses to conor@theatlantic.com or simply reply to this email.

Conversations of Note

Although I believe we’re living through a period of overzealous speech policing, there are still a few questions I regard as settled, a few associated speech taboos I value, and occasional instances when I believe that a public figure has gone beyond the pale––Roseanne Barr, Ralph Northam, Rush Limbaugh––and that some sort of counterspeech is necessary and desirable.

“For most of my adult life, antisemites—with exceptions like Pat Buchanan and Mel Gibson—have lacked status in America,” Michelle Goldberg writes in her most recent column for The New York Times. “The most virulent antisemites tended to hate Jews from below, blaming them for their own failures and disappointments.” But now, she laments, “anti-Jewish bigotry, or at least tacit approval of anti-Jewish bigotry, is coming from people with serious power,” arguably including a former president.

As Goldberg put it:

There is no excuse for being shocked by anything that Donald Trump does, yet I confess to being astonished that the former president dined last week with one of the country’s most influential white supremacists, a smirking little fascist named Nick Fuentes. There’s nothing new about antisemites in Trump’s circle, but they usually try to maintain some plausible deniability, ranting about globalists and George Soros rather than the Jews.

Fuentes, by contrast, is overt. “Jews have too much power in our society,” he recently wrote on his Telegram channel. “Christians should have all the power, everyone else very little.” Fuentes was brought to Trump’s lair by Ye, the rapper formerly known as Kanye West, who was evidently serious when he threatened to go “death con 3” on the Jews last month.

Since the publication of Ye’s anti-Semitic tweet, and his subsequent suspensions from Twitter and Instagram, I’ve been pondering what the best response would have been, as someone who values the taboo against anti-Semitism but doesn’t know how best to conserve it. I sympathize with those who believe that loudly denouncing anti-Semitic comments is a moral imperative, particularly when they come from a famous person whose creative work has been celebrated for artistic excellence, which confers a measure of influence. Then again, I’d hate to render public discourse captive to the most idiotic ravings of a provocateur who thrives on attention, so I also get the impulse to ignore rather than focus on West’s words.

[Read: The gift of civil discussion]

Or consider Kyrie Irving, the Brooklyn Nets star who posted a link on social media to the anti-Semitic film Hebrews to Negroes: Wake Up Black America! As fallout began, Irving declared, “I’m not going to stand down on anything I believe in. I’m only going to get stronger because I’m not alone. I have a whole army around me.” Then the Nets suspended him, and Nike announced that it would suspend its relationship with Irving. On his Substack, the sports journalist Ethan Strauss provided additional context:

Kyrie lobbied for his peers to shut down the 2020 NBA playoff bubble and potentially squander billions for the cause of social justice. He faced some criticism over this, but then some praise for prescience after players actually did go on a wildcat strike during the bubble.

Later on, more controversially, Irving refused to get a Covid vaccine, despite New York City’s quite onerous vax mandate. He was criticized rather ruthlessly for his choice in the media. Respectable outlets didn’t ask many questions about civil liberties concerns and those who had praised Irving’s past outspokeness [sic] were all too happy to suddenly dismiss him as a whack job. But New York has since ditched its mandates and history might view Kyrie’s move more favorably. Increasingly, it’s more accepted to say that young and healthy people shouldn’t be forced to adjust their immune systems to the whims of mayoral decree.

This is prologue for yet another situation where Irving is up against a consensus that he’s being insane … I do believe he’s being insane and also that he’s not a rational actor … His opinion generation process, according to those who’ve worked with him, is scrolling through hours and hours of Instagram videos absent much discernment. Irving quite literally wasn’t convinced the world was round. This wasn’t a put-on or a troll, but a genuine opinion according [to] those who know him. I’m noting the brief history to establish why it might be difficult to move Irving off a position, crazy as that position might be, even if Nike is cutting ties with him.

A House of Strauss reader pointed out that, back in 2001, then–New York Knicks player Charlie Ward made headlines after saying Jews had blood on their hands for killing Jesus and were stubborn. Rather than suspend the player, then–NBA Commissioner David Stern issued the following statement:

Ward would have been better off not to have uttered his uninformed and ill-founded statements. But I do not wish to enhance his sense of martyrdom by penalizing him for giving them public voice. He will have to accept the reactions and judgments of fans and all fair-minded people who have been offended.

Was Stern’s course prudent because it emphasized the wrongheadedness of Ward’s statement without making a free-speech martyr of a well-known athlete who was seemingly engaged in anti-Semitism? Or was there a better way? It’s hard to know for sure, but here’s a New York Post account of what happened after Irving’s suspension ended and he returned to the NBA:

Hundreds of members of a Black Jewish Israelite group chanted “we are the real Jews” as they descended on Brooklyn’s Barclays Center during pro-Kyrie Irving marches, videos show. A massive line of followers, all donning shirts of the group “Israel United in Christ,” were captured Sunday bellowing “we’ve got some good news” and “we are the real Jews,” according to footage posted on social media, which has since gone viral.

I’m unsure as to whether that could have been avoided, or whether it matters, and if so, how much it matters. And I was interested to see Kareem Abdul-Jabbar wrestling with similar questions in a conversation with Bari Weiss, who asked the NBA’s all-time scoring leader, “What do you think is the right response to a celebrity or a star athlete who makes antisemitic statements?”

Abdul-Jabbar’s answer:

There should never be a one-size-fits-all punishment, because everything depends on what is said and the reaction from the celebrity when called out. In Irving’s case, he refused to acknowledge the damage he was causing and went on to cause more. Sometimes, celebrities might say something harmful without realizing it, but when it’s pointed out are immediately apologetic. Then, nothing should happen. We all make mistakes, and we should be supportive of those who are willing to learn. I think it would be very helpful for sports organizations to offer presentations in critical thinking to their players. Too many players either didn’t learn this in college, or didn’t attend college where they might have learned it. In the end, this might save teams a lot of money and bad publicity, because it might eliminate some of the illogical prejudice being posted.

Before expressing any more of my own opinions on this subject, I am looking forward to reading and pondering whatever thoughts all of you have to offer. Do send me an email this week.

Apple’s Rotten Update

In Quartz, Zachary M. Seward argues that Apple “hobbled a crucial tool of dissent in China weeks before widespread protests broke out.” He explains:

Anti-government protests flared in several Chinese cities and on college campuses over the weekend. But the country’s most widespread show of public dissent in decades will have to manage without a crucial communication tool, because Apple restricted its use in China earlier this month. AirDrop, the file-sharing feature on iPhones and other Apple devices, has helped protestors in many authoritarian countries evade censorship. That’s because AirDrop relies on direct connections between phones, forming a local network of devices that don’t need the internet to communicate. People can opt into receiving AirDrops from anyone else with an iPhone nearby.

That changed on Nov. 9, when Apple released a new version of its mobile operating system … Rather than listing new features, as it often does, the company simply said, “This update includes bug fixes and security updates and is recommended for all users.” Hidden in the update was a change that only applies to iPhones sold in mainland China: AirDrop can only be set to receive messages from everyone for 10 minutes, before switching off. There’s no longer a way to keep the “everyone” setting on permanently on Chinese iPhones. The change, first noticed by Chinese readers of 9to5Mac, doesn’t apply anywhere else.

You Say You Want a Revolution?

Noah Millman reflects on examples of oppressive regimes being overthrown and comes to the gloomy conclusion that liberals in Iran and China are unlikely to triumph over their respective regimes:

The more I think about it, the harder it is for me to see how the Iranian or Chinese people succeed through popular protest alone. They are not challenging regimes that are thin and weak, but ones that are thick and entrenched, and fully supported by military and paramilitary organs. If they are increasingly inward-looking, oppressive and incompetent, that is in part because they have both taken dramatic steps in recent years to purge themselves of liberal or reformist elements in favor of lock-step loyalists, which leaves less room for the kind of factional split that could give a popular revolution crucial leverage inside the regime. Nor is either country on the brink of financial collapse. Iran is already massively sanctioned and yet continues to function, which has arguably increased the regime’s hold on the country rather than weakening it. China is far too large and prosperous to strangle from without. It’s not inappropriate to describe both regimes as somewhat Brezhnevite, but it’s worth remembering that Brezhnev’s Soviet Union did not collapse from its own contradictions, but successfully crushed liberal revolts in Czechoslovakia and Poland and sent troops to Afghanistan to prop up an unpopular communist regime there.

That’s not a happy conclusion for me to come to, nor is it for anyone who loves human freedom. It’s much more pleasant to believe that these oppressive regimes, having made catastrophic errors, are now about to face their just deserts. But politics is not a morality play, and oppressive and unpopular regimes—even ones whose poor decisions are steadily eroding their nations’ power and well-being—can last for a long time if they can keep enough key centers of power on their side. So far, that’s something both countries—certainly China, but Iran as well, at least so far—have managed to do.

I hope I’m wrong. I hope that people power can triumph. But from where I sit, the most likely scenario for a successful Iranian or Chinese revolution is for either country to start a major war with an adversary that they then go on to lose, badly. That’s the kind of mistake that can shatter your army or turn it against you, and if that happens the end of a regime can come quickly. It’s also a mistake that both countries have so far been wise enough to avoid making. Given the terrible human costs of any such war, we ought to hope that they continue to avoid it, even though it makes their odious regimes’ survival more likely.

Casinos Don’t Enrich Cities

Nicole Gelinas sets forth that argument in City Journal while arguing against relying on them to solve New York City’s economic woes:

Casinos don’t have much of an economic-multiplier effect for two reasons. First, the house always wins: casinos are extractive entertainment. People who lose money gambling have less to spend at competing attractions, such as restaurants or sports stadia …

Second, casinos do not create, on balance, high-paying jobs. Nationwide, the average gambling-industry worker earns $18 in mean hourly wages, federal data show—not much above New York’s statutory minimum wage of $15. Gambling dealers earn $32,450 annually; gambling managers earn $89,190. The average private-sector worker in Las Vegas, the nation’s gambling capital, earns just $992 weekly, below the national average of $1,116 … Unlike the typical New York banker or white-collar manager, the average casino worker does not command the personal spending power to support jobs across other industries. Nor does the casino worker earn enough to be a significant source of state or city tax revenue in a highly progressive state dependent on top earners …

Thus, casinos don’t save cities economically.

[Read: The case for building more housing]

Provocations of the Week

Kathryn Mangu-Ward makes “the case for space billionaires” at Reason.

And at The Permanent Problem, Brink Lindsey muses on the state of capitalism in a world without competition from alternative systems of economic organization. His take on the matter:

Since the fall of communism 30 years ago, capitalism for the first time in its existence lacks any competition from a rival system … Virtually the entire inhabitable surface of the globe has been claimed by territorially exclusive states using the same basic forms of governance. Some two-thirds of working-age people worldwide work for money income, most as wage employees of private business enterprises. The majority of people now live in cities constructed from the same building materials and shaped by the same architectural styles. Everywhere you can find people wearing the same kinds of clothing, eating the same food, driving the same cars, watching the same movies, and obsessing about the same media celebrities. For all of its history until recently, though, capitalism had to contend with actually existing alternatives. Capitalism emerged against the backdrop of aristocratic agrarianism, the legacy system that it gradually displaced and toppled. And well before the agrarian order breathed its last, a new rival arose in the form of the socialist movement. While World War I finally toppled the old agrarian power structures, it simultaneously brought socialism to power in Russia …

Capitalism’s coexistence with rival systems afforded it opportunities, and subjected it to pressures, that enhanced its powers as an engine of social progress. When industrialization was first taking off, capitalists took advantage of the huge “reserve army of labor” in the peasantry to keep wages hovering at subsistence levels. And when industrializing economies were beset with periodic crises and slumps, the capitalist system could avoid chaos because the countryside acted as a kind of informal social welfare system, absorbing displaced workers temporarily until demand for their services recovered …

Competition with socialism forced the adoption of major institutional innovations that made it possible for the system to survive that bumpy ride intact. Specifically, the advanced capitalist economies were able to avoid socialist revolution only by absorbing significant amounts of the socialist program … Capitalism’s social-democratic makeover preserved the fundamental market order while introducing unionization to strengthen workers’ bargaining power and social insurance to soften the market’s downsides. This partial co-optation of socialism rejected the doctrine’s fundamental error (i.e., radical hostility to markets) while internalizing its key insight—that the existing rules of economic life, far from being natural and necessary, are conventions that can be altered to improve society’s overall functioning. Success often brings new difficulties in its wake, and it seems to me that capitalism’s elimination of all rivals presents a genuine problem.

Teen who allegedly confessed on Instagram to killing girl told police it was an accident

CNN

www.cnn.com/2022/11/29/us/joshua-cooper-pennsylvania-instagram-killing-complaint/index.html

The 16-year-old Pennsylvania boy who allegedly confessed over Instagram video chat to killing a young girl told police when he was taken into custody that "it was an accident," according to a criminal complaint.

Fake Facebook and Instagram accounts promoting US interests had ties to US military, Meta says

CNN

www.cnn.com/2022/11/22/politics/meta-report-fake-accounts-us-military/index.html

People "associated with the US military" were likely behind a network of phony Facebook and Instagram accounts that promoted US interests abroad by targeting audiences in Afghanistan and Central Asia, Facebook parent firm Meta said Tuesday.

Fashion designer Raf Simons shutters his influential label

CNN

www.cnn.com/style/article/raf-simons-closes-label/index.html

Belgian fashion designer Raf Simons will shutter his eponymous fashion brand after 27 years, he announced on Monday. Simons shared the unexpected news on Instagram, writing that his label's recently unveiled spring-summer 2023 collection would be its last.

Your Home Belongs to Renovation TV

The Atlantic

www.theatlantic.com/technology/archive/2022/11/lifestyle-media-home-improvement-trends-obsession/672168

In the new Netflix horror series The Watcher, which follows a family as a stalker turns their new suburban dream home into a nightmare, the first boogeyman the viewer meets is the home’s Carrara-marble countertops. The house is, by all indicators, an impeccable domestic fantasy at the time of purchase, and its new owners had to empty their savings and investment accounts to fend off rival bidders and afford the final price. But the family finds the house’s gleaming white Italian counters so offensive—so five years ago—that they take out an additional loan in order to remove them immediately.

The series edges into absurdity—in a bit of inspired casting, Jennifer Coolidge plays an aggressively divorced, Mercedes-driving New Jersey real-estate agent—but the family’s immediate desire to renovate an already lovely home is played completely straight. And for good reason: Real people do this all the time now. They do it on instructional HGTV shows, on social media, in publications such as Domino and Dwell and Architectural Digest. On real-estate TV, brokers and buyers wince and gag over dark cabinets and high-shine brass light fixtures and white appliances, all relics of trends past. Houses with idiosyncrasies or personality—or even just somewhat dated but easily changed design flourishes, such as a red accent wall—are mocked relentlessly, only to be turned into pristine, camera-ready monuments to sterility. Often, the transformations involve explicit calculations about how much has theoretically been added to a home’s potential market value.

Some of the dwellings featured do genuinely need some repairs and improvements in order to be livable, comfortable homes, but the messages about aesthetic trends and social acceptability that come with all this renovating go far deeper than that. Reno-media is incredibly popular—HGTV is regularly a top-five cable channel—and its growing popularity has coincided with a huge increase in actual renovations. In the 1990s, American homeowners spent an average of more than $90 billion annually on remodeling their homes. By 2020, it was more than $400 billion. For homeowners, pressure to keep up with the Joneses has reached a logical extreme. Everywhere you look, there are new reasons to be unhappy with your house, and new trends you can follow to fix it.

A home plays two essential roles for many people: It’s the place you live your day-to-day life, and it’s the single most important asset you’ll ever have. Housing has served these dual purposes for much of the country’s history, but over the past 50 years in particular, as rising home values have far outpaced wage growth, Americans have begun to stake their financial future even more heavily on their home. If you’re one of the nearly two-thirds of adults in this country who own a home, it’s pretty likely that its potential sale price is a major factor in your long-term financial stability, even if you don’t plan to sell anytime soon.

In theory, renovation is a way to safeguard this stability. America doesn’t build enough new homes to keep up with housing demands, and the homes constructed during the building booms of decades past need to be maintained, even if their owners have no desire to make any aesthetic changes at all. Old houses spring leaks. Mice get in and chew up wiring. Vinyl flooring and laminate countertops chip and peel. So people get to work, learning about tiling options for kitchen backsplashes and figuring out which walls are load-bearing and sifting through an endless sea of contractors to keep their house up to date, hopefully appealing to someday-future buyers in the process.

In this context, the creation and growth of businesses such as HGTV, Dwell, and Home Depot during the 1990s and 2000s makes perfect sense: By then, many of the homes built during the postwar suburban expansion of the 1950s and ’60s had been sold off to new owners, and they needed a little work. While you’re opening up walls and ripping up floors, why not make some other improvements? Over time, popular shows such as Fixer Upper and Property Brothers have pushed this ethos to an extreme, doing the math on-screen to show viewers how much money some strategic renovations can theoretically make them in the long run. Anyone who dips even a toe into the home-renovation market will quickly encounter assurances that, say, an open kitchen or spa-like bathroom won’t just pay for itself but may very well turn a profit on your investment by maximizing your property value. If your home is the financial bedrock on which your life is built, then not making these changes is just leaving money on the table. If you don’t do it, a house flipper will, and they’ll make all the profit. Don’t you want to be able to retire?

If you buy at the right price, make the right crowd-pleasing changes, keep your budget low, and get a little lucky, some renovations really will pay for themselves and beyond—this is the entire principle on which flipping functions. But thanks in large part to the ubiquity of shelter media, this way of thinking about our homes now animates many people’s behavior regardless of whether they have any desire to sell their home quickly. Julia Miller, an interior designer who owns Yond Interiors in Minneapolis, told me that her middle-income clients almost always choose to renovate because they have saved up money to address real, functional problems in their home, and making aesthetic changes at the same time is a two-birds-with-one-stone situation. While the contractors are there and the house is a barely livable mess and the money has been saved up for what is usually a once-in-a-lifetime project for these clients, they want to make the most of it. The process, however, can be painful. These clients tend to consume a ton of home-design media and become overwhelmed by trying to make all of the perfect decisions, Miller said, eventually losing sight of whether they even like the of-the-moment updates they’re requesting. “They see too much, and then they also don’t trust themselves,” she told me. “A lot of them come to us feeling paralyzed.”

Miller’s wealthier clients have it easier. They’re more likely to renovate just because they want their home to be fully aligned with their personal preferences, she said, and the money required to make that happen isn’t a major concern for them—and neither is fully recouping that investment at resale. But with less wealthy clients, Miller told me that a significant part of her job is encouraging them to step away from HGTV, Instagram, and Pinterest in order to figure out what it is they might actually enjoy once they have to live in the home they’re creating.

New research has begun to suggest that the connection that Miller has noticed between these clients and shelter media isn’t just anecdotal. Annetta Grant, a professor at Bucknell University who studies the home-renovation market, recently co-authored an ethnography on how home-reno media has changed people’s relationship to their home. She and her fellow researcher, Jay Handelman, conducted extensive interviews with 17 people in the process of renovating their home, attended a consumer-renovation expo, interviewed renovation-service providers, and consumed dozens of hours and hundreds of pages of home-reno media. The primary finding was that home-renovation media seems to make people feel uneasy in their own home. In academic terms, the phenomenon is known as dysplacement, or a sense that our long-held understanding of what our home means to us is out of sync with what changing market forces have decided a home should be. In layman’s terms, it’s the unsettling feeling that the home you’ve made for yourself is no longer a good one, and that other people think less of you for it.

People are highly sensitive to feeling out of sorts in their home, Grant told me. This is one of the reasons that moving and unpacking are so stressful, and that accumulating unnecessary clutter feels so bothersome. Americans have long understood successful home ownership and homemaking as indicative of personal success and character. Beginning in the postwar era, "that was largely achieved by customizing your home to the personality that you wanted to portray," Grant said. Even in the tract-home developments of mid-century suburbs, the insides of houses tended to be idiosyncratic, with liberal use of color and texture and pattern—on the walls, the floors, the furniture. Some of those choices were the result of trends, of course, but there was plenty of variety within those parameters, and people tended to pick things they liked and stick with them. Can you imagine your grandmother worrying that her decades-old chintz curtains or Harvest Gold appliances were outdated?

Now, however, "personalization is being ripped out of people's homes" in favor of market-pleasing standardization, Grant said. In her interviews with homeowners doing renovations, Grant said that people expressed embarrassment at having friends over to their outdated home, so much so that they'd avoid hosting their book club or planning parties—precisely the kinds of happy occasions that your home is supposed to be for. Others worried that if they spent money on creating the home they actually wanted—if they sacrificed a bedroom or bathroom to change a layout, for example, or didn't knock down enough walls to create an open plan—they'd be penalized by buyers down the line. This worry is common enough that it was recently the subject of a New York Times real-estate advice column.

Lifestyle media that defines successful homemaking has been around for generations, but the speed with which trends and expectations now shift, combined with the huge scale of changes that are now expected from homeowners, is something fundamentally new. The goal of this media apparatus, Grant said, isn’t to provide knowledge and inspiration for people improving the country’s aging housing stock but to keep people engaged in a process of constant updating—discarding old furniture and fixtures and appliances and buying new ones in much the way many people now cycle through an endless stream of fast-fashion pieces, trying to live up to standards that they can never quite pin down, and therefore never quite satisfy. If you’re required to plan your financial future and your most private spaces around how much strangers might be willing to pay for your home one day, then your home isn’t really yours, even if you’re the one with the keys right now. You may own it, but so do lifestyle media and the housing market.

Why Everything in Tech Seems to Be Collapsing at Once

The Atlantic

www.theatlantic.com › newsletters › archive › 2022 › 11 › tech-industry-mass-layoffs-recession-twitter › 672150

This is Work in Progress, a newsletter by Derek Thompson about work, technology, and how to solve some of America’s biggest problems. Sign up here to get it every week.

The tech industry seems to be in a recession. Although overall unemployment is still very low, just about every major tech company—including Amazon, Meta, Snap, Stripe, Coinbase, Twitter, Robinhood, and Intel—has announced layoffs affecting a double-digit percentage of its workforce in the past few months. The stock valuations for many of these companies have fallen more than 50 percent in the past year.

Watching this surge of mass layoffs in big tech companies, plus the lurid chaos unfolding at Twitter over the past few weeks and the spectacular ongoing implosion of crypto, the big question on my mind is: Why is it all happening at once?

The simple, and possibly oversimplified, answer to this question is: It's the interest rates, stupid.

The period after the Great Recession was defined by a weak economy with low aggregate demand and low interest rates. This created the perfect conditions for an era of endless cash that venture capitalists, seeking high rates of return, poured into low-marginal-cost software companies. As smartphone penetration rose in the U.S. and around the world, the app revolution took off. Social-media and consumer-tech companies became some of the richest and fastest-growing in the world. Hollywood went streaming, content went digital, and the services economy became intermediated by smartphones.

Then came the surge of post-pandemic inflation. Rising interest rates have meant the end of easy money. The Millennial Consumer Subsidy—my term for VCs splitting the bill with consumers to grow their companies—has come to a close. As the cost of risk has gone up, venture funding has gone down, and companies have had to cut costs, raise prices, or both. Meanwhile, the narrative in markets has flipped from growth to profits, and valuations for tech companies have crashed.

The inflation explanation is fairly technical. I’ve got another story that’s a little bit harder to prove. It goes something like this: The tech industry is experiencing a midlife crisis.

After using its metaphorical youth to experiment with social media and consumer tech through boundless investment and endless optimizations and A/B tests, many tech executives and investors today feel like they’ve essentially solved the most interesting and important problems of basic digitization. This is not just my opinion: Four years ago, the tech analyst Ben Evans observed that software had scaled the mountain of advertising and media and connected the world, and tech was looking to climb new mountains and find new challenges. One chapter was closing, and the most prominent tech executives and investors were looking for the next story.

Executives of the largest tech firms have for years been shifting resources toward new ventures with uncertain returns. Amazon recently employed more than 10,000 people to work on its AI product, Alexa. (Jeff Bezos stepped away from the company he founded to work on rocket ships.) At Meta—the parent company of Facebook, Instagram, and WhatsApp—Reality Labs, the division working to build a metaverse, has about 15,000 employees. Apple reportedly has 3,000 people working on an augmented-reality headset, and thousands more are working on Google's voice assistant. At the same time, the venture-capital community has been looking for its own moonshot, and many investors have found one (or, at least, have wanted people to believe that they have) in crypto. VCs have reportedly bet tens of billions of dollars in the space, even though, for all the bluster and investment, it mostly remains a technology in search of a use case beyond betting money on tokens that cash out in dollars. Meanwhile, in what may be a literal midlife crisis, Elon Musk, a car and rocket executive, has installed himself at the helm of a digital delivery mechanism for news outrage with, at best, a chaotic plan for resurrecting its business.

It would be unfair to suggest that all of these moves are the emotional equivalent of a 52-year-old man dyeing his hair and trading the minivan for a Corvette. Companies going big and spending lots of money on important and difficult problems with uncertain solutions is cool, in a way. But at the moment, a lot of these bets look half-baked, catastrophically expensive, or outright fraudulent.

These explanations—the macroeconomic one and the psychodynamic one—intersect. The tech industry, which had perfected the art of optimizing digital spaces for engagement and ad placement, was prepared to invest deeply in the next adventure. But it's gotten smacked by post-pandemic inflation and rising interest rates, which have made this pivot harder to execute. The result is the current news: mass layoffs across companies that just a few years ago seemed utterly unstoppable.

One mistake that a journalist can make in observing these trends is to assume that, because the software-based tech industry seems to be struggling now, things will stay like this forever. More likely, we are in an intermission between technological epochs. We’ve mostly passed through the browser era, the social-media era, and the smartphone-app-economy era. But in the past few months, the explosion of artificial-intelligence programs suggests that something quite spectacular and possibly a little terrifying is on the horizon. Ten years from now, looking back on the 2022 tech recession, we may say that this moment was a paroxysm of scandals and layoffs between two discrete movements.

Want to discuss the future of business, technology, and the abundance agenda? Join Derek Thompson and other experts for The Atlantic’s first Progress Summit in Los Angeles on December 13. Free virtual and in-person passes available here.

Beware the 'Storification' of the Internet

The Atlantic

www.theatlantic.com › books › archive › 2022 › 11 › seduced-by-story-peter-brooks-book-review › 672135

Recently, during an ad break in the episode of Frasier I was watching, two commercials played back to back. The first, for United, wanted to tell me “the story of an airline,” which the commercial characterized as sci-fi, romance, and adventure, starring 80,000 “hero characters” otherwise known as employees. The second ad, for ESPN, argued that college football has everything that “makes for a great story”: drama, action, “an opening that sucks you in, a middle that won’t let you go, and a mind-blowing, nail-biting ending.”

There is a growing trend in American culture of what the literary theorist Peter Brooks calls "storification." Since the turn of the millennium, he argues in his new book, Seduced by Story: The Use and Abuse of Narrative, we've relied too heavily on storytelling conventions to understand the world around us, which has resulted in a "narrative takeover of reality" that affects nearly every form of communication—including the way doctors interact with patients, how financial reports are written, and the branding that corporations use to present themselves to consumers. Meanwhile, other modes of expression, interpretation, and comprehension, such as analysis and argument, have fallen by the wayside.

The danger of this arises when the public fails to understand that many of these stories are constructed through deliberate choices and omissions. Enron, for instance, duped people because it was “built uniquely on stories—fictions, in fact … that generated stories of impending great wealth,” Brooks writes. Other recent scams, like those pulled off by Purdue Pharma, NXIVM, and Anna Delvey, succeeded because people fell for tales the perpetrators spun. In other words, we could all benefit from a lesson in close reading and a dose of skepticism.

[Read: When narrative matters more than fact]

Brooks’s extensive body of scholarship, including his foundational 1984 book, Reading for the Plot: Design and Intention in Narrative, helped pioneer our understanding of how narrative functions in literature and in life. As such, he knows that his critique of the tendency to narrativize isn’t exactly a new one. Joan Didion came to a similar conclusion in her 1979 essay “The White Album,” summed up by the oft-repeated dictum “We tell ourselves stories in order to live.” (Brooks’s version is a bit bleaker: “We have fictions in order not to die of the forlornness of our condition in the world.”) In times of turmoil, we search most desperately for the familiar hallmarks of storytelling: clearly defined heroes and villains, motives, and stakes.

But there's a powerful narrative force at work today that Brooks, 84, understandably fails to consider in Seduced by Story: the internet. In leaving it out, he doesn't just badly circumscribe his argument; he misses how the ability to read critically and recognize the way a narrative is constructed is even more important now than when the novel, the subject of most of his focus, reigned as one of the most prominent forms of media. His sole mentions of the internet—vague acknowledgments that "Twitter and the meme dominate the presentation of reality" and that ours is an "era of fake news and Facebook"—fail to grasp that on the internet especially, more attentive, analytical reading is essential.

If amid social upheaval we use stories to make sense of our world, then on the internet we use stories to make sense of ourselves. The filmmaker Bo Burnham, who grew up with and on the internet, is one of the sharpest chroniclers of how digital media shape our interior lives. In an interview for his 2018 movie, Eighth Grade, about a 13-year-old girl coming of age online, Burnham said that when it comes to the internet, talking heads focus too much on social trends and political threats rather than on the “subtler,” less perceptible changes it’s causing within individuals. “There’s something interior, something that’s actually changing our own view of ourselves,” he said. “We really do spend so much time building narrative for ourselves, and I sense with people that there was a real pressure to view one’s life as something like a movie.”

Just look at TikTok, where storytelling has become a lingua franca. In videos on the app, users encourage one another to “do it for the plot” or to claim their “main-character energy”—and, crucially, to film the results. One TikTok tutorial shows users how to edit a video to “make your life seem like a movie.” Story-speak is often used for levity: “I really hate when people call all the things I’ve gone through ‘trauma,’” one 19-year-old says in a tongue-in-cheek clip. “I prefer to call it ‘lore.’” But it also provides language for hard-to-articulate feelings: In another video, a forlorn teen stares into the camera above the text, “i know i’m a side character, i have no purpose except to sit and wait for my next scene.”

Here, and in most other corners of the internet, narrative taxonomy prevails. We’re telling ourselves stories in order to live, yes, but we’re also turning ourselves into stories in order to live. Amid the shapeless, endless internet—which Burnham describes as “a little bit of everything all of the time”—the tidy language of story appeals, helping to structure our experiences on- and offline. Making ourselves legible to others is, in essence, the mandate of social media. We are encouraged to create a brand and cultivate an aesthetic, to share inspiring anecdotes on LinkedIn and project authenticity on BeReal. On Instagram, “Stories” allow users to broadcast moments and experiences to their followers, and it’s tempting, one Mashable article argued, to rewatch your own—to view your life in the third person, packaged and refracted through a camera lens. “What do we want more,” Burnham asks in his 2016 special, Make Happy, “than to lie in our bed at the end of the day and just watch our life as a satisfied audience member?”

Social media hinges on storytelling because telling stories is, in Brooks’s words, “a social act.” This isn’t inherently bad, but it’s vital to be aware of artifice and the spin we put on our lives in public. As narrators of our own lives, Brooks writes, “we must recognize the inadequacy of our narratives to solve our own and [others’] problems.” Pulling from Freudian psychoanalysis, Brooks concludes that telling stories should be a tool we use to understand ourselves better rather than a goal in and of itself.

He occasionally brushes up against other timely ideas. At one point, he cites the French philosopher Jean-François Lyotard, who argues that in our present postmodern era, the “grand narratives”—progress, liberation, salvation, etc.—that once sustained entire societies have lost their power. “We are left with many mini-narratives everywhere,” Brooks adds, “individual or collective and, in many cases, dominantly narcissistic and self-serving.” The fragmentation of what we perceive as real and true is indeed a pressing concern. What would Brooks make, for instance, of Atlantic contributor Charlie Warzel’s claim that 2017 was “the year that the internet destroyed our shared reality,” setting the stage for alternative facts and conspiracy theories? Unclear; Brooks drops the fascinating idea of “many mini-narratives everywhere” (a little bit of everything all of the time) as quickly as he introduces it.

[Read: How to put out democracy’s dumpster fire]

Brooks has delineated his lane—the novel—and is content to stay in it. But many recent developments in the novel—the ever more common "trauma plot," the "representation trap" befalling many Black fiction writers, the growing conflation of novels with morality tales—relate to how any story, regardless of the medium, can become freighted with undue political, representational, or moral weight. Although Brooks briefly worries about "inflated claims about [narrative's] capacity to solve all personal and social issues" in the first chapter, the concern never resurfaces in the many rich and rigorous close readings that follow.

It’s a shame that Brooks doesn’t see how broadly applicable his argument is. Today, stories have become ubiquitous, thanks in part to the internet’s democratization of storytelling—anyone can write or film their experiences and put them online. And “telling one’s story”—in a novel or a film, a Twitter thread or a TikTok video—has also become disproportionately valorized, often seen as a “brave” way to generate empathy and political change.

In his own way, Brooks bristles against this. In the second chapter of Seduced by Story, for instance, he discusses what he calls the “epistemology of narrative”—in other words, how do we know where a narrator’s knowledge comes from, or what his or her potential agenda might be? The question, which he applies to works by Faulkner and Diderot, felt especially pertinent to me as I watched the back-to-back ads that extolled the virtues of story. The many narratives that reach us through our screens demand the sort of scrutiny Brooks advocates for. A more critically minded and media-literate populace is the only antidote for a culture in thrall to a good tale.